AI Could Transform the Way We Understand Emotion

An emotion recognition tool developed by academics at the University of the West of Scotland (UWS) could help people with neurodiverse conditions, including autism.


Emotion recognition has traditionally been a challenging and complex area of study. However, recent advances in vision processing and in low-cost devices, such as wearable electroencephalogram (EEG) and electrocardiogram (ECG) sensors, have enabled UWS academics to create artificial intelligence (AI) that can accurately read emotion-related signals from brain activity and facial analysis.
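The article does not describe the underlying method in detail, but a minimal sketch of the general approach it describes, fusing wearable physiological signals with facial features before classifying an emotion label, might look like the following. The feature choices (EEG band power, ECG heart-rate-variability statistics, facial action-unit intensities), the synthetic placeholder data, and the random-forest classifier are illustrative assumptions, not the UWS implementation.

```python
# Sketch only: feature-level fusion of EEG, ECG and facial features,
# followed by a simple classifier. All data below is synthetic placeholder
# data; feature types and model choice are assumptions for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials = 200                                    # hypothetical number of stimulus presentations
eeg_features = rng.normal(size=(n_trials, 32))    # e.g. band power per EEG channel
ecg_features = rng.normal(size=(n_trials, 8))     # e.g. heart-rate-variability statistics
face_features = rng.normal(size=(n_trials, 16))   # e.g. facial action-unit intensities
labels = rng.integers(0, 2, size=n_trials)        # e.g. low vs. high arousal

# Feature-level fusion: concatenate the per-modality feature vectors.
fused = np.hstack([eeg_features, ecg_features, face_features])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, fused, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```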

Professor Naeem Ramzan, Director of the Affective and Human Computing for SMART Environments Research Centre at UWS, said: "Emotions are a fundamental aspect of the human experience, and understanding the signals that trigger different emotions can have a profound impact on various aspects of our lives.

"Our recent study has led to the creation of comprehensive data which can be deployed with wearable technology – using multi-sensors and artificial intelligence – to provide a vital tool for emotion recognition. The data also provides a valuable resource for researchers and industry professionals, enabling them to have a greater understanding of emotional triggers, and providing a reference point which could unlock new possibilities for advancements in health and wellbeing, education and security.”

The system uses a multimodal database, developed by UWS researchers, consisting of signals recorded during a study using audio-visual stimuli. Participants in the study were recorded and self-assessed their emotional reaction to each stimulus in terms of valence, arousal, and dominance. Signals were captured using a camera and wearable, wireless equipment, which could allow affective computing methods to be used in everyday applications.
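For illustration only: affective-computing datasets of this kind typically pair each recorded trial with the participant's self-assessment ratings, which are then thresholded into class labels for training. A minimal sketch is shown below, assuming a 1-9 rating scale, a midpoint threshold, and a simple per-trial record layout; none of these details are confirmed by the article.

```python
# Sketch only: turning self-assessment ratings into low/high labels.
# The rating scale, threshold and record layout are assumptions.
from dataclasses import dataclass

@dataclass
class Trial:
    participant: int
    stimulus: str        # identifier of the audio-visual clip
    valence: float       # self-assessed rating, e.g. on a 1-9 scale
    arousal: float
    dominance: float

def binarise(trial: Trial, threshold: float = 5.0) -> dict:
    """Map continuous self-assessment ratings to low/high class labels."""
    return {
        "valence": "high" if trial.valence > threshold else "low",
        "arousal": "high" if trial.arousal > threshold else "low",
        "dominance": "high" if trial.dominance > threshold else "low",
    }

print(binarise(Trial(participant=1, stimulus="clip_03", valence=6.5, arousal=3.0, dominance=5.5)))
```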

This breakthrough could offer clinicians, therapists, and caregivers a new tool to better understand the emotional states of individuals with a range of neurodiverse conditions. It has the potential to improve mental health assessments, enable early intervention for emotional challenges, and open up greater possibilities for personalised therapeutic interventions.

The technology could also pave the way for augmented reality, virtual reality, and robotics applications specifically designed to assist individuals by understanding and expressing emotions.
