Editorial Feature

Synthetic Emotions in Humanoids – A Brief Overview

Image Credits: sdecoret/shutterstock.com

Artificial intelligence has been applied successfully to transform industries including manufacturing, automation, and agriculture. Modern robotics research is now paying closer attention to the possibility of integrating robots into society.

Although there have been some successful attempts at developing humanoids, incorporating human emotion into a humanoid is a complex problem, particularly where facial expressions and personality are concerned.

Facial Expressions

Facial expressions are paramount to understanding and developing a dialog between humans. By incorporating neurophysiological models into an artificial system, human-robot interaction could become part of daily life. With over 50% of human-human interaction delivered via facial expressions, this form of communication is vital if human-robot interaction is to be introduced into society.

There are many techniques for generating emotive expression in robotic systems. One is an interpolation-based technique that maps emotive expression onto a three-dimensional space whose dimensions are arousal, valence, and positioning. Together, these three states occupy a single point in the space at any given time; if the robot's arousal, valence, or positioning changes, the point moves through the space. The space also maps different levels of arousal, corresponding to states such as excitement and disappointment.

The technique runs in real time, which is essential if robots are to take part in social interaction. Posture also affects emotive expression: the interpolation technique uses nine basis postures, which are blended to adjust the facial expression. Certain basis postures influence the face more strongly than others; in this technique, the valence basis strongly drives the movement of the lips, eyelids, and jaw, which is key to developing a dialog with robots.
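
To make the interpolation idea more concrete, the Python sketch below blends a handful of hypothetical basis postures according to the robot's current point in the arousal-valence-positioning space. The posture names, actuator targets (lips, eyelids, jaw), and distance-based weighting are assumptions made for illustration, not values from the technique's actual implementation.

```python
import numpy as np

# Hypothetical basis postures for an interpolation-based expression scheme.
# Each posture pairs a point in the (arousal, valence, positioning) space
# with a set of facial actuator targets. Names and numbers are illustrative.
BASIS_POSTURES = {
    "excited":      {"affect": ( 0.8,  0.6,  0.5), "face": {"lips": 0.9, "eyelids": 0.9, "jaw": 0.6}},
    "content":      {"affect": (-0.2,  0.7,  0.3), "face": {"lips": 0.6, "eyelids": 0.5, "jaw": 0.2}},
    "disappointed": {"affect": (-0.6, -0.5, -0.3), "face": {"lips": 0.2, "eyelids": 0.3, "jaw": 0.1}},
    "neutral":      {"affect": ( 0.0,  0.0,  0.0), "face": {"lips": 0.5, "eyelids": 0.5, "jaw": 0.3}},
}

def blend_expression(arousal, valence, positioning):
    """Interpolate facial actuator targets from the current affect point.

    Weights fall off with distance to each basis posture, so the expression
    changes smoothly as the affect point moves through the space.
    """
    point = np.array([arousal, valence, positioning])
    weights, faces = [], []
    for posture in BASIS_POSTURES.values():
        distance = np.linalg.norm(point - np.array(posture["affect"]))
        weights.append(1.0 / (distance + 1e-6))    # nearer postures dominate
        faces.append(posture["face"])
    weights = np.array(weights) / np.sum(weights)   # normalize weights to sum to 1
    return {
        joint: float(sum(w * f[joint] for w, f in zip(weights, faces)))
        for joint in faces[0]
    }

# A mildly aroused, mildly positive state yields an expression somewhere
# between "excited" and "neutral".
print(blend_expression(arousal=0.4, valence=0.3, positioning=0.2))
```

Because the blend weights vary continuously with the affect point, small changes in arousal or valence produce correspondingly small changes in the face, which gives the smooth expression trajectory noted below.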

One of the main advantages of this technique is that it controls emotive expression in a continuous space, producing a smooth trajectory of facial expressions as the affect point moves. The video below is an example of how robots are being designed to express emotion.

Robots "Express Themselves"

Personality

Incorporating emotive expression algorithms into a robotic structure is only part of the challenge of enabling human-robot interaction. Many personality parameters also need to be considered when engineering a robot that can converse with humans in a social setting; by mapping variation in these parameters, robot behavior can be shaped.

Three factors help shape the personality of a robot (a simple sketch follows the list):

  • Inherent disposition - the robot's personality profile evolves on its own, without being manipulated by any external factor.
  • Manual setting - a control unit is used to set the robot's behavior directly.
  • Reinforcement - incoming sensory information reinforces emotional states; pushing the robot, for instance, may leave it distressed.
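
A minimal sketch of how these three channels might be combined is given below in Python. The trait names, value ranges, and update rules are illustrative assumptions rather than a description of any particular robot platform.

```python
import random

class PersonalityProfile:
    """Toy personality model with the three update channels listed above.

    Trait names, ranges, and update rules are illustrative assumptions.
    """

    def __init__(self):
        # Traits range from -1.0 to 1.0 (e.g. -1 = introvert, +1 = extrovert).
        self.traits = {"extroversion": 0.0, "agreeableness": 0.0, "neuroticism": 0.0}

    def _clamp(self, value):
        return max(-1.0, min(1.0, value))

    def inherent_drift(self, scale=0.01):
        """Inherent disposition: the profile drifts slowly on its own."""
        for name in self.traits:
            self.traits[name] = self._clamp(self.traits[name] + random.uniform(-scale, scale))

    def manual_set(self, **values):
        """Manual setting: a control unit sets traits directly."""
        for name, value in values.items():
            if name in self.traits:
                self.traits[name] = self._clamp(value)

    def reinforce(self, event, strength=0.1):
        """Reinforcement: sensory events nudge traits. Being pushed, for
        example, raises neuroticism (the robot is more easily distressed)."""
        if event == "pushed":
            self.traits["neuroticism"] = self._clamp(self.traits["neuroticism"] + strength)
        elif event == "praised":
            self.traits["agreeableness"] = self._clamp(self.traits["agreeableness"] + strength)


profile = PersonalityProfile()
profile.manual_set(extroversion=0.5)   # operator configures a more extroverted robot
profile.reinforce("pushed")            # a sensory event nudges the profile
profile.inherent_drift()               # small spontaneous change
print(profile.traits)
```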

The following video describes a humanoid robot that can emulate 67 facial expressions and simulates human personality.

Humanizing Robotics

There are many methods for designing personality in artificial systems, but most struggle to integrate artistic and technical factors in robot behavior.

A range of emotions can be modeled mathematically or computationally for robots, including:

  • Fear
  • Anger
  • Sadness
  • Disgust
  • Surprise.

Some models incorporate exponential decay of reinforcements, so that if an emotion is not regularly reinforced, it decays back to its baseline value. Different temperaments can also be modeled, including extrovert/introvert, neurotic/rational, conscientious/careless, agreeable/disagreeable, and open/reticent.
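
As a rough illustration of the exponential-decay idea, the Python sketch below tracks intensities for the five emotions listed above and lets each decay toward a baseline whenever it is not reinforced; a simple temperament parameter scales how strongly reinforcements are felt. The constants, baselines, and neuroticism scaling are assumptions made for illustration.

```python
import math

class EmotionModel:
    """Emotion intensities with exponential decay toward a baseline.

    Baselines, decay rate, and the temperament scaling are illustrative.
    """

    def __init__(self, decay_rate=0.5, neuroticism=0.0):
        self.baseline = {"fear": 0.1, "anger": 0.1, "sadness": 0.1,
                         "disgust": 0.1, "surprise": 0.1}
        self.intensity = dict(self.baseline)
        self.decay_rate = decay_rate           # larger value = faster return to baseline
        self.sensitivity = 1.0 + neuroticism   # a neurotic temperament reacts more strongly

    def reinforce(self, emotion, amount):
        """An external event reinforces one emotion."""
        self.intensity[emotion] = min(1.0, self.intensity[emotion] + amount * self.sensitivity)

    def step(self, dt):
        """With no reinforcement, each intensity decays exponentially toward
        its baseline: i(t + dt) = base + (i(t) - base) * exp(-k * dt)."""
        factor = math.exp(-self.decay_rate * dt)
        for name, base in self.baseline.items():
            self.intensity[name] = base + (self.intensity[name] - base) * factor


model = EmotionModel(neuroticism=0.5)
model.reinforce("fear", 0.6)          # a startling event
for _ in range(5):                    # five time steps with no further reinforcement
    model.step(dt=1.0)
print(round(model.intensity["fear"], 3))   # has decayed most of the way back toward 0.1
```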

After a personality profile has been established (typically a description of 300 to 400 words reflecting the robot's intended personality), design rules need to be modeled and implemented with end-users. During the implementation phase, a behavioral type is implemented, evaluated, and altered to meet specific requirements, and this stage is recorded using virtual 3D simulations so that robot behavior can be understood in real time. The video below demonstrates interactive singulation of objects on a physical robot platform.

Interactive singulation of objects from a pile

An advantage of the open platform for personal robotics is that the physical simulation used in this method closely matches the behavior seen on the hardware platform, allowing behavior to be modified together with an end-user.

This article only scratches the surface of emotive expression and how it can be implemented in an artificial system to create a sociable robot. Implementing emotion and personality models in an artificial system is a complex field of study, and many concepts need to be considered to engineer a humanoid that could one day carry a finely tuned dialog and personality for interacting with humans, such as:

  • Goals and task selection for sociable robots
  • How to use emotion to navigate behavior
  • Emotional control among sociable robots
  • The study of emotion and personality of sociable robots based on gender differences
  • Use of personality and emotion in the process of learning new skills in sociable robots.

Further Applications for Emotional Robots

Robotics and artificial intelligence are rapidly evolving fields. Robots capable of processing, showing, and understanding human emotions could be used in many areas, for both research and practical application. Emotional robots could be useful for testing psychological and biological theories, and their uses are beginning to extend into medicine as well. Sociable robots are being used to combat depression and social isolation in people living with dementia, with the additional benefits of reminding patients when to take medication and helping nurses care for elderly patients.

Sources and Further Reading

  • Breazeal, C.L. (2002). Designing Sociable Robots. United States of America: Massachusetts Institute of Technology.
  • Fellous, J., Arbib, M.A. (2005). Who Needs Emotions?: The Brain Meets the Robot. New York: Oxford University Press, Inc. See in particular the chapter by Breazeal, C. and Brooks, R., "Robot Emotion: A Functional Perspective".
  • Dautenhahn, K., Saunders, J. (2011). New Frontiers in Human-Robot Interaction. The Netherlands: John Benjamins Publishing Co.
  • Bianchi, G., Guinot, J., Rzymkowski, C. (2002). RoManSy 14: Theory and Practice of Robots and Manipulators: Proceedings of the Fourteenth CISM-IFToMM Symposium. New York: Springer-Verlag Wien.
  • Jacko, J.A. (2009). Human-Computer Interaction. Novel Interaction Methods and Techniques: 13th International Conference, HCI International 2009. Germany: Springer-Verlag Berlin Heidelberg.
  • Vallverdú, J., Casacuberta, D. (2009). Handbook of Research on Synthetic Emotions and Sociable Robotics: New Applications in Affective Computing and Artificial Intelligence. London, UK: Information Science Reference (An imprint of IGI Global).

This article was updated on 7th February, 2019.
