By Kal Kaur
Artificial intelligence has been applied successfully to revolutionize industries including manufacturing, automation, and agriculture. Modern robotics research is paying increasing attention to the possibility of integrating robots into society. Although there have been some successful attempts at developing humanoids, incorporating human emotion into a humanoid structure is complex, particularly where facial expressions and personality are concerned.
Facial expressions are paramount to understanding and developing dialogue between humans. By incorporating neurophysiological models into an artificial system, human–robot interaction in daily life becomes a possibility. With over 50% of human–human interaction delivered via facial expressions, this form of communication is therefore important if human–robot interaction is to be introduced into society.
There are many techniques for generating emotive expression in robotic systems. One is the interpolation-based technique, which maps emotive expression over a 3D space whose dimensions are arousal, valence, and positioning. Taken together, the three states occupy a particular point in the space at any given time; if a robot changes its state of valence, position, or arousal, this point moves through the space. The space also maps out levels of arousal such as excitement and disappointment.
This technique runs fully in real time, an important factor when trying to achieve social interaction with robots. Posture also affects emotive expression, with the interpolation technique encompassing nine basis postures that help adjust a facial expression. Certain postures have more influence on human facial expression than others, and in this technique the valence basis strongly manipulates the movement of the lips, eyelids, and jaw, which is key to developing dialogue in robots. The main advantages of this technique are that it controls emotive expression in a continuous space and that it produces a smooth trajectory of facial expressions through that space. The video below is a good example of how robots are being designed to express emotion.
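As a rough sketch of how such an interpolation scheme could work, the snippet below blends nine hypothetical basis postures using inverse-distance weights in the (arousal, valence, position) space. The prototype points, posture values, and actuator channels are illustrative assumptions, not the published technique.

```python
import numpy as np

# Nine illustrative basis postures. Each row of PROTOTYPES is a point in the
# 3D affect space (arousal, valence, position); the matching row of POSTURES
# holds made-up actuator targets (lips, eyelids, jaw) in [-1, 1].
PROTOTYPES = np.array([
    [ 1.0,  1.0,  0.0],   # excitement
    [-1.0, -1.0,  0.0],   # disappointment
    [ 0.0,  0.0,  0.0],   # neutral
    [ 1.0, -1.0,  0.0],   # distress
    [-1.0,  1.0,  0.0],   # contentment
    [ 0.0,  1.0,  1.0],   # interest
    [ 0.0, -1.0, -1.0],   # withdrawal
    [ 1.0,  0.0,  1.0],   # surprise
    [-1.0,  0.0, -1.0],   # fatigue
])
POSTURES = np.array([
    [ 0.9,  0.8,  0.7],
    [-0.7, -0.4, -0.2],
    [ 0.0,  0.0,  0.0],
    [-0.5,  0.6,  0.4],
    [ 0.6,  0.2,  0.1],
    [ 0.4,  0.7,  0.2],
    [-0.3, -0.6, -0.1],
    [ 0.2,  0.9,  0.8],
    [-0.2, -0.8, -0.3],
])

def blend_posture(point, power=2.0, eps=1e-9):
    """Inverse-distance-weighted blend of the basis postures.

    Nearby prototypes dominate, so as the affect point moves continuously
    through the space, the blended posture follows a smooth trajectory.
    """
    d = np.linalg.norm(PROTOTYPES - point, axis=1)
    w = 1.0 / (d ** power + eps)
    w /= w.sum()
    return w @ POSTURES

# At a prototype point the blend collapses to that prototype's posture;
# in between, postures are mixed continuously.
print(blend_posture(np.array([0.5, 0.5, 0.0])))
```

Because the weights vary continuously with the affect point, small changes in arousal, valence, or position produce small changes in posture, which is the smooth-trajectory property described above.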
Incorporating emotive-expression algorithms into a robotic structure is only part of the challenge of enabling human–robot interaction. Many personality parameters need to be considered when engineering a robot that can converse with humans in a social setting. By mapping personality variation, robot behaviour can be manipulated. Three factors help shape the personality of a robot:
- Inherent disposition: the robot changes its personality profile on its own, without manipulation by any external factors.
- Manual setting: a control unit manipulates the robot's behaviour directly.
- Reinforcement from information received: for example, pushing the robot may result in the robot feeling distressed.
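To make the three factors concrete, here is a minimal sketch of a personality state that combines them; the class, parameter names, and numeric scales are assumptions for illustration, not an implementation from the literature.

```python
from dataclasses import dataclass

@dataclass
class RobotPersonality:
    """A single mood value shaped by the three factors above."""
    baseline: float = 0.0       # inherent disposition the mood drifts back to
    mood: float = 0.0           # current affective state, roughly in [-1, 1]
    drift: float = 0.1          # how quickly the disposition reasserts itself
    sensitivity: float = 0.5    # how strongly stimuli move the mood

    def set_manually(self, value: float) -> None:
        # Manual setting: an operator's control unit overrides the mood.
        self.mood = value

    def reinforce(self, stimulus: float) -> None:
        # Reinforcement from information received, e.g. a push is a
        # negative stimulus that leaves the robot distressed.
        self.mood += self.sensitivity * stimulus

    def step(self) -> float:
        # Inherent disposition: each tick the mood decays towards baseline.
        self.mood += self.drift * (self.baseline - self.mood)
        return self.mood

robot = RobotPersonality()
robot.reinforce(-1.0)   # pushed: mood drops (distressed)
robot.step()            # disposition pulls the mood back towards baseline
```

Separating the three update paths like this lets each factor be tuned, or disabled, independently when shaping a robot's behaviour.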
The following video describes a humanoid robot that can emulate 67 facial expressions and simulate human personality.
There are many methods for designing personality in artificial systems, but most struggle to integrate artistic and technical factors in robot behaviour. After establishing a personality profile, which may consist of 300–400 words reflecting the personality of the robot, design rules need to be modelled and implemented with end-users. During the implementation phase, a behavioural type is implemented, evaluated, and altered to meet certain requirements. This stage is recorded using virtual 3D simulations to understand robot behaviour in real time. The video below demonstrates interaction simulation on a physical robot platform.
An advantage of the open platform for personal robotics is that the physical simulation used in this method closely matches the behaviour seen on a hardware platform, allowing behaviour to be modified together with an end-user.
This article only scratches the surface of emotive expression and how to implement it in an artificial system in order to create a sociable robot. The field of implementing emotion and personality models in artificial systems is, of course, a complex one, and many concepts need to be considered to engineer a humanoid that could one day sustain a fine-tuned dialogue and personality when interacting with humans, such as:
- Goals and task selection for sociable robots
- How to use emotion to navigate behaviour
- Emotional control among sociable robots
- The study of emotion and personality of sociable robots based on gender differences
- Use of personality and emotion in the process of learning new skills in sociable robots
- Breazeal, C.L. (2002).Designing Sociable Robots. United States of America: Massachusetts Institute of Technology.
- Fellous, J., Arbib, M.A. (2005). Who Needs Emotions?: The Brain Meets the Robot. New York: Oxford University Press. Includes the chapter "Robot Emotion: A Functional Perspective" by Breazeal, C. and Brooks, R.
- Dautenhahn, K., Saunders, J. (2011). New Frontiers in Human-Robot Interaction. The Netherlands: John Benjamins Publishing Co.
- Bianchi, G., Guinot, J., Rzymkowski, C. (2002). RoManSy 14: Theory and Practice of Robots and Manipulators: Proceedings of the Fourteenth CISM-IFToMM Symposium. New York: Springer-Verlag Wien.
- Jacko, J.A. (2009). Human-Computer Interaction. Novel Interaction Methods and Techniques: 13th International Conference, HCI International 2009. Germany: Springer-Verlag Berlin Heidelberg.
- Vallverdú, J., Casacuberta, D. (2009). Handbook of Research on Synthetic Emotions and Sociable Robotics: New Applications in Affective Computing and Artificial Intelligence. London, UK: Information Science Reference (An imprint of IGI Global).