Editorial Feature

Human Robots and Social Interaction

Human-robot social interaction plays an essential role in extending the use of robots in daily life. Robots will be able to take on many additional tasks if they are capable of social interaction. Such robots can be used to help the elderly, work with children with autism, and assist patients who find it physically or mentally difficult to perform everyday tasks.

If robots can respond intelligently, use appropriate body language, smile, and react to a joke rather than simply performing tasks mechanically, accepting them as part of our daily lives will become far easier.

Research

Maja Mataric, a director at USC's Center for Robotics and Embedded Systems, has successfully developed robots for use in several therapeutic roles.

She believes that a successfully designed robot must communicate not only verbally but also physically, through body language and facial expressions.

The researchers found that when the robot's personality was matched to that of the user, people performed their rehabilitation exercises for longer and reported enjoying them more.

She also stated that it is important to match a robot's appearance to how we perceive its abilities. Researchers found that it is not enough for robots to look like humans; they must also behave like humans to be better accepted. It is also essential that a social robot learns socially.

Andrea Thomaz, an assistant professor at the Georgia Institute of Technology and director of its Socially Intelligent Machines Laboratory, has built a robot designed to learn from humans the way a person would: through demonstration, observation, speech, and social interaction.

Researchers typically work in a supervised learning setting, in which the robot learns from demonstrations or examples of what to do provided by a human partner.
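As a rough illustration, such a learning-from-demonstration loop can be sketched as a small supervised learning problem; the features, labels, and classifier below are illustrative assumptions, not the actual software used by these researchers:

```python
# Illustrative sketch of supervised learning from demonstration: a human
# partner provides (state, action) examples, and the robot fits a simple
# classifier it can later use to choose actions on its own.
# Features, labels, and the classifier choice are assumptions.
from sklearn.neighbors import KNeighborsClassifier

# Each demonstration pairs a perceived state (a small feature vector,
# e.g. [object_redness, object_size]) with the action the human showed.
demonstrations = [
    ([1.0, 0.2], "left_bin"),
    ([0.9, 0.3], "left_bin"),
    ([0.1, 0.8], "right_bin"),
    ([0.2, 0.7], "right_bin"),
]

states = [state for state, _ in demonstrations]
actions = [action for _, action in demonstrations]

# Fit a nearest-neighbor "policy" on the demonstrated examples.
policy = KNeighborsClassifier(n_neighbors=1).fit(states, actions)

# The robot can now generalize to an object it has not seen before.
print(policy.predict([[0.15, 0.75]]))  # -> ['right_bin']
```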

Andrea Thomaz: Teaching Robots to Move Like Humans

Understanding how to teach robots to move like humans - a discussion by Andrea Thomaz, assistant professor in the School of Interactive Computing at Georgia Tech's College of Computing, and Ph.D. student Michael Gielniak.

Simon is an upper-torso humanoid robot with two hands and a socially expressive head, designed by the Georgia Tech team headed by Thomaz.

The robot is similar to a human in appearance and weight, so that people can work with it without feeling uneasy. Series elastic actuators are used as motors, making its movements more compliant. Because the actuators are compliant, Simon cannot be controlled as precisely, but interaction with humans is safer.

2012 Robotics Open House

Demonstration of Simon the learning robot. Georgia Institute of Technology.

This compliance enables the robot to receive an object by sensing something being pressed into its hand, and to release an object when it senses someone pulling on it.
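As a rough sketch of how such force-based hand-overs could be detected, the interaction force at the hand of a series elastic actuator can be estimated from the deflection of the actuator's spring; the stiffness, thresholds, and function names below are hypothetical, not Simon's actual control code:

```python
# Hypothetical sketch: with a series elastic actuator, the interaction
# torque at the hand can be estimated from the spring deflection between
# the motor side and the load side, tau = k * (theta_motor - theta_load).
# The stiffness, thresholds, and readings below are illustrative only.
SPRING_STIFFNESS = 350.0   # N*m/rad, illustrative value
GIVE_THRESHOLD = 2.0       # torque pressed into the hand -> human offering an object
TAKE_THRESHOLD = -2.0      # torque pulling away -> human taking the object back

def estimated_torque(theta_motor: float, theta_load: float) -> float:
    """Estimate the interaction torque from the spring deflection."""
    return SPRING_STIFFNESS * (theta_motor - theta_load)

def handover_action(theta_motor: float, theta_load: float) -> str:
    """Decide whether to grasp, release, or keep holding."""
    torque = estimated_torque(theta_motor, theta_load)
    if torque > GIVE_THRESHOLD:
        return "close_hand"   # something pressed into the palm: accept the object
    if torque < TAKE_THRESHOLD:
        return "open_hand"    # the object is being pulled: release it
    return "hold"

# A deflection of 0.01 rad gives an estimated torque of 3.5 N*m -> close_hand.
print(handover_action(0.510, 0.500))
```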

Simon also has a socially expressive head with eyes, eyelids, and a neck. Computational models are being developed for eye-gaze behaviors appropriate to the situation. The robot also has two articulated ears with RGB LED panels that can display light in any color; the ears communicate emotional expressions and other non-verbal cues such as confusion, interest, or surprise.
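A minimal sketch of how an expressive state might be mapped to an ear LED color follows; the particular states and colors are assumptions rather than Simon's actual palette:

```python
# Illustrative mapping from an expressive state to an RGB color for the
# ear LED panels; the states and colors are assumptions, not Simon's
# actual palette.
EAR_COLORS = {
    "interest":  (0, 180, 255),    # cool cyan
    "confusion": (255, 140, 0),    # amber
    "surprise":  (255, 255, 255),  # bright white
    "neutral":   (40, 40, 40),     # dim idle glow
}

def ear_color(state: str) -> tuple:
    """Return the RGB value the ear panels would display for a state."""
    return EAR_COLORS.get(state, EAR_COLORS["neutral"])

print(ear_color("confusion"))  # -> (255, 140, 0)
```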

Simon's social attention capabilities were demonstrated at the CHI 2010 conference, where the robot showed that it had learned to look around in a socially appropriate way.

At present, Simon can perceive sounds and visual motion and determine what is salient in its environment. If the robot hears a loud sound, it turns in that direction and then turns back.
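A simplified sketch of that attention behavior might look like the following; the loudness threshold and interfaces are assumptions, not Simon's real software:

```python
# Simplified, hypothetical attention sketch: if a sound is loud enough to
# be salient, look toward it; otherwise return to a neutral "home" gaze.
# The threshold and interfaces are assumptions, not Simon's real software.
LOUDNESS_THRESHOLD = 0.7   # normalized loudness treated as salient
HOME_GAZE = 0.0            # neutral gaze direction in radians

def choose_gaze(loudness: float, sound_direction: float) -> float:
    """Return the gaze direction for the current sensing step."""
    if loudness > LOUDNESS_THRESHOLD:
        return sound_direction   # orient toward the salient sound
    return HOME_GAZE             # nothing salient: turn back to neutral

print(choose_gaze(0.9, 1.2))  # 1.2 -> turns toward a loud sound on its left
print(choose_gaze(0.1, 1.2))  # 0.0 -> quiet again, turns back
```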

Interactive task learning is another of Simon's capabilities: the robot learns how to tidy a workspace by learning which object goes where.

A human partner hands Simon an object and indicates where it should be placed, and Simon is encouraged to ask simple questions if needed. After a few examples, Simon learns the task and can then sort objects into the new locations on its own. Simon is also being trained to understand non-verbal gestures.
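A toy sketch of this kind of interactive sorting, in which the learner asks a simple question whenever it meets an unfamiliar object, might look like this (all names and the dialogue are illustrative assumptions):

```python
# Toy sketch of interactive task learning: sort objects into locations,
# asking the human partner a simple question when the object is unfamiliar.
# Category names and the dialogue are illustrative assumptions.
class InteractiveSorter:
    def __init__(self):
        # Maps an object category (e.g. "red_block") to a learned location.
        self.placements = {}

    def learn(self, category: str, location: str) -> None:
        """Record where the human partner placed an object of this category."""
        self.placements[category] = location

    def place(self, category: str) -> str:
        """Place an object, asking for help if the category is unknown."""
        if category in self.placements:
            return self.placements[category]
        # Unfamiliar object: ask a simple question instead of guessing.
        answer = input(f"Where should a {category} go? ")
        self.learn(category, answer)
        return answer

# A few demonstrations, then generalization to the same categories later.
sorter = InteractiveSorter()
sorter.learn("red_block", "left_bin")
sorter.learn("green_ball", "right_bin")
print(sorter.place("red_block"))   # -> left_bin, no question needed
```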

A recently developed robot, Bandit, may help children with autism understand emotional behavior and social cues better. Researchers at the Robotics Research Lab at the University of Southern California conducted studies in which children with autism played and interacted with Bandit.

Bandit is a small, human-like robot with a movable mouth and eyebrows and motion sensors that enable it to move forward or back away. The design team aimed to strike a balance between robot and human, so that the robot can be approached and engaged without being intimidating or unsettlingly realistic.

There was anecdotal evidence that children with autism respond favorably to robots and exhibit social behaviors they do not show with unfamiliar people. The researchers wanted to use human-like, child-sized robots that would serve as peers, not toys, when interacting with children.

In initial pilot experiments with the robots, Mataric and colleagues found that children with autism showed unexpected social behaviors, including initiating play, pointing, imitating the robot, and showing empathy. Bandit has a deliberately simple appearance with clear, easily read emotional expressions.

Applications for Social Robots

Possible applications for social and emotional robots include:

  • As conversational agents in education, for example in virtual learning situations where embodied conversational agents (ECAs) explain the use of educational software to pupils in a user-friendly manner.
  • Communicating with children diagnosed with autism.
  • In advisory services, making information systems more accessible by having conversational agents assist users as they work with specific systems.
  • As social caregivers, helping the elderly and patients who find it physically or mentally difficult to perform everyday tasks.

The objective is to create robotic systems that can help people who live alone, assisting with simple household tasks and calling for outside help when needed. Such robots can also serve as toys for children.

Future Developments

Four rosy-cheeked social robots, Sophie, Charles, Jack, and Matilda, are being used in a trial to improve the quality of life of patients with mild dementia. The social robot project is a joint research venture between NEC and La Trobe University.

It is anticipated that these assistive robots will help people with mild dementia by engaging them and providing sensory enrichment.

The robots can sing, dance, talk, play games, read the newspaper, and report the weather, making them unusually versatile companions.

They also offer novel services such as reminiscing with people with dementia, sending mood-based emails, and helping caregivers remotely manage the activities of those in their care.

Senior citizens with mild dementia can communicate with the social robots using voice or a touch panel with large buttons; the touch panel also enables remote communication with the robot at home.

The day when social robots are integrated into the social care system is not far off. At present, social robots are still being trialed, but further developments in communication and the exchange of emotions between robots and humans will help define a clear platform for their application in society.
