Researchers Develop a Perception System for Soft Robots

Motivated by the way humans are able to process information regarding their own bodies in space and with respect to other people and objects, an international research team has created a novel perception system for soft robots.

The team describes the system, which combines soft sensors, a motion capture system, a neural network, and a soft robotic finger, in the January 30, 2019 issue of Science Robotics. (Image credit: University of California San Diego)


The researchers' ultimate objective is a system that can predict a robot's movements and internal state without relying on external sensors, much as humans do. In the Science Robotics paper, the team demonstrates this capability for a soft robotic finger. The work has implications for wearable robotics, human-robot interaction, and even soft devices for treating disorders affecting bones and muscles.

The system is intended to mimic the components humans use to navigate their environment: the motion capture system stands in for vision, the sensors for touch, the neural network for the brain, and the finger for the body interacting with the external world. The motion capture system is needed only to train the neural network and can be discarded once training is complete.

The advantages of our approach are the ability to predict complex motions and forces that the soft robot experiences (which is difficult with traditional methods) and the fact that it can be applied to multiple types of actuators and sensors. Our method also includes redundant sensors, which improves the overall robustness of our predictions.

Michael Tolley, Study Senior Author and Professor, Mechanical and Aerospace Engineering, University of California San Diego.

The scientists placed soft strain sensors at random locations on the soft robotic finger, knowing they would respond to a wide range of motions, and then applied machine learning methods to interpret the sensors' signals. This enabled the team, which includes researchers from the Bioinspired Robotics and Design Lab at UC San Diego, to predict both the forces exerted on the finger and its movements. The approach should allow investigators to build models that predict the deformations and forces a soft robotic system experiences as it moves.
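The paper itself uses recurrent neural networks to map the signals from redundant, randomly placed strain sensors to the finger's state. As a rough, self-contained illustration of that idea only, the sketch below substitutes a simple sliding-window least-squares regressor for the recurrent network and invents all the data: a hypothetical fingertip deflection trajectory and synthetic sensors that each respond nonlinearly (and noisily) to it with their own gain and offset. None of these signals or parameters come from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth fingertip deflection over time.
t = np.linspace(0, 20, 2000)
deflection = np.sin(t) + 0.3 * np.sin(3.1 * t)

# Redundant soft strain sensors: each responds nonlinearly to the
# deflection with its own gain, offset, and measurement noise.
n_sensors = 6
gains = rng.uniform(0.5, 1.5, n_sensors)
offsets = rng.uniform(-0.2, 0.2, n_sensors)
sensors = np.tanh(np.outer(deflection, gains) + offsets)
sensors += 0.02 * rng.standard_normal(sensors.shape)

# Stack a short window of past sensor readings as the model input,
# standing in for the temporal context a recurrent network would carry.
window = 5
X = np.hstack([sensors[i:len(sensors) - window + i + 1]
               for i in range(window)])
y = deflection[window - 1:]

# Split in time (train on the first half, test on the second),
# then fit an ordinary least-squares model with a bias term.
split = len(y) // 2
Xtr, ytr, Xte, yte = X[:split], y[:split], X[split:], y[split:]
w, *_ = np.linalg.lstsq(np.c_[Xtr, np.ones(len(Xtr))], ytr, rcond=None)
pred = np.c_[Xte, np.ones(len(Xte))] @ w

rmse = np.sqrt(np.mean((pred - yte) ** 2))
print(f"test RMSE: {rmse:.4f}")
```

The point of the sketch is the pipeline shape, not the model: redundant noisy sensors go in, a learned model comes out that estimates the body's state without any external motion capture at test time. The actual study's recurrent networks additionally capture history-dependent effects (such as hysteresis in soft materials) that this windowed linear stand-in cannot.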

This matters because the methods conventionally used in robotics to process sensor data cannot capture the intricate deformations of soft systems, and the data the sensors produce are themselves complex. Designing, fabricating, and placing sensors in soft robots is therefore challenging, but would be considerably easier if researchers had access to powerful predictive models. That is exactly what this work aims to provide.

Next steps for the study include increasing the number of sensors to better approximate the sensing capabilities of biological skin, and closing the loop for feedback control of the actuator.

Soft robot perception using embedded soft sensors and recurrent neural networks

Thomas George Thuruthel and Cecilia Laschi, The BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy; and Benjamin Shih and Michael Thomas Tolley, Department of Mechanical and Aerospace Engineering, UC San Diego.

Videos available here: http://bit.ly/SciRoJan19video
