Robots will need the ability to sense when objects are slipping out of their grasp if they are to successfully disable a roadside bomb or handle an egg while making an omelet.
So far, however, most robotic and prosthetic hands have been unable to accurately sense the shear forces and vibrations that occur, for instance, when a finger slides along a tabletop or when an object starts to fall.
Recently, engineers from the University of Washington and UCLA have created a flexible sensor “skin” that can be stretched across any part of a robot’s body or prosthetic to accurately transmit information about vibration and shear forces that are important to effectively grasping and controlling objects.
The bio-inspired robot sensor skin, described in a paper published in Sensors and Actuators A: Physical, mimics the way a human finger experiences tension and compression as it slides along a surface or distinguishes among different textures. It measures this tactile information with a sensitivity and precision comparable to human skin, and could greatly improve the ability of robots to do everything from industrial and surgical tasks to cleaning a kitchen.
“Robotic and prosthetic hands are really based on visual cues right now — such as, 'Can I see my hand wrapped around this object?' or 'Is it touching this wire?' But that’s obviously incomplete information,” said senior author Jonathan Posner, a UW professor of mechanical engineering and of chemical engineering.
“If a robot is going to dismantle an improvised explosive device, it needs to know whether its hand is sliding along a wire or pulling on it. To hold on to a medical instrument, it needs to know if the object is slipping. This all requires the ability to sense shear force, which no other sensor skin has been able to do well,” Posner said.
A few robots today use fully instrumented fingers, but that sense of “touch” is limited to that one appendage, whose size and shape cannot be changed to suit different tasks. The other approach is to wrap a robot appendage in a sensor skin, which offers more design flexibility. But such skins have not yet delivered a full range of tactile information.
“Traditionally, tactile sensor designs have focused on sensing individual modalities: normal forces, shear forces or vibration exclusively. However, dexterous manipulation is a dynamic process that requires a multimodal approach. The fact that our latest skin prototype incorporates all three modalities creates many new possibilities for machine learning-based approaches for advancing robot capabilities,” said co-author and robotics collaborator Veronica Santos, a UCLA associate professor of mechanical and aerospace engineering.
The new stretchable electronic skin, which was created at the UW’s Washington Nanofabrication Facility, is composed of the same silicone rubber used in swimming goggles. The rubber is embedded with miniature serpentine channels — approximately half the width of a human hair — filled with electrically conductive liquid metal that will not fatigue or crack when the skin is stretched, as solid wires would do.
When the skin is stretched around a robot finger or end effector, these microfluidic channels are strategically positioned on either side of where a human fingernail would be.
As you slide your finger across a surface, one side of your nailbed bulges out while the other side is stretched under tension. The same thing happens with the robot or prosthetic finger: the microfluidic channels on one side of the nailbed are compressed while those on the other side are stretched.
As the channels' geometry changes, so does their electrical resistance and therefore the amount of current that can flow through them. The research team can measure these variations in resistance and correlate them with the shear forces and vibrations that the robot finger is experiencing.
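The physics behind that measurement can be sketched in a few lines. The sketch below is illustrative only, not the authors' implementation: it applies the standard resistance formula R = ρL/A to a liquid-metal channel whose volume is conserved when stretched, and shows how a differential reading between the two sides of the "nailbed" could separate shear from uniform pressure. The channel dimensions, strain values, and eGaIn resistivity are all assumed figures chosen for illustration.

```python
# Illustrative model (assumed parameters): resistance of a liquid-metal
# microchannel and a differential shear readout between two channels.

RHO_EGAIN = 2.9e-7  # ohm*m, approximate resistivity of a gallium-indium alloy

def channel_resistance(length_m, width_m, height_m):
    """Resistance of a rectangular channel: R = rho * L / A."""
    area = width_m * height_m
    return RHO_EGAIN * length_m / area

def stretched_resistance(length_m, width_m, height_m, strain):
    """Stretching conserves the liquid's volume: length grows by (1 + strain)
    while the cross-section shrinks by the same factor, so R scales
    as (1 + strain)**2."""
    new_length = length_m * (1 + strain)
    new_area = (width_m * height_m) / (1 + strain)
    return RHO_EGAIN * new_length / new_area

# Nominal channel: a 10 cm serpentine with a 50 um x 50 um cross-section
# (roughly half a human hair wide, as in the article).
R0 = channel_resistance(0.10, 50e-6, 50e-6)

# Under shear, one side of the nailbed is stretched (+2% strain, assumed)
# while the other side is compressed (-2%).
R_tense = stretched_resistance(0.10, 50e-6, 50e-6, +0.02)
R_compressed = stretched_resistance(0.10, 50e-6, 50e-6, -0.02)

# The differential signal isolates shear; the common-mode signal
# (both channels changing together) would track normal "push" force.
shear_signal = R_tense - R_compressed
normal_signal = 0.5 * (R_tense + R_compressed) - R0

print(f"Nominal resistance: {R0:.1f} ohm")
print(f"Differential (shear) signal: {shear_signal:.2f} ohm")
```

Because stretching and compression move the two resistances in opposite directions, subtracting them roughly doubles the shear signal while canceling changes that affect both channels equally.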
“It’s really following the cues of human biology,” said lead author Jianzhu Yin, who recently received his doctorate in mechanical engineering from the UW. “Our electronic skin bulges to one side just like the human finger does, and the sensors that measure the shear forces are physically located where the nailbed would be, which results in a sensor that performs comparably to human fingers.”
Positioning the sensors away from the part of the finger that is most likely to make contact makes it easier to differentiate shear forces from the normal "push" forces that also take place when interacting with an object, which has been tough to achieve with other sensor skin solutions.
The research team from the UW College of Engineering and the UCLA Henry Samueli School of Engineering and Applied Science has shown that the physically robust and chemically resistant sensor skin offers a superior level of precision and sensitivity for light-touch applications: opening a door, shaking hands, interacting with a phone, handling objects, picking up packages, among others. Recent experiments have shown that the skin can detect tiny vibrations at 800 times per second, better than human fingers.
“By mimicking human physiology in a flexible electronic skin, we have achieved a level of sensitivity and precision that’s consistent with human hands, which is an important breakthrough,” Posner said. “The sense of touch is critical for both prosthetic and robotic applications, and that’s what we’re ultimately creating.”
The National Science Foundation funded the research.