Researchers Test 'Proprioceptive' Feedback with Skin-Stretch Device

US and Italian researchers working to add “muscle sense” to prosthetic limbs found that simple tactile feedback on the skin allowed blindfolded test subjects to more than double their ability to distinguish the size of objects gripped by a prosthetic hand.

Test subjects were more than twice as likely to correctly discern the size of objects grasped with a prosthetic hand when they received haptic feedback from a simple skin-stretch device on the upper arm. (Photo by Jeff Fitlow/Rice University)

Researchers from Rice University, the Research Center “E. Piaggio” of the University of Pisa and the Italian Institute of Technology (IIT) will present these findings next month in Germany.

Humans have an innate sense of how the parts of their bodies are positioned, even if they can’t see them. This ‘muscle sense’ is what allows people to type on a keyboard, hold a cup, throw a ball, use a brake pedal and do countless other daily tasks.

Marcia O’Malley, Professor of Mechanical Engineering, Rice University

Proprioception is the scientific term for this muscle sense. For years, O’Malley’s Mechatronics and Haptic Interfaces Lab (MAHI) has worked to develop technology that would allow amputees to receive proprioceptive feedback from artificial limbs.

In the paper, to be presented on June 7 at the World Haptics 2017 conference in Fürstenfeldbruck, O’Malley and colleagues show that 18 able-bodied test subjects performed significantly better on size-discrimination tests with a prosthetic hand when they received haptic feedback from a simple skin-stretch device on the upper arm. The study is the first to test a prosthesis together with a skin-stretch rocking device for proprioception, and the paper is a finalist for the conference’s best paper award.

Almost 1.7 million people in the U.S. live with the loss of a limb. Standard prostheses restore some day-to-day functions, but very few provide sensory feedback. For the most part, amputees today must watch their prosthesis in order to operate it properly.

Inexpensive sensors, advanced computer processors, vibrating motors from cellphones and other electronics have opened new possibilities for adding tactile feedback, also called haptics, to prosthetics. O’Malley’s lab has conducted research in this area for more than a decade.

We’ve been limited to testing haptic feedback with simple grippers or virtual environments that replicate what amputees experience. That changed when I was contacted last year by representatives of Antonio Bicchi’s research group at Pisa and IIT who were interested in testing their prosthetic hand with our haptic feedback system.

Marcia O’Malley, Professor of Mechanical Engineering, Rice University

Beginning late last year, in experiments conducted at Rice, MAHI’s Rice Haptic Rocker was tested in conjunction with the Pisa/IIT SoftHand by University of Pisa Graduate Student Edoardo Battaglia and Rice Graduate Student Janelle Clark. They measured the ability of blindfolded subjects to differentiate the size of grasped objects, both with and without proprioceptive feedback.

While some proprioceptive technologies require surgically implanted electrodes, the Rice Haptic Rocker has a simple, noninvasive user interface: a rotating arm that brushes a soft rubber pad over the skin of the upper arm. The arm rotates as the prosthetic hand closes, stretching the skin in proportion to how far the hand has closed.

We’re using the tactile sensation on the skin as a replacement for information the brain would normally get from the muscles about hand position. We’re essentially mapping feedback from one source onto an aspect of the prosthetic hand. In this case, it’s how much the hand is open or closed.

Janelle Clark, Graduate Student, Rice University
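
The mapping Clark describes is conceptually straightforward: the prosthetic hand’s degree of closure drives the rocker’s rotation, which in turn stretches the skin. The sketch below illustrates that idea in Python; the closure range, rocker angle limits and function name are illustrative assumptions, not details taken from the paper or the device.

```python
# Illustrative sketch (not the authors' implementation): map prosthetic hand
# closure to a skin-stretch rocker angle. Ranges and names are assumptions.

def closure_to_rocker_angle(closure: float,
                            min_angle_deg: float = -30.0,
                            max_angle_deg: float = 30.0) -> float:
    """Linearly map hand closure (0.0 = fully open, 1.0 = fully closed)
    to a rocker arm angle in degrees."""
    closure = max(0.0, min(1.0, closure))          # clamp to the valid range
    return min_angle_deg + closure * (max_angle_deg - min_angle_deg)


if __name__ == "__main__":
    for closure in (0.0, 0.25, 0.5, 1.0):
        angle = closure_to_rocker_angle(closure)
        print(f"closure {closure:.2f} -> rocker angle {angle:+.1f} deg")
```

The key design point is that the stretch is proportional to closure, so the wearer feels a continuous cue rather than a discrete on/off signal.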

Like the Rice Haptic Rocker, the SoftHand uses a simple design. Co-creator Manuel Catalano, a Postdoctoral Research Scientist at IIT/Pisa, said the design was inspired by neuroscience.

“Human hands have many joints and articulations, and reproducing and controlling that in a robotic hand is very difficult,” he said. “When you have to grasp something, your brain doesn’t program the movement of each finger. Your brain has patterns, called synergies, that coordinate all the joints (in the hand).”

Catalano said the Pisa/IIT SoftHand employs a control synergy just as people do in daily life. “At the same time, thanks to the intrinsic capability of the SoftHand to adapt and deform with the environment, it is robust and able to grasp objects in many different ways.”

Battaglia noted that neurological studies have identified a set of synergies for the hand. People use these, alone or in combination, to carry out tasks ranging from simple ones, such as turning a doorknob, to difficult ones, like playing the piano. The simplest task is grasping an object, like a coat hanger or a cup.

Experiments show that one synergy explains more than 50 percent of all grasps. SoftHand is designed to mimic this. It’s very simple. There is just one motor and one control wire to open and close all the fingers at once.

Edoardo Battaglia, Graduate Student, University of Pisa
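
Battaglia’s “one motor, one control wire” description can be pictured as a single scalar grasp command scaled by a fixed coordination pattern across all finger joints. The short sketch below illustrates that synergy idea; the joint names and coordination weights are made-up placeholders for illustration, not the SoftHand’s actual parameters.

```python
# Illustrative sketch of synergy-based grasp control (hypothetical weights,
# not the SoftHand's real parameters): one scalar command drives all finger
# joints through a fixed coordination pattern.

JOINT_NAMES = ["thumb", "index", "middle", "ring", "little"]

# Hypothetical first-synergy weights: how far each joint flexes per unit of
# the single grasp command (1.0 = that joint fully closed).
FIRST_SYNERGY = [0.8, 1.0, 1.0, 0.9, 0.7]


def synergy_grasp(command: float) -> dict:
    """Scale one grasp command (0 = open, 1 = closed) by the synergy pattern."""
    command = max(0.0, min(1.0, command))
    return {name: command * w for name, w in zip(JOINT_NAMES, FIRST_SYNERGY)}


if __name__ == "__main__":
    # All joints move together, in fixed proportion, from a single command.
    print(synergy_grasp(0.5))
```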

In tests, subjects used the SoftHand to grasp objects of different sizes and shapes, ranging from grapefruit-sized balls to quarters. Subjects closed the hand simply by flexing a muscle in their forearm. Electrodes taped to the arm picked up the electrical signals from the flexing muscle and transmitted them to the motor in the SoftHand.
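
The control path described here (surface electrodes, muscle signal, motor command) follows a standard myoelectric pattern. Below is a minimal, hypothetical sketch of such a signal path, rectifying and smoothing an EMG-like signal into a 0–1 closure command; the filter constant, gain and example signal are assumptions, not the study’s actual processing pipeline.

```python
# Hypothetical myoelectric control sketch (not the study's actual pipeline):
# rectify and low-pass filter a surface-EMG-like signal, then scale it into
# a 0..1 hand-closure command that could drive the prosthesis motor.

def emg_to_closure(samples, alpha=0.05, gain=2.5):
    """Return a closure command per sample from raw EMG-like values.

    alpha: smoothing factor of a simple exponential low-pass filter.
    gain:  scales the smoothed envelope into the 0..1 command range.
    """
    envelope = 0.0
    commands = []
    for s in samples:
        rectified = abs(s)                          # full-wave rectification
        envelope += alpha * (rectified - envelope)  # exponential smoothing
        commands.append(min(1.0, gain * envelope))  # clamp at fully closed
    return commands


if __name__ == "__main__":
    # Fake signal: rest, then a muscle-flex burst, then rest again.
    signal = [0.01] * 20 + [0.4, -0.5, 0.45, -0.35] * 10 + [0.01] * 20
    commands = emg_to_closure(signal)
    print(f"peak closure command: {max(commands):.2f}")
```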

For the size-discrimination test, subjects were blindfolded and instructed to grasp two different objects, then judge which of the two was larger. Without haptic feedback, the blindfolded subjects had to guess based on intuition; they chose correctly only 33 percent of the time, which is what one would expect from random choices.

When the same tests were carried out with feedback from the Rice Haptic Rocker, the subjects correctly identified the larger object more than 70 percent of the time.

The researchers are following up to see whether amputees get a similar benefit from using the Rice Haptic Rocker in conjunction with the SoftHand.

“One of the things that makes the research we do in the MAHI lab unique is that we involve end-users from the very beginning, from the design and concept stage all the way to testing and evaluation of our systems,” O’Malley said. “Through our close collaborations in the Texas Medical Center, we are able to have those interactions with the end users — with patients, physical therapists and doctors — all of the way through our design and evaluation process.”

Additional co-authors include Matteo Bianchi of the University of Pisa. The research was supported by the National Science Foundation and the European projects WEARHAP, SOFTPRO and SoftHands. The partnership will continue with support from a recently announced Rice University Award for International Collaboration, which will allow Janelle Clark to visit the University of Pisa for an extended period in 2018.
