
Scientists Use Machine Learning to Develop Tactile, Vision-Based Sensor

Scientists at ETH Zurich have built an innovative yet affordable tactile sensor using machine learning. The sensor measures the distribution of force at high resolution and with excellent precision, allowing robot gripper arms to grasp delicate or fragile objects.

The tactile sensor prototype. Image Credit: ETH Zurich.

Humans can easily pick up slippery or delicate objects with their hands. Their sense of touch tells them whether an object is about to slip through their fingers or whether they have a firm hold, and they adjust the strength of their grip accordingly. Robot gripper arms tasked with picking up fragile or slippery objects, or objects with complex surfaces, need the same kind of feedback.

Robotics scientists at ETH Zurich have now created a new tactile sensor suited to exactly this kind of task, one that, according to them, represents an important milestone towards “robotic skin.”

The researchers pointed out that the tactile sensor can be produced cost-effectively thanks to its extremely simple design. The sensor essentially consists of an elastic silicone “skin” embedded with colored plastic microbeads and a standard camera attached to its underside.

Measurements Using Purely Optical Input

The tactile sensor is vision-based: when an object makes contact with the sensor, it indents the silicone skin. This shifts the pattern of microbeads, and a fisheye lens on the sensor’s underside registers the change. From these changes in the pattern, the force distribution on the sensor can be calculated.
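The idea of inferring force from bead movement can be sketched in a few lines. This is a purely illustrative toy, not the researchers' method: the bead positions are assumed to come from the camera image, and the linear displacement-to-force mapping with its `stiffness` constant is invented for the example.

```python
# Hypothetical sketch: estimating the lateral (shear) component of a
# contact force from the displacement of tracked microbeads between a
# rest frame and a pressed frame. Calibration constants are invented.

def bead_displacements(rest, pressed):
    """Per-bead (dx, dy) displacement between rest and pressed frames."""
    return [(px - rx, py - ry) for (rx, ry), (px, py) in zip(rest, pressed)]

def shear_estimate(displacements, stiffness=0.8):
    """Net shear force estimate: mean bead displacement times a
    hypothetical calibration constant (N per pixel)."""
    n = len(displacements)
    mean_dx = sum(dx for dx, _ in displacements) / n
    mean_dy = sum(dy for _, dy in displacements) / n
    return stiffness * mean_dx, stiffness * mean_dy

# Example: every bead shifts roughly +2 px in x, so the estimated
# shear force points along +x.
rest = [(10, 10), (20, 10), (10, 20), (20, 20)]
pressed = [(12, 10), (22, 10), (12, 21), (22, 21)]
fx, fy = shear_estimate(bead_displacements(rest, pressed))
```

A real implementation would track hundreds of beads and recover a full spatial force distribution rather than a single net vector, but the principle of reading force out of optical displacement is the same.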

Conventional sensors register the applied force at only a single point. By contrast, our robotic skin lets us distinguish between several forces acting on the sensor surface, and calculate them with high degrees of resolution and accuracy. We can even determine the direction from which a force is acting.

Carlo Sferrazza, Doctoral Student, ETH Zurich

Sferrazza is part of the team headed by Raffaello D’Andrea, Professor of Dynamic Systems and Control at ETH Zurich.

In simple terms, the scientists can detect not only forces that press vertically on the sensor but also laterally acting shear forces.

Data-Driven Development

The engineers used a detailed set of experimental data to determine which forces push the microbeads in which directions: they analyzed a wide range of contacts with the sensor in standardized, machine-controlled tests. This allowed the team to precisely control and systematically vary the location of the contact, the size of the contacting object, and the force distribution.

The scientists recorded several thousand examples of contact and used machine learning to accurately correlate them with the changes in the microbead pattern. The thinnest sensor prototype built so far is about 1.7 cm thick and spans a measurement surface of 5 cm x 5 cm.
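The calibration idea above, collecting controlled (displacement, force) pairs and fitting a model to them, can be illustrated with a minimal regression. The article does not specify the model the team trained; a one-dimensional least-squares line and the synthetic data below stand in for it purely as a sketch.

```python
# Illustrative calibration sketch: fit a model mapping observed bead
# displacement to applied force from controlled test presses.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Synthetic calibration data: bead displacement (px) vs. applied force (N).
disp = [0.0, 1.0, 2.0, 3.0, 4.0]
force = [0.0, 0.5, 1.0, 1.5, 2.0]
a, b = fit_line(disp, force)

# Once fitted, the model predicts force for new, unseen displacements.
predicted = a * 2.5 + b
```

The actual sensor learns a far richer mapping, from whole camera images to a spatial force distribution, but the workflow is the same: standardized presses generate labeled data, and a model is fitted to invert the skin's deformation.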

The scientists are now looking to apply the same method to sensors with larger surfaces fitted with multiple cameras, which could then recognize objects of complex shape. They also plan to make the sensor thinner; according to them, a thickness of just 0.5 cm is achievable with current technology.

Robotics, Sport, and Virtual Reality

Since the sensor can quantify shear forces and the elastic silicone is non-slip, it is perfect for use in robot gripper arms.

The sensor would recognise when an object threatens to slip out of the arm’s grasp so the robot can adjust its grip strength.

Carlo Sferrazza, Doctoral Student, ETH Zurich
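The slip-prevention behavior Sferrazza describes can be sketched as a simple control rule: if the sensed shear force approaches the friction limit allowed by the current grip force, squeeze harder. The friction coefficient, safety margin, and step size below are invented for illustration and are not from the article.

```python
# Hedged sketch of slip prevention using shear-force feedback.
# All constants are hypothetical.

MU = 0.5       # assumed friction coefficient of the silicone skin
MARGIN = 0.8   # act before the true slip threshold (mu * normal) is reached

def adjust_grip(normal_force, shear_force, step=0.5):
    """Return an updated grip (normal) force given the sensed shear force."""
    if abs(shear_force) > MARGIN * MU * normal_force:
        return normal_force + step   # object about to slip: squeeze harder
    return normal_force              # current grip is sufficient

# A 0.9 N shear load exceeds the 0.8 N safety threshold at a 2.0 N grip,
# so the controller tightens the grip.
grip = adjust_grip(normal_force=2.0, shear_force=0.9)
```

Because the sensor measures shear directly, such a controller can react before the object visibly moves, which is the advantage over single-point force sensors the researchers highlight.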

Such a sensor could also be used by scientists to digitally map the sense of touch or to test materials’ hardness. Incorporated into wearables, it could let runners measure the force entering their shoes while jogging, or cyclists measure how much force they transfer to the pedals.

Such sensors can also offer information that is crucial for developing tactile feedback, for instance, for virtual reality games.

Video: Allowing robots to feel. Video Credit: ETH Zurich.

