This Robot Finger Changes How Machines Interact With the World

A new robotic finger can sense what it’s holding using just a tiny built-in camera, replacing traditional force sensors with smart vision. This approach could make future robots safer, cheaper, and better at handling delicate items like fruit or small parts.

Image: robot hand holding a banana. Study: Affordable 3D-printed miniature robotic gripper with integrated camera for vision-based force and torque sensing. Image Credit: RaffMaster/Shutterstock.com

In a recent article published in npj Robotics, researchers introduced a compliant robotic gripper that estimates gripping force and torque by tracking how its own fingers deform, all via an integrated miniature camera.

The design centers on a low-cost, disposable 3D-printed finger and a streamlined calibration process that requires minimal data. Initial lab tests showed accurate force sensing, with strong potential for manipulating both rigid and fragile objects using visual feedback and force control.

Background

As robotics shifts toward small-batch and flexible production, there’s growing demand for grippers that can reliably handle a wide range of object types, including fragile or irregular ones.

Traditional rigid grippers, built for repetitive industrial tasks, often fall short. They’re difficult to miniaturize, risk damaging delicate items, and lack the adaptability needed for more varied tasks.

Compliant grippers, made from flexible materials, offer a more adaptable alternative. They can naturally conform to different shapes and, importantly, estimate the forces they apply by measuring their own deformation. While some prior systems used cameras and deep learning to interpret this deformation, they typically required large datasets and bulky hardware, including externally mounted cameras. These trade-offs have limited their practicality.

The new approach, called Seezer, takes a different route.

It features a monolithic 3D-printed finger with an embedded miniature camera and uses a force estimation method that requires only a small amount of training data. The result is a compact, lightweight, and more accessible solution for force-sensitive robotic manipulation.

System Design and Methods

Seezer is built around a modular, single-piece compliant finger, 3D-printed with a built-in gear segment and flexible joint (an “x-joint”) for actuation. Each fingertip has two fiducial markers, and all fingers are driven in sync by a single stepper motor via a worm gear.

A miniature camera, mounted at the gripper’s base, faces the fingertip markers to monitor their movement during gripping.

A key mechanical feature is a tension-based coupling that lets a single motor actuate all of the fingers while still allowing the disposable finger modules to be attached or detached easily.

The system estimates force and torque in three vision-based steps (a simplified code sketch follows the list):

  1. Marker Tracking: It continuously tracks the 2D positions and sizes of six fiducial markers using the integrated camera feed.
  2. Finger Force Estimation: A pre-calibrated linear model maps deviations between expected and observed marker positions (based on motor angle) to a 3D force vector at each fingertip.
  3. Gripper Force/Torque Estimation: A physics-based model then combines the fingertip forces into a full six-axis force/torque estimate, including the net gripping force.
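To make the pipeline concrete, here is a minimal Python sketch of steps 2 and 3 under illustrative assumptions: the expected-marker model, the per-finger calibration matrices, and the closing directions are all placeholders, and the paper's full six-axis wrench computation is reduced to a scalar grip-force estimate. It is a sketch of the idea, not the authors' implementation.

```python
# Minimal sketch of the marker-deviation -> force pipeline described above.
# All names, shapes, and numbers are illustrative assumptions.
import numpy as np

N_FINGERS = 3
MARKERS_PER_FINGER = 2
FEATURES_PER_MARKER = 3          # (u, v, apparent size) per marker

def expected_marker_features(motor_angle: float) -> np.ndarray:
    """Marker features predicted for an unloaded finger at this motor angle.
    A simple linear placeholder; the real mapping would come from calibrating
    the unloaded gripper."""
    base = np.linspace(0.1, 0.6, MARKERS_PER_FINGER * FEATURES_PER_MARKER)
    return base + 0.05 * motor_angle

def estimate_fingertip_force(observed: np.ndarray,
                             motor_angle: float,
                             A: np.ndarray) -> np.ndarray:
    """Step 2: map the deviation between observed and expected marker features
    to a 3D fingertip force using a pre-calibrated linear model A (3 x 6)."""
    deviation = observed - expected_marker_features(motor_angle)
    return A @ deviation                      # [Fx, Fy, Fz] at the fingertip

def net_grip_force(fingertip_forces: list[np.ndarray],
                   closing_dirs: list[np.ndarray]) -> float:
    """Step 3 (simplified): project each fingertip force onto that finger's
    closing direction and average, giving a scalar grip-force estimate.
    The paper combines fingertip forces into a full six-axis wrench."""
    return float(np.mean([f @ d for f, d in zip(fingertip_forces, closing_dirs)]))

# --- toy usage with made-up numbers -------------------------------------
rng = np.random.default_rng(0)
A_calib = [rng.normal(size=(3, 6)) * 0.1 for _ in range(N_FINGERS)]   # per-finger models
closing = [np.array([1.0, 0.0, 0.0])] * N_FINGERS                     # assumed closing axes
angle = 0.8                                                            # motor angle, rad

forces = []
for i in range(N_FINGERS):
    observed = expected_marker_features(angle) + rng.normal(scale=0.01, size=6)
    forces.append(estimate_fingertip_force(observed, angle, A_calib[i]))

print(f"estimated grip force: {net_grip_force(forces, closing):.3f} N")
```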

Calibration was done with a custom test rig where Seezer performed controlled motions against a high-precision force/torque sensor. Marker data and motor angles were used to fit the finger-specific linear models. These models were then validated in follow-up experiments.
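The fitting step itself can be illustrated with a short least-squares sketch. The sample count, noise level, and data below are invented for demonstration and are not the authors' calibration data; they only show how a small number of marker-deviation/reference-force pairs could determine a per-finger linear model.

```python
# Hedged sketch of the calibration step: fit a linear map from marker-feature
# deviations to fingertip forces against readings from a reference F/T sensor.
import numpy as np

rng = np.random.default_rng(1)

# Suppose K calibration samples were collected on the test rig:
#   X[k] = marker-feature deviation (6 values) at sample k
#   Y[k] = reference force (3 values) from the high-precision F/T sensor
K = 60                                   # in the range of the 31-141 samples reported
A_true = rng.normal(size=(3, 6))         # unknown "true" model, for demo only
X = rng.normal(size=(K, 6))
Y = X @ A_true.T + rng.normal(scale=0.01, size=(K, 3))

# Ordinary least squares: find A minimising ||X A^T - Y||
A_fit, *_ = np.linalg.lstsq(X, Y, rcond=None)   # solution has shape (6, 3)
A_fit = A_fit.T                                  # store as (3, 6) like the model above

print("max coefficient error:", np.abs(A_fit - A_true).max())
```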

To explore practical use cases, the researchers tested Seezer in two tasks: picking and placing small gears, and gently harvesting soft redcurrants, both of which used its internal vision system for force feedback.

Results and Discussion

In lab tests, Seezer delivered accurate and efficient force estimation. For the “coarse” finger version, average gripping force errors ranged from 0.09 to 0.19 newtons (N), while full six-axis force/torque estimates showed relative errors between 8% and 24% compared to a high-precision reference sensor.

Notably, this level of accuracy required just 31 to 141 simple calibration samples, far fewer than the large training datasets typically needed for deep learning-based approaches.

A softer “fine” finger version was also tested, offering better sensitivity for detecting smaller forces. This came with a slight trade-off in precision, with relative errors ranging from 10% to 29%.

Two practical demonstrations showed how the system performs in different scenarios.

In one, Seezer autonomously carried out hour-long pick-and-place operations with small gears, using its built-in camera for visual feedback. In another, it successfully picked a delicate redcurrant berry, stopping finger closure as soon as its estimated grip force reached 400 millinewtons. This was enough to hold the fruit without causing damage.
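The stopping behaviour in the berry-picking demonstration amounts to a simple force-limited closing loop. The sketch below is a hypothetical rendering of that idea, with a stand-in force estimator and motor step size rather than the actual controller; only the 0.4 N (400 mN) threshold comes from the reported experiment.

```python
# Illustrative force-limited closing loop for the berry-picking demonstration:
# close the fingers in small increments and stop once the estimated grip force
# reaches 0.4 N. The force estimator and motor interface are stand-ins.
import random

FORCE_LIMIT_N = 0.4      # stop threshold reported for the redcurrant demo
STEP_RAD = 0.01          # hypothetical motor increment per control cycle

def estimated_grip_force(angle: float) -> float:
    """Stand-in for the vision-based estimate; force rises once the fingers
    contact the berry at an arbitrary angle."""
    contact_angle = 0.5
    return max(0.0, (angle - contact_angle) * 2.0) + random.uniform(-0.01, 0.01)

angle = 0.0
while estimated_grip_force(angle) < FORCE_LIMIT_N:
    angle += STEP_RAD            # command the stepper motor one step further

print(f"stopped closing at angle {angle:.2f} rad, ~{FORCE_LIMIT_N} N grip force")
```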

That said, some limitations remain.

The system currently depends on stable lighting and a uniform background for reliable marker tracking. Adapting it to more dynamic, unstructured environments will require improved tracking robustness. Processing speed is also limited by the camera and onboard computer. And while the disposable 3D-printed fingers are low-cost and easy to swap, more work is needed to assess how well they hold up under repeated use and long-term material fatigue.

Conclusion

The Seezer gripper provides a compact, affordable approach to force-sensitive robotic manipulation. Its design, featuring a compliant 3D-printed finger with an embedded miniature camera, enables accurate force sensing through a calibration process that requires only minimal data.

Demonstrated success in both rigid object handling and delicate tasks like berry harvesting highlights the system’s adaptability.

While improvements in tracking robustness and processing speed are still needed, this work marks a step toward practical, low-cost grippers suited for delicate manipulation in flexible production environments. 

Journal Reference

Duverney, C., Gerig, N., Hüls, D., Niemeyer, C., Cattin, P. C., & Rauter, G. (2026). Affordable 3D-printed miniature robotic gripper with integrated camera for vision-based force and torque sensing. npj Robotics, 4(1). DOI:10.1038/s44182-026-00075-2. https://www.nature.com/articles/s44182-026-00075-2

