
3D Printed Robotic ‘Elephant Trunk’ is Guided by Machine Learning

Taking their inspiration from nature, researchers have designed a low-cost robotic arm that mimics the strength and flexibility of an elephant’s trunk.

Image Credit: Villiers Steyn/Shutterstock.com

In terms of design, the elephant's trunk is one of evolution’s foremost triumphs. The appendage — an extension of the animal’s nose and its upper lip — is both strong and flexible with a level of dexterity that is unmatched, even by our own arms.

In fact, with around 40,000 muscles, an elephant’s trunk contains considerably more muscles than the entire human body.

It is little wonder then that the elephant’s trunk has not only made these creatures among the most beloved of wild animals, but has also been eyed enviously by the designers of robotic devices, especially those built to grip and manipulate objects.

A team of scientists from the University of Tübingen and the Graz University of Technology is just one such group. They have devised a low-cost, modular robotic arm that can mimic the movements of an elephant's trunk. 

The arm can be 3D printed and features a gripper at its tip, much as an elephant’s trunk ends in a dexterous and surprisingly sensitive ‘finger’. The researchers, including Dr. Sebastian Otte, a postdoctoral researcher in the Cognitive Modeling group at the University of Tübingen, printed their design as a low-cost proof of concept.

“We constructed two different trunk-like robotic arms, which are not only easy to produce using standard desktop 3D-printers, with a total material cost of less than 500 Euros per robot, they are also controlled with SNNs,” says the team in their research paper. 

With this prototype, they performed a series of tests that included grasping marbles, picking them up, and gently placing them on podiums. They suggest that the arm’s success in these tricky demonstrations, coupled with the scalability of its production, points towards a potential future on industrial production lines, in the transportation of parts, and even in the delicate task of assembling electronics.

The team's research is detailed in the paper ‘Many-Joint Robot Arm Control with Recurrent Spiking Neural Networks.’¹

The Long and Short of Modular Design

The key to granting the team’s robot arm the flexibility of an elephant’s trunk lies in the device’s unique modular design.

The team created a set of stackable joint pieces, each of which has three possible directions of movement, or degrees of freedom. They then produced the prototype design with ten of these individual units. 

Motors within each unit drive gears that can tilt the segment by as much as 40° about two axes at once. This results in an impressive range of movement, and the team says that completed robots using their modules could be doubled in size to consist of as many as 20 units.
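To make the geometry concrete, here is a minimal sketch, not taken from the paper, of how a stack of such modules, each tilting up to 40° about two axes, composes into an overall arm pose via chained rotations; the segment length is an assumed, purely illustrative value.

```python
import numpy as np

def rot_x(a):
    """Rotation matrix about the x-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    """Rotation matrix about the y-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def tip_position(tilts, segment_length=0.05, max_tilt_deg=40.0):
    """Forward kinematics for a chain of identical tilting modules.

    tilts: array of shape (n_modules, 2) with the two tilt angles (degrees)
           of each module; values are clipped to +/-40 degrees, mirroring
           the joint limit described in the article.
    segment_length: assumed length of one module in metres (illustrative).
    Returns the 3D position of the arm's tip relative to its base.
    """
    tilts = np.clip(np.asarray(tilts, dtype=float), -max_tilt_deg, max_tilt_deg)
    R = np.eye(3)                      # accumulated orientation
    tip = np.zeros(3)                  # accumulated position
    for ax, ay in np.radians(tilts):
        R = R @ rot_x(ax) @ rot_y(ay)  # each module tilts about two axes
        tip = tip + R @ np.array([0.0, 0.0, segment_length])
    return tip

# Ten modules, all tilted 10 degrees about one axis: the arm curls over.
print(tip_position(np.column_stack([np.full(10, 10.0), np.zeros(10)])))
```

Even this toy model shows why control gets hard quickly: ten two-axis modules already give twenty coupled angles that all influence where the tip ends up.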

Of course, flexing and bending are not the only types of movements available to the trunk of an elephant. These animals can also shorten and lengthen their trunks thanks to the action of the muscles within them. 

And this is another property of the trunk that the team’s robot can mimic. 

Whilst a robot arm with such impressive flexibility is certainly an advantage, controlling a device with so many degrees of freedom is no small task.

Fortunately, the team was able to apply machine learning principles to address this significant challenge. 

A Neural Network to Control an Elephant’s Trunk

To control their trunk-like robotic arm and keep track of its intimidating range of movement, the team turned to an artificial neural network that closely resembles processes in the brain. These so-called spiking neural networks (SNNs) not only incorporate neuronal and synaptic states, but also factor time into their calculations.
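To illustrate what it means for a spiking network to factor time into its calculations, the sketch below simulates a single textbook leaky integrate-and-fire neuron. This is a generic teaching example rather than the specific architecture used in the paper, and the time constant, threshold, and input values are arbitrary.

```python
import numpy as np

def lif_neuron(input_current, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire neuron over time.

    input_current: 1D array of input values, one per time step.
    Returns the membrane potential trace and the spike train (0/1 per step).
    The neuron's state carries information from one step to the next,
    which is what lets spiking networks represent temporal structure.
    """
    v = v_reset
    potentials, spikes = [], []
    for i in input_current:
        # Leaky integration: the potential decays towards zero and is driven by input.
        v = v + dt * (-v / tau + i)
        if v >= v_thresh:            # threshold crossing emits a spike
            spikes.append(1)
            v = v_reset              # reset after spiking
        else:
            spikes.append(0)
        potentials.append(v)
    return np.array(potentials), np.array(spikes)

# A constant input slowly charges the neuron until it fires periodically.
_, spike_train = lif_neuron(np.full(200, 60.0))
print("spikes emitted:", int(spike_train.sum()))
```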

The team had their SNN observe a series of movements, mapping these motions to motor actions and the corresponding robot positions. This training procedure resulted in a set of models that could then be applied to ‘real-world’ navigation, allowing the arm to perform specific tasks with an impressive degree of precision.
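As a heavily simplified, non-spiking stand-in for that idea, the sketch below fits an ordinary least-squares map from hypothetical observed arm configurations to the motor commands that produced them. The data are random placeholders and the authors’ actual controller is a recurrent SNN, so this only illustrates the general shape of such a learned mapping.

```python
import numpy as np

# Hypothetical training data: each row pairs an observed arm configuration
# (here, 20 tilt angles for 10 two-axis modules) with the motor commands
# that produced it. In the real system such pairs come from watching the
# robot move; the values below are random placeholders.
rng = np.random.default_rng(0)
observed_configs = rng.uniform(-40, 40, size=(500, 20))        # degrees
motor_commands = observed_configs @ rng.normal(size=(20, 20)) * 0.01

# Fit a linear map from configuration to command by least squares.
# This plays the role of "learning the mapping"; the paper's recurrent
# spiking network also captures how that mapping unfolds over time.
W, *_ = np.linalg.lstsq(observed_configs, motor_commands, rcond=None)

# Given a desired configuration, predict the commands that should reach it.
target = rng.uniform(-40, 40, size=20)
predicted_commands = target @ W
print(predicted_commands.shape)   # (20,)
```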

“We did not only show that it is possible to construct low-cost trunk-like robotic arms with basic 3D-printing equipment, but we also demonstrated how they can be controlled using the latest recurrent spiking neural network architectures,” say the paper’s authors. “Our method allows the precise goal-direction steering of these robots with near millimeter tolerance and it is able to handle even extraordinary complex robot arms with up to 75 articulated degrees of freedom.”

As for future developments, the team says the next steps are to map motor commands directly onto the SNN without the need for training. The researchers also express a desire to incorporate radar-based distance sensors that could help the robot arms avoid collisions.

“This could also allow the robots to be used in soft robotic applications, where they might work together with humans in mutually beneficial interaction,” say the authors.

The team also expresses interest in applying the SNN approach it has pioneered to another member of the animal kingdom, for potentially life-saving uses. “Another interesting research path would be the integration of this method into snake-like robots instead of stationary robotic arms, where they could be employed in search and rescue operations.”

References and Further Reading

¹ Traub, M., Legenstein, R. and Otte, S. (2021) ‘Many-Joint Robot Arm Control with Recurrent Spiking Neural Networks’, arXiv. Available at: https://arxiv.org/pdf/2104.04064.pdf



Written by

Robert Lea

Robert is a Freelance Science Journalist with a STEM BSc. He specializes in Physics, Space, Astronomy, Astrophysics, Quantum Physics, and SciComm. Robert is an ABSW member, and a WCSJ 2019 and IOP Fellow.

