
Touch-sensitive, Real-time Robotic Arm Developed

Image credit: Willyam Bradberry/Shutterstock.com

Engineers at Keio University in Japan have created a robotic arm capable of transmitting audio, video and highly sensitive touch sensations to remote users.

This ground-breaking robotic technology, described in a report published in the journal IEEE Transactions on Industrial Electronics, could have countless applications in industrial manufacturing, agriculture and medicine.

Robotics researchers are currently investigating solutions for a wide array of modern problems, including better ways to care for seniors in rapidly aging nations like Japan, easing labor-intensive agricultural operations and responding to extreme emergencies in which humans cannot directly search or attempt rescue, such as nuclear disasters.

To address these issues, many robotics scientists are looking into the idea of ‘haptics’, a way to convey information using touch. In its most basic form, haptics uses vibrations to provide information to users. For instance, the Apple Watch can alert a wearer to a new text message through a vibration pattern the wearer can feel and recognize.

Cutting-edge haptics technology is based largely on touch sensors, which can be challenging to calibrate and often break down in extreme conditions such as heat and radiation. Moreover, typical haptics technology relies on vibrations and is therefore not a complete replacement for human tactile sensations. As a result, haptics is currently useful for basic communication and entertainment, but remains quite limited in industrial applications.

Automated robotic arms are commonly used in industry, most notably for repetitive motions on automobile production lines. However, these arms only repeatedly perform a pre-programmed sequence of commands, grasping well-defined, solid parts used in car assembly.

The challenge for robotics engineers is to create a system that can identify the shape, firmness and location of an item, and manipulate it based on real-time instructions from a user situated away from the arm, essentially turning the arm into a real-time avatar.

Scientists at the Science and Technology and Haptics Research Center at Keio University may have expanded the use of robotics by developing an avatar robotic system with an arm capable of transmitting audio, video, movement and a delicate sense of touch via pressure to a remotely situated user in real time.

This ‘real-haptics’ is an integral part of the Internet of Actions (IoA) technology, having applications in manufacturing, agriculture, medicine, and nursing care.

Takahiro Nozaki, Study Author and Researcher, Keio University

The device can also record human movements, edit them and replay them. In addition, because the arm does not rely on typical touch sensors, it can be cheaper to build, smaller and more resistant to malfunction and noise, the researchers said.

The main technology powering this avatar robot is a high-precision motor, several of which have been integrated into the robot arm. Highly accurate control of pressure and position is key to conveying a sense of touch without the use of touch sensors.
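The idea of sensing touch through the motors themselves can be illustrated with a small sketch. This is not the actual Keio algorithm; the motor model, torque constant and inertia values below are illustrative assumptions. It shows the basic physics: if a motor's current produces more torque than its measured acceleration accounts for, the difference must be an external load, so contact force can be inferred without a dedicated force sensor.

```python
# Illustrative sketch (assumed model, not the Keio/Motion Lib algorithm):
# infer external load torque on a motor from current and acceleration,
# using the rigid-body motor equation  J*a = kt*i - tau_ext.

def estimate_load_torque(currents, accelerations, kt=0.5, inertia=0.01):
    """Estimate external load torque at each time step.

    kt      : assumed motor torque constant [N*m/A]
    inertia : assumed rotor inertia [kg*m^2]

    tau_ext = kt*i - J*a, i.e. the torque produced by the current
    that is not explained by the observed acceleration.
    """
    return [kt * i - inertia * a for i, a in zip(currents, accelerations)]

# Free motion: current-driven torque matches acceleration -> near-zero load.
free = estimate_load_torque([0.2], [10.0])
# Pressing on an object: acceleration stalls while current rises -> load felt.
contact = estimate_load_torque([1.0], [0.0])
```

In practice this kind of estimate is filtered (e.g. with a disturbance observer) to reject noise, but the sketch captures why accurate current and position control can stand in for a fragile touch sensor.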

To market the technology, the robotics team has launched a company called ‘Motion Lib’. Currently, the company's primary product is an integrated chip known as the ‘ABC-CORE’ IC force/tactile controller. The chip governs the force output of DC/AC servomotors and handles tactile transmission between two synchronized motors. Because the load pressure on the motor is determined by an algorithm in the chip, there is no need to install pressure or torque sensors.
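The "two synchronized motors" arrangement described above is a form of bilateral (leader/follower) control. The following sketch is a simplified illustration under assumed gains and signals, not the ABC-CORE's actual control law: the controller drives the position difference between the two motors to zero while forcing their load torques to sum to zero (action equals reaction), so the operator-side motor reproduces whatever resistance the remote arm encounters.

```python
# Illustrative bilateral control step (assumed gains, not the ABC-CORE law):
# synchronize two motors in position while reflecting contact force back.

def bilateral_command(x_leader, x_follower, f_leader, f_follower,
                      kp=100.0, kf=1.0):
    """Return (leader_torque, follower_torque) for one control step.

    kp : assumed position-synchronization gain
    kf : assumed force-reflection gain
    """
    pos_err = x_leader - x_follower      # should be driven to zero
    force_sum = f_leader + f_follower    # action-reaction: sum -> zero
    # The follower chases the leader's position; both motors share the
    # force correction so the operator feels the remote contact.
    tau_follower = kp * pos_err - kf * force_sum
    tau_leader = -kp * pos_err - kf * force_sum
    return tau_leader, tau_follower

# Perfectly synchronized, no contact: no corrective torque is needed.
idle = bilateral_command(0.0, 0.0, 0.0, 0.0)
```

The equal-and-opposite position terms pull the pair together, while the shared force term is what turns the remote arm into a touch-transmitting avatar rather than a one-way teleoperated tool.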

The report on the new haptic arm comes shortly after robotics researchers from the University of Washington announced the development of a flexible artificial skin capable of transmitting information about the shear forces and vibrations it detects. The researchers behind that breakthrough said the skin could be stretched over a robot’s body, giving the robot added touch capabilities.


Written by

Brett Smith

Brett Smith is an American freelance writer with a bachelor’s degree in journalism from Buffalo State College and has 8 years of experience working in a professional laboratory.

Published November 7, 2017, on AZoRobotics: https://www.azorobotics.com/News.aspx?newsID=9564

