Novel Intelligent Handheld Robot Developed at University of Bristol

What if handheld tools knew what needed to be done and could even guide and help inexperienced users to complete jobs that require skill? Researchers at the University of Bristol have developed, and have begun studying, a novel concept in robotics: intelligent handheld robots.

Novel intelligent handheld robot

Historically, handheld tools have been blunt, unintelligent instruments that are unaware of the context in which they operate, are fully directed by the user and, critically, lack any understanding of the task they are performing.

Dr Walterio Mayol-Cuevas and PhD student Austin Gregg-Smith, from the University’s Department of Computer Science, have been working on the design of robot prototypes as well as on understanding how best to interact with a tool that “knows and acts”. In particular, they have been comparing tools with increasing levels of autonomy.

Compared with conventional power tools, which have a motor and perhaps some basic sensors, the handheld robots developed at Bristol are designed with more degrees of motion, allowing greater independence from the motions of the user, and, importantly, they are aware of the steps being carried out. This allows a new level of co-operation between user and tool, with the user, for example, providing tactical motions or directions while the tool performs the detailed task.
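
As a rough illustration of this division of labour (this is a minimal sketch, not the Bristol team’s implementation, and all names and numbers are hypothetical), a simplified shared-control loop might let the user’s hand supply the coarse pose while the tool’s own actuated tip closes the remaining gap to the current task target:

```python
import numpy as np

# Hypothetical shared-control step: the user moves the tool body coarsely,
# and the tool's actuated tip compensates for the residual error to the target.

REACH = 0.10  # assumed actuation range of the tool tip, in metres


def tip_correction(handle_pos, target_pos, reach=REACH):
    """Return the tip offset (world-aligned here for simplicity) that moves
    the tip as close to the task target as the tool's mechanism allows."""
    error = np.asarray(target_pos, dtype=float) - np.asarray(handle_pos, dtype=float)
    dist = np.linalg.norm(error)
    if dist <= reach:
        return error                # the tool can fully compensate on its own
    return error / dist * reach     # otherwise saturate at the tip's reach


# Example: the user holds the tool 15 cm from the target; the tip covers 10 cm
# of that gap, leaving only 5 cm for the user to correct with gross motion.
handle = [0.00, 0.00, 0.00]
target = [0.15, 0.00, 0.00]
print(tip_correction(handle, target))   # ~ [0.1, 0.0, 0.0]
```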

Handheld robots aim to share physical proximity with users, but they are neither fully independent, as a humanoid robot is, nor part of the user’s body, as exoskeletons are. The aim with handheld robots is to capitalise on the intuitiveness of using traditional handheld tools while adding embedded intelligence and action to allow for new capabilities.

Dr Mayol-Cuevas, Reader in Robotics, Computer Vision and Mobile Systems, said: “There are three basic levels of autonomy we are considering: no autonomy, semi-autonomous when the robot advises the user but does not act, and fully autonomous when the robot advises and acts even by correcting or refusing to perform incorrect user actions.”
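
To make those three levels concrete, here is a minimal sketch (an assumption-laden illustration, not the team’s code; the function and names are hypothetical) of how a tool might gate advice and action depending on its autonomy level:

```python
from enum import Enum, auto


class Autonomy(Enum):
    NONE = auto()   # tool is passive: the user acts, the tool never advises or intervenes
    SEMI = auto()   # tool advises (e.g. indicates the next target) but does not act
    FULL = auto()   # tool advises and acts, and may refuse incorrect user actions


def decide(level, is_action_correct):
    """Return (advice_given, action_executed) for a proposed user action."""
    if level is Autonomy.NONE:
        return False, True                 # whatever the user does simply happens
    if level is Autonomy.SEMI:
        return True, True                  # advise, but leave execution to the user
    return True, bool(is_action_correct)   # FULL: only execute actions judged correct


# Example: in full autonomy the tool refuses an action it judges incorrect.
print(decide(Autonomy.FULL, is_action_correct=False))   # -> (True, False)
```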

The Design and Evaluation of a Cooperative Handheld Robot (ICRA 2015)

The Bristol team has been studying users’ task performance and preferences on two generic tasks: picking and dropping different objects to form tile patterns, and aiming in 3D for simulated painting.

Austin Gregg-Smith, a PhD student who is sponsored by the James Dyson Foundation, added: “Our results indicate that users tend to prefer a tool that is fully autonomous, and there is evidence of a significant impact on completion time and reduced perceived workload for the autonomous handheld robot. However, users sometimes also expressed how different it is to work with this type of novel robot.”

The researchers are currently investigating further topics in interaction, shared intelligence and new applications for field tasks. Given the difficulties of starting out in a new area of robotics, their robot designs are open source and available via http://www.handheldrobotics.org/

A paper about their recent work, which has been nominated for Best Cognitive Robotics Paper Award, Best Student Paper Award and Best Conference Paper Award, will be presented at this week’s IEEE International Conference on Robotics and Automation (ICRA).

Source: http://www.bristol.ac.uk
