Controlling Multiple Robots Simultaneously Through VR

Scientists have designed a new tool that eases the workload of robot operators. With the help of virtual reality (VR), the tool gives the user greater control over the robots under their command.

A researcher immersed in a mission. Credit: UPM

The work was carried out by scientists from the Robotics & Cybernetics Research Group (RobCib) at the Centre for Automation and Robotics (CAR).

The tool allows the operator to walk about and find the best spot to observe and correctly guide the robots while performing set tasks.

The interface gives operators improved awareness of the robots' surroundings and reduces their workload.

Robot operators have to cope with a massive workload: they must interpret mission information, make decisions, and issue commands to the robots.

Moreover, they must maintain situational awareness, comprehending the situation from the robot data and knowing where the robots are and what they are doing at any given time.

"Today, missions require more operators than robots: one operator controls the motion and another assesses the robot data. The goal is to control several robots simultaneously with a single operator."

Juan Jesús Roldán, Co-developer

A team of scientists from CAR (UPM-CSIC) developed the VR interface for the tool, which uses machine learning methods to assess, as a human operator would, the potential risk to each robot during the mission.

At the same time, this information is shown to the operator: for example, a light highlights the drone that is performing the most critical task.

The scientists performed a set of tests to evaluate the new interface and compare it with a conventional interface.

The next steps for the research team are to incorporate a greater diversity of scenarios (indoor and outdoor) and robots (aerial and terrestrial), and test new approaches to allow operators to effortlessly and intuitively direct the robots.

This study was conducted as part of the SAVIER project (Situational Awareness VIrtual EnviRonment) of Airbus and is a collaboration between Universidad Politécnica de Madrid and the University of Luxembourg. The development was recently published in the journal Sensors.
