Scientists have designed a new tool that eases the workload of robot operators. With the help of virtual reality (VR), the tool gives the user greater control over the robots under their command.
A researcher immersed in a mission. Credit: UPM
The work was carried out by scientists from the Robotics & Cybernetics Research Group (RobCib) at the Centre for Automation and Robotics (CAR).
The tool allows the operator to walk about and find the best spot to observe and correctly guide the robots while performing set tasks.
The interface provides operators with improved awareness of the robots' surroundings and reduces their workload.
Robot operators have to cope with a heavy workload: they must interpret mission information, make decisions, and issue commands to the robots.
Moreover, they have to maintain situational awareness, using the robot data to understand where each robot is and what it is doing at any given time.
"Today, missions require more operators than robots: one operator controls the motion while another assesses the robot data. The goal is to control several robots simultaneously with a single operator."
Juan Jesús Roldán, Co-developer
A team of scientists from CAR (UPM-CSIC) developed the VR interface for the tool, which uses machine learning methods to assess, as a human operator would, the potential risk to each robot during the mission.
This information is shown to the operator in real time: for example, a light highlights the drone that is performing the most critical task at any given moment.
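The idea of scoring each robot and spotlighting the one that most needs the operator's attention can be sketched as follows. This is a minimal illustration, not the published system: the state fields, weights, and the `attention_score` formula are all hypothetical stand-ins for whatever features the learned model actually uses.

```python
from dataclasses import dataclass

@dataclass
class RobotState:
    name: str
    battery: float            # 0.0 (empty) to 1.0 (full)
    obstacle_distance: float  # metres to the nearest obstacle
    task_criticality: float   # 0.0 to 1.0, set by the mission plan

def attention_score(r: RobotState) -> float:
    # Higher score = robot more likely to need the operator's attention.
    # The weights here are illustrative, not taken from the study.
    return (0.4 * (1.0 - r.battery)
            + 0.3 * (1.0 / (1.0 + r.obstacle_distance))
            + 0.3 * r.task_criticality)

def highlight_target(robots: list[RobotState]) -> RobotState:
    # Pick the robot the VR interface should spotlight.
    return max(robots, key=attention_score)

fleet = [
    RobotState("drone_a", battery=0.9, obstacle_distance=10.0, task_criticality=0.2),
    RobotState("drone_b", battery=0.3, obstacle_distance=1.5, task_criticality=0.8),
]
print(highlight_target(fleet).name)  # drone_b: low battery, close obstacle, critical task
```

In the real interface the score would come from a trained model rather than a hand-tuned formula, but the decision it drives is the same: pick one robot at a time to surface to the operator, so attention is directed rather than divided.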
The scientists performed a set of tests to evaluate the new interface and to compare it with a conventional interface.
The next steps for the research team are to incorporate a greater diversity of scenarios (indoor and outdoor) and robots (aerial and terrestrial), and test new approaches to allow operators to effortlessly and intuitively direct the robots.
This study was conducted as part of the SAVIER project (Situational Awareness VIrtual EnviRonment) of Airbus and is a collaboration between Universidad Politécnica de Madrid and the University of Luxembourg. The development was recently published in the journal