Low-Cost, High-Precision Technology for Robot Motion Capture

In an article recently published in the journal Drones, researchers proposed an affordable, open-source, and easy-to-use robotic motion capture system, Easy Rocap, for unmanned platforms such as drones.

Limitations of Existing Approaches

Rapid technological advancements in robotics systems have greatly facilitated the progress of unmanned platforms like unmanned aerial vehicles (UAVs). In robotics applications, pose estimation plays a critical role in trajectory planning and execution, as a robot cannot build an accurate spatial representation of the environment without correctly estimating velocity and position.

Specifically, real-time pose information is critical for many mobile robots; for instance, accurate and fast pose estimation is essential for the local motion control of drones. Camera-based motion capture (Mocap) systems perform well in robotics applications and are now widely used on robots.

However, most camera-based Mocap systems, like infrared camera-based systems, are easily affected by camera occlusion and light noise. Moreover, common commercial Mocap systems are expensive and lack open-source hardware and software, further increasing the challenges of deploying them on robots.

The Proposed Solution

In this research, the authors introduced Easy Rocap, a straightforward, open-source, and affordable robotic motion capture solution designed to overcome the inaccessible source code of high-precision Mocap systems, the prohibitive cost of commercial Mocap setups, and interference from ambient light.

Easy Rocap can quickly and accurately estimate the orientation and position of drones in real time, using markers made of a special material as the mobile robot's tracking targets and several fixed cameras to perform three-dimensional (3D) intersections.

Unlike infrared camera-based Mocap systems, the proposed system combines object detection with multi-object tracking (MOT) and object-filtering algorithms to produce precise, continuous trajectories despite potential obstacles and noise.

The Easy Rocap system was developed using consumer-grade commercial IR RealSense cameras and comprises three key components: a two-stage key points detection model (TKDM), multi-view correspondences, and multi-view camera triangulation.

The TKDM with fine filtering was introduced to mitigate the effects of environmental light noise on visual detection.

This innovative approach integrates MOT and object detection within a multi-view correspondence framework. The adoption of a dual-layer detector significantly enhances object detection capabilities in intricate environments, while the MOT component ensures stable tracking of markers even in situations of brief occlusion.
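
The article does not detail the tracker itself, but a minimal sketch can illustrate the general idea of keeping a marker track alive through a brief occlusion with a constant-velocity prediction. The class name, the velocity blending factor, and the max_missed threshold below are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

class MarkerTrack:
    """Minimal 2D marker track that coasts through brief occlusions
    using a constant-velocity prediction (illustrative sketch only)."""

    def __init__(self, xy, max_missed=5):
        self.xy = np.asarray(xy, dtype=float)  # last known image position
        self.vel = np.zeros(2)                 # estimated pixel velocity
        self.missed = 0                        # consecutive frames without a detection
        self.max_missed = max_missed           # drop the track after this many misses

    def predict(self):
        # Expected position in the next frame under constant velocity.
        return self.xy + self.vel

    def update(self, detection_xy=None):
        """Feed one frame; pass None when the marker is occluded.
        Returns False once the track should be dropped."""
        if detection_xy is None:
            # Brief occlusion: keep the track alive on the prediction.
            self.missed += 1
            if self.missed <= self.max_missed:
                self.xy = self.predict()
            return self.missed <= self.max_missed
        detection_xy = np.asarray(detection_xy, dtype=float)
        # Blend the observed displacement into the velocity estimate.
        self.vel = 0.5 * self.vel + 0.5 * (detection_xy - self.xy)
        self.xy = detection_xy
        self.missed = 0
        return True
```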

Moreover, the method leverages geometric constraints within the multi-view correspondences to match corresponding points across images, ensuring accurate and reliable detection and tracking under a variety of challenging conditions.
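
As an illustration of such a geometric check, the sketch below scores a candidate cross-view match by its symmetric distance to the epipolar lines induced by a fundamental matrix relating two calibrated views. The function names and the pixel threshold are assumptions made for the example; the paper's exact formulation is not reproduced here.

```python
import numpy as np

def epipolar_distance(F, x1, x2):
    """Symmetric point-to-epipolar-line distance (in pixels) for a
    candidate correspondence between two views.

    F  : 3x3 fundamental matrix mapping view-1 points to epipolar lines in view 2
    x1 : (u, v) pixel coordinates of a marker detection in view 1
    x2 : (u, v) pixel coordinates of a candidate match in view 2
    """
    p1 = np.array([x1[0], x1[1], 1.0])
    p2 = np.array([x2[0], x2[1], 1.0])
    l2 = F @ p1        # epipolar line of x1 in view 2
    l1 = F.T @ p2      # epipolar line of x2 in view 1
    d2 = abs(p2 @ l2) / np.hypot(l2[0], l2[1])
    d1 = abs(p1 @ l1) / np.hypot(l1[0], l1[1])
    return 0.5 * (d1 + d2)

def is_consistent(F, x1, x2, threshold_px=2.0):
    # A candidate pair is accepted only if it lies close to the epipolar geometry.
    return epipolar_distance(F, x1, x2) < threshold_px
```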

Initially, a trained real-time object detector was used to generate bounding boxes of the markers and robots in each view. An object-filtering algorithm based on class labels and confidence scores was then applied to these detections to eliminate false positives.
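
A minimal sketch of such class- and confidence-based filtering is shown below. The detection format, class names, and the 0.5 confidence threshold are illustrative assumptions rather than values reported in the paper.

```python
# Filter raw detections by class label and confidence score (illustrative sketch).
def filter_detections(detections, keep_classes=("marker", "robot"), min_conf=0.5):
    """detections: list of dicts such as
    {"class": "marker", "conf": 0.87, "box": (x1, y1, x2, y2)}"""
    return [d for d in detections
            if d["class"] in keep_classes and d["conf"] >= min_conf]

raw = [
    {"class": "marker", "conf": 0.91, "box": (102, 54, 118, 70)},
    {"class": "marker", "conf": 0.32, "box": (400, 210, 410, 220)},  # likely noise
    {"class": "person", "conf": 0.88, "box": (50, 40, 120, 300)},    # wrong class
]
print(filter_detections(raw))  # keeps only the first, high-confidence marker
```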

The coefficient matrices of the multiple views were weighted by detection confidence to reduce the effect of occlusion in any single view on the 3D pose estimation results, ensuring that each view's contribution remained reasonable. MOT was then applied to maintain the continuity of the trajectories, and the epipolar constraint was enforced on the multi-view correspondences.
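
One common way to realize such confidence weighting is to scale the direct linear transformation (DLT) equations contributed by each view before solving the triangulation in a least-squares sense. The sketch below is a generic illustration under that assumption, not the authors' implementation; the function signature is invented for the example.

```python
import numpy as np

def weighted_triangulate(points_2d, proj_mats, confidences):
    """Linear (DLT) triangulation of one marker from several views,
    with each view's equations scaled by its detection confidence.

    points_2d   : list of (u, v) pixel coordinates, one per view
    proj_mats   : list of 3x4 camera projection matrices, one per view
    confidences : list of detection confidences in [0, 1], one per view
    """
    rows = []
    for (u, v), P, w in zip(points_2d, proj_mats, confidences):
        P = np.asarray(P, dtype=float)
        # Standard DLT rows for one view, scaled so that occluded or
        # low-confidence views contribute less to the solution.
        rows.append(w * (u * P[2] - P[0]))
        rows.append(w * (v * P[2] - P[1]))
    A = np.vstack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean 3D coordinates
```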

Ultimately, the system utilized calibrated multi-view cameras to compute the 3D coordinates of the markers, enabling the accurate determination of the target robot's three-dimensional pose. The orientation and position of the robot were deduced from the coordinates of several markers. This proposed system is designed for seamless integration into robotic systems, capitalizing on real-time data streams from multiple cameras to facilitate precise and dynamic pose estimation.
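
Given the triangulated marker coordinates and the known layout of the markers in the robot's body frame, the robot's rotation and translation can be recovered by a standard rigid alignment. The sketch below uses the Kabsch (Procrustes) method as one way to do this; it is an illustration under that assumption, not the paper's exact procedure.

```python
import numpy as np

def pose_from_markers(body_pts, world_pts):
    """Recover rotation R and translation t that map the known body-frame
    marker layout (body_pts, Nx3) onto the triangulated world-frame
    positions (world_pts, Nx3) via the Kabsch algorithm."""
    body_pts = np.asarray(body_pts, dtype=float)
    world_pts = np.asarray(world_pts, dtype=float)
    cb, cw = body_pts.mean(axis=0), world_pts.mean(axis=0)
    H = (body_pts - cb).T @ (world_pts - cw)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T    # body -> world rotation
    t = cw - R @ cb                            # body origin in the world frame
    return R, t
```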

Researchers performed extensive experiments in simulated environments and real scenes, using multiple cameras, to evaluate the precision and speed of the proposed motion capture system. They also investigated whether the system could provide real-time pose information to robots. UAVs and unmanned ground vehicles (UGVs) were used for the experiments.

Significance of the Work

The Easy Rocap system demonstrated fast speed and high precision for pose estimation. In the simulation experiment without obstructions, the proposed system's average position estimation error was less than 0.008 m, and the average orientation error was less than 0.65°.

These outcomes indicated the system's ability to provide consistent, high-precision position data in real time. However, the average position estimation error increased to 1.32 cm (0.0132 m) when the drone flew quickly around an obstacle.

In a practical experiment, researchers evaluated the performance of a new localization method against that of a sophisticated Light Detection and Ranging (LiDAR)-inertial Simultaneous Localization and Mapping (SLAM) algorithm. They discovered that while the SLAM technique tended to produce drifts during turns, the newly proposed Easy Rocap method effectively countered these accumulated errors and drifts, resulting in a more accurate and stable trajectory. This demonstrates the system's practical viability. Additionally, the system was capable of achieving pose estimation speeds of up to 30 Hz.

To summarize, the findings of this study demonstrated the feasibility of using the proposed Easy Rocap system for accurately capturing the orientation and location of an unmanned platform in complex scenes.

Journal Reference

Wang, H., Chen, C., He, Y., Sun, S., Li, L., Xu, Y., & Yang, B. (2024). Easy Rocap: A Low-Cost and Easy-to-Use Motion Capture System for Drones. Drones, 8(4), 137. https://doi.org/10.3390/drones8040137, https://www.mdpi.com/2504-446X/8/4/137

Written by

Bethan Davies
