As technology continues to increase the complexity of tasks that robots can perform, more industries will integrate these systems into their daily production processes. It is therefore imperative that robots, particularly those that move and perform tasks autonomously, can safely detect moving objects and people in their surrounding environment.
Image Credit: Chesky/Shutterstock.com
Safety of Robots in the Workplace
Robotic systems are incorporated into a wide range of processes in medicine, domestic settings, warehouses, restaurants, hotels, agriculture, and other industries. As the role of robots within these sectors is expected to expand in the future, manufacturers have become increasingly interested in how they can incorporate collaborative robots into their production processes.
Various safety standards can be followed to ensure the appropriate integration of mobile robots into any workspace shared between human operators and robotic systems. Static approaches include limiting the power and force of robot manipulators to levels that are not harmful to human operators, using hand-guided robots that only perform tasks when operated by a human, and installing physical or virtual barriers that prevent robotic systems from entering the workspace of human operators.
Compared to these static approaches, there has been a recent shift towards adopting dynamic approaches to safety zoning within manufacturing settings. Dynamic safety zoning involves using different sensor technologies to adapt the robot's behavior according to the human operator's proximity. In addition to protecting the human operator from potential hazards, such dynamic systems can significantly reduce unwanted downtime in these workplaces.
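The proximity-based behavior adaptation described above can be sketched as a simple speed-limiting policy. The zone radii and speed limits below are illustrative assumptions, not values taken from any safety standard:

```python
# Minimal sketch of dynamic safety zoning (speed-and-separation
# monitoring). The distances and speed caps are hypothetical
# placeholders chosen for illustration only.

def speed_limit(distance_m: float) -> float:
    """Return a maximum robot speed (m/s) for a given
    human-to-robot distance reported by a proximity sensor."""
    if distance_m < 0.5:   # protective-stop zone: halt completely
        return 0.0
    if distance_m < 1.5:   # reduced-speed zone
        return 0.25
    if distance_m < 3.0:   # warning zone
        return 0.8
    return 1.5             # clear workspace: full-speed operation

print(speed_limit(0.3))  # 0.0 -> robot halts
print(speed_limit(2.0))  # 0.8 -> reduced speed
```

Because the robot slows rather than stops whenever a person is merely nearby, a policy of this shape is what allows dynamic zoning to cut downtime compared with a fixed protective fence.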
What is a LiDAR Sensor?
To ensure the safe and reliable performance of mobile robots in their surrounding dynamic environment, as well as during their interactions with other robots, sensors are used to monitor their autonomous movements.
Light detection and ranging (LiDAR) sensors, for example, rely on either two-dimensional (2D) or three-dimensional (3D) point clouds consisting of numerous data points to acquire positional information on objects within the sensor's range. These point clouds are processed with machine learning approaches such as triangulation-based spatial clustering to map workspaces and determine the distance between the robot and its surrounding environment.
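As a concrete illustration of how positional information is recovered from such data, the sketch below converts a hypothetical 2D scan of (angle, range) pairs, the form in which many 2D LiDAR sensors report measurements, into Cartesian points and finds the nearest obstacle. The scan values are invented for this example:

```python
import math

# Hypothetical 2D LiDAR scan: (angle_rad, range_m) pairs.
scan = [(0.0, 2.4), (0.1, 2.2), (0.2, 0.9), (0.3, 1.8)]

# Convert each polar measurement to a Cartesian point in the
# sensor's frame, then pick the point closest to the sensor.
points = [(r * math.cos(a), r * math.sin(a)) for a, r in scan]
closest = min(points, key=lambda p: math.hypot(*p))

print(round(math.hypot(*closest), 2))  # 0.9
```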
LiDAR sensors are often categorized as either navigation or obstacle avoidance sensors. Whereas navigation-based LiDAR sensors are typically used to construct a map of the surrounding environment, obstacle avoidance LiDAR sensors are more frequently installed in the body of mobile robots to detect approaching objects and prevent collisions. Although obstacle avoidance LiDAR sensors have a lower technical threshold than navigation LiDAR sensors, they are superior in target recognition capabilities.
Image Credit: Blue Planet Studio/Shutterstock.com
LiDAR Sensors and Mobile Robots
Although robotic systems equipped with LiDAR sensors can easily compute the distance between the robot and static objects, it becomes more challenging for these sensors to determine their distance from a moving object or person, particularly when the robot is also mobile.
This is because the point locations that would normally be used by the sensor are themselves moving, preventing the device from having a consistent reference location. To overcome this challenge, the sensor needs point locations that can be updated across successive point clouds to reflect moving objects.
3D LiDAR simultaneous localization and mapping (SLAM) is a method that scans the surrounding environment and triangulates relevant points from its point clouds. As the sensor acquires new distance measurements, it quickly compares this information to existing reference and non-coherent localization points.
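The comparison of new measurements against an existing reference can be sketched very crudely: for each point in a new scan, find its nearest neighbour in the reference cloud and average the offsets to estimate how the sensor moved. Real LiDAR SLAM pipelines use iterative alignment (e.g., ICP variants) and full pose estimation; this single-pass version is an illustrative assumption, not any published algorithm:

```python
def estimate_shift(reference, new_scan):
    """Crude single-pass estimate of the 2D translation between a
    reference point cloud and a new scan, via nearest-neighbour
    point association."""
    dx = dy = 0.0
    for x, y in new_scan:
        # Associate each new point with its closest reference point.
        nx, ny = min(reference,
                     key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
        dx += x - nx
        dy += y - ny
    n = len(new_scan)
    return dx / n, dy / n

ref = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
new = [(0.1, 0.0), (1.1, 0.0), (0.1, 1.0)]  # same points shifted in x
print(estimate_shift(ref, new))  # approx (0.1, 0.0)
```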
In visual SLAM, which relies on camera images to obtain measurements of nearby objects, environmental factors such as lighting, texture, and viewing-angle variations can limit the accuracy of the camera images.
Conversely, LiDAR SLAM algorithms have been developed to overcome these limitations and provide high-quality distance information, in the form of point clouds, on objects in the surrounding environment.
Recently, researchers assessed the accuracy of SLAM combined with a zero-measurement approach to interact with a moving unmanned vehicle (UV). As the UV moved, a 'zero-point' was assigned to every point in a new point cloud. Despite the potential of this combined approach, the researchers found that the system could not make new zero points for the UV as other objects moved in and out of its spatial environment.
Another proposed approach involves the sensor determining the distances between multiple points in a single measurement, creating a point cluster each time a different object is detected in the robot's spatial environment.
The point cluster principle has recently been tested using the Hokuyo UST-20LX 2D LiDAR sensor. Herein, the researchers transformed the detected 2D point clusters into 3D cloud points that provide accurate information on the distance between the moving robot and nearby human operators.
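The point-cluster principle can be sketched by grouping consecutive scan points whose mutual distance falls below a threshold, so that each cluster approximates one detected object. The gap threshold and the sample points are illustrative assumptions, not parameters from the cited study:

```python
import math

def cluster_points(points, gap=0.3):
    """Group consecutive 2D points into clusters: a new cluster
    starts whenever the gap to the previous point exceeds the
    threshold (in meters)."""
    clusters, current = [], [points[0]]
    for prev, pt in zip(points, points[1:]):
        if math.dist(prev, pt) <= gap:
            current.append(pt)   # same object: extend the cluster
        else:
            clusters.append(current)
            current = [pt]       # large gap: start a new cluster
    clusters.append(current)
    return clusters

pts = [(0.0, 1.0), (0.05, 1.02), (0.1, 1.01),   # object A
       (2.0, 0.5), (2.04, 0.52)]                # object B
print(len(cluster_points(pts)))  # 2
```

Each resulting cluster can then be reduced to a single distance (e.g., its closest point), which is the quantity a mobile robot needs when deciding how to react to a nearby operator.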
Commercially Available LiDAR Sensors for Robots
Numerous companies offer LiDAR sensors that can easily be incorporated into a wide range of robotic applications. Active Robots, for example, provides several different LiDAR sensors, including the Yujin YRL3 family of sensors. Neuvition also offers a line of LiDAR products, including the Titan S2, which has been specifically designed for indoor close-range applications.
References and Further Reading
Chemweno, P., & Torn, R. (2022). Innovative safety zoning for collaborative robots utilizing Kinect and LiDAR sensory approaches. 9th CIRP Conference on Assembly Technology and Systems 106; 209-214. doi:10.1016/j.procir.2022.02.180.
Chikurtev, D., Chivarov, N., Chivarov, S., & Chikurteva, A. (2021). Mobile robot localization and navigation using LIDAR and indoor GPS. IFAC-PapersOnLine 54(13); 351-356. doi:10.1016/j.ifacol.2021.10.472.
Su, Y., Wang, T., Shao, S., et al. (2021). GR-LOAM: LiDAR-based sensor fusion SLAM for ground robots on complex terrain. Robotics and Autonomous Systems 140. doi:10.1016/j.robot.2021.103759.
Active Robots. LiDAR Sensors [Online]. Available from: https://www.active-robots.com/mobile-robots/lidar-sensors.html.
Neuvition. LiDAR for Logistics Mobile Robots [Online]. Available from: https://www.neuvition.com/media/blog/logistics-mobile-robots.html.