Finding lost hikers in forests can be challenging and time-intensive, since drones and helicopters cannot spot them through the thick tree canopy. Researchers have recently hypothesized that autonomous drones able to bob and weave through the trees could aid these searches. However, the GPS signals used to guide such aircraft can be unreliable or nonexistent in forest environments.
MIT researchers describe an autonomous system for a fleet of drones to collaboratively search under dense forest canopies using only onboard computation and wireless communication—no GPS required. (Image credit: Melanie Gonick)
In a paper being presented at the International Symposium on Experimental Robotics conference next week, MIT researchers describe an autonomous system that enables a fleet of drones to collaboratively search under dense forest canopies. The drones use only onboard computation and wireless communication; no GPS is required.
Each autonomous quadrotor drone is equipped with laser-range finders for position estimation, localization, and path planning. As it flies, each drone builds its own 3D map of the terrain. Algorithms help it recognize unexplored and already-searched spots, so it knows when an area has been fully mapped. An off-board ground station fuses the individual maps from multiple drones into a global 3D map that human rescuers can monitor.
In a practical deployment, though not in the current system, the drones would be equipped with object detection to identify a missing hiker. Once the hiker is located, the drone would tag the hiker's location on the global map, and rescuers could then use that information to plan a rescue mission.
“Essentially, we’re replacing humans with a fleet of drones to make the search part of the search-and-rescue process more efficient,” stated first author Yulun Tian, a graduate student in the Department of Aeronautics and Astronautics (AeroAstro).
The researchers tested multiple drones in simulations of randomly generated forests, and tested two drones in a forested area within NASA’s Langley Research Center. In both experiments, each drone mapped a roughly 20-square-meter area in about two to five minutes, and the drones collaboratively fused their maps together in real time. The drones also performed well across several metrics, including overall speed and time to complete the mission, detection of forest features, and accurate merging of maps.
Katherine Liu, a PhD student in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and AeroAstro; Kyel Ok, a PhD student in CSAIL and the Department of Electrical Engineering and Computer Science; Loc Tran and Danette Allen of the NASA Langley Research Center; Nicholas Roy, an AeroAstro professor and CSAIL researcher; and Jonathan P. How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics are the co-authors of the paper.
(Video credit: MIT)
Exploring and mapping
A LIDAR system mounted on each drone creates a 2D scan of the surrounding obstacles by shooting laser beams and measuring the reflected pulses, which can be used to detect trees. To a drone, however, individual trees look remarkably similar. If a drone cannot recognize a given tree, it cannot determine whether it has already explored an area.
Instead, the drones were programmed to recognize the orientations of multiple trees, which are far more distinctive. With this technique, when the LIDAR signal returns a cluster of trees, an algorithm calculates the angles and distances between the trees to identify that cluster. “The drones can use that as a unique signature to tell if they’ve visited this area before or if it’s a new area,” stated Tian.
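The cluster-signature idea can be illustrated with a short sketch. The code below is a hypothetical simplification, not the paper's actual descriptor: it builds a signature from the sorted pairwise distances between tree trunks (the real method also uses angles), so the same cluster produces the same signature regardless of the direction the drone approaches from.

```python
import math

def cluster_signature(trees, tol=0.1):
    """Build a rotation-invariant signature for a cluster of tree
    positions (x, y) from the sorted pairwise trunk distances.
    Illustrative sketch only; the paper's descriptor also uses angles."""
    dists = []
    for i in range(len(trees)):
        for j in range(i + 1, len(trees)):
            dx = trees[i][0] - trees[j][0]
            dy = trees[i][1] - trees[j][1]
            dists.append(math.hypot(dx, dy))
    # Round to a tolerance so small LIDAR noise maps to the same signature.
    return tuple(round(d / tol) for d in sorted(dists))

def same_cluster(a, b, tol=0.1):
    """Two observed clusters match when their signatures agree."""
    return cluster_signature(a, tol) == cluster_signature(b, tol)
```

A drone could keep a set of signatures for visited clusters and test each new LIDAR return against it to decide whether an area is new.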
This feature-detection technique helps the ground station accurately merge maps. The drones generally explore an area in loops, producing scans as they go, which the ground station continuously monitors. If two drones loop around the same cluster of trees, the ground station merges their maps by calculating the relative transformation between the drones, then fusing the individual maps to maintain consistent orientations.
“Calculating that relative transformation tells you how you should align the two maps so it corresponds exactly to how the forest looks,” stated Tian.
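In 2D, a relative transformation of this kind is a rotation plus a translation, and it can be estimated in closed form from matched landmark positions. The sketch below uses the standard least-squares (Kabsch-style) solution on matched tree positions; it is an assumed illustration of the general idea, not the paper's exact formulation.

```python
import math

def relative_transform(points_a, points_b):
    """Estimate the 2D rigid transform (rotation theta, translation tx, ty)
    mapping points_a onto points_b, given matched tree positions observed
    by two drones. Closed-form least-squares sketch."""
    n = len(points_a)
    cax = sum(p[0] for p in points_a) / n
    cay = sum(p[1] for p in points_a) / n
    cbx = sum(p[0] for p in points_b) / n
    cby = sum(p[1] for p in points_b) / n
    # Cross-covariance terms of the centered point sets.
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(points_a, points_b):
        ax, ay = ax - cax, ay - cay
        bx, by = bx - cbx, by - cby
        sxx += ax * bx
        sxy += ax * by
        syx += ay * bx
        syy += ay * by
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    tx = cbx - (c * cax - s * cay)
    ty = cby - (s * cax + c * cay)
    return theta, tx, ty

def apply_transform(theta, tx, ty, point):
    """Map a point from drone A's map frame into drone B's map frame."""
    c, s = math.cos(theta), math.sin(theta)
    x, y = point
    return c * x - s * y + tx, s * x + c * y + ty
```

Once the transform is known, every point in one drone's map can be re-expressed in the other's frame, which is what keeps the fused global map consistent.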
On the ground station, robotic navigation software known as “simultaneous localization and mapping” (SLAM), which maps an unknown area while keeping track of an agent within it, uses the LIDAR input to localize and track the drones’ positions. This helps fuse the maps accurately.
The result is a map with 3D terrain features. Trees appear as blocks in shades of blue to green, depending on height. Unexplored areas start out dark but turn gray once a drone maps them. On-board path-planning software tells each drone to always explore these dark unexplored areas as it flies. According to Tian, generating a 3D map is more reliable than simply attaching a camera to a drone and monitoring the video feed; transmitting video to a central station, for example, requires a lot of bandwidth that may not be available in forested areas.
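The "always head for dark cells" behavior described above is a form of frontier-based exploration, and a minimal version can be sketched on a grid map. The cell states and breadth-first search below are illustrative assumptions, not the paper's planner.

```python
from collections import deque

# Illustrative cell states: dark = UNKNOWN, gray (mapped) = FREE.
UNKNOWN, FREE, OBSTACLE = 0, 1, 2

def nearest_frontier(grid, start):
    """Breadth-first search from the drone's cell to the nearest UNKNOWN
    cell, moving only through already-mapped free space. A minimal sketch
    of frontier-driven exploration."""
    rows, cols = len(grid), len(grid[0])
    seen = {start}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if grid[r][c] == UNKNOWN:
            return r, c  # closest dark cell to explore next
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in seen and grid[nr][nc] != OBSTACLE):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return None  # no unknown cells remain: the area is fully mapped
```

Returning `None` is exactly the "area has been fully mapped" condition: exploration ends when no dark cells are reachable.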
More efficient searching
A key innovation is a novel search strategy that lets the drones explore an area more efficiently. Under a more conventional strategy, a drone would always explore the closest possible unknown area, which can lie in any direction from its current position. The drone typically flies a short distance, then stops to select a new direction.
“That doesn’t respect dynamics of drone [movement],” stated Tian. “It has to stop and turn, so that means it’s very inefficient in terms of time and energy, and you can’t really pick up speed.”
Instead, the team’s drones explore the closest possible area while accounting for their current speed and direction, maintaining a consistent velocity. With this strategy, a drone tends to travel in a spiral pattern, covering a search area much faster. “In search-and-rescue missions, time is very important,” stated Tian.
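One way to capture this velocity-aware behavior is to score candidate frontier points by distance plus a penalty for turning away from the current direction of travel. The sketch below is a hedged illustration of that trade-off; the `heading_weight` parameter and the cost function itself are illustrative assumptions, not taken from the paper.

```python
import math

def pick_goal(position, velocity, frontiers, heading_weight=2.0):
    """Choose the next frontier point, penalizing goals that would force
    the drone to turn sharply away from its current heading. Sketch of
    velocity-aware goal selection; heading_weight is a made-up tuning knob."""
    px, py = position
    vx, vy = velocity
    heading = math.atan2(vy, vx)
    best, best_cost = None, float("inf")
    for fx, fy in frontiers:
        dist = math.hypot(fx - px, fy - py)
        bearing = math.atan2(fy - py, fx - px)
        # Smallest absolute angle between current heading and the goal.
        turn = abs((bearing - heading + math.pi) % (2 * math.pi) - math.pi)
        cost = dist + heading_weight * turn
        if cost < best_cost:
            best, best_cost = (fx, fy), cost
    return best
```

With this cost, a drone already moving forward keeps choosing goals roughly ahead of it rather than the strictly nearest one behind it, which is the behavior that lets it maintain speed.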
In the paper, the researchers compared their new search strategy against a conventional baseline. Compared with that baseline, the team’s strategy helped the drones cover significantly more area at higher average speeds, completing the task several minutes faster.
One drawback for practical use is that the drones must still communicate with an off-board ground station to merge their maps. In the outdoor experiment, the researchers set up a wireless router connecting each drone to the ground station. In the future, they hope to design drones that communicate wirelessly when they approach one another, fuse their maps, and then cut communication when they separate. The ground station would then be used only to monitor the updated global map.