To navigate and ‘see’ the environment around them, flying drones typically use GPS systems and bulky laser scanners, methods that work best in bright, outdoor settings.
However, a new navigation system based on a state-of-the-art camera that functions similarly to the human eye allows drones to perform quick, nimble maneuvers in indoor, low-light environments, according to a new report in the journal IEEE Robotics and Automation Letters.
In the study, researchers from the University of Zurich (UZH) and the Swiss-based NCCR Robotics described how their novel system is based on an ‘event camera’ – which can see in near-dark conditions far more effectively than the conventional cameras currently used in off-the-shelf drones.
This research is the first of its kind in the fields of artificial intelligence and robotics, and will soon enable drones to fly autonomously and faster than ever, including in low-light environments.
Davide Scaramuzza, Study Author and Director of the Robotics and Perception Group, UZH
According to their report, the study team has been able to use the camera to calculate a drone’s location and orientation in space as it is flying.
Event cameras do not need full light on their robotic ‘retina’ to create a complete image. Unlike conventional cameras, these devices record changes in brightness for each pixel, producing very sharp vision even during rapid motion or in low-light surroundings. The UZH scientists said they have developed new software capable of effectively processing the output of event cameras, using this information to enable autonomous flight at higher speeds and in lower light than is presently possible with commercial drones.
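The per-pixel principle described above can be illustrated with a minimal sketch. The snippet below is not the UZH software; it simply demonstrates the general idea of event generation: comparing the log-brightness of each pixel against a threshold and emitting a positive or negative event only where the change is large enough. The function name, threshold value, and toy frames are all illustrative assumptions.

```python
import numpy as np

def generate_events(prev_frame, curr_frame, threshold=0.2):
    """Emit (row, col, polarity) events for pixels whose log-brightness
    changed by more than `threshold` between two frames.
    Illustrative sketch only -- real event cameras do this asynchronously
    in hardware, per pixel, with microsecond resolution."""
    eps = 1e-6  # avoid log(0) on fully dark pixels
    diff = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[rows, cols]).astype(int)  # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarity.tolist()))

# A dim scene: one pixel brightens, one dims, the rest stay constant.
prev = np.full((4, 4), 0.05)
curr = prev.copy()
curr[1, 2] = 0.10  # brightness doubled -> positive event
curr[3, 0] = 0.02  # dimmed -> negative event
events = generate_events(prev, curr)
```

Because the comparison is made in log-brightness, the same relative change triggers an event in dim light as in bright light, which is why these sensors remain useful in near-dark conditions while static background pixels produce no data at all.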
Drones built with an event camera and the software described in the study could help in search and rescue missions where conventional drones would not be useful, such as at dusk or inside a building without power for lights. The navigation system would also allow drones to fly faster into disaster areas, where time is vital to saving as many survivors as possible.
There is still a lot of work to be done before these drones can be deployed in the real world since the event camera used for our research is an early prototype. We have yet to prove that our software also works reliably outdoors.
Henri Rebecq, Co-author and PhD student, UZH
“We think this is achievable, however, and our recent work has already demonstrated that combining a standard camera with an event-based camera improves the accuracy and reliability of the system,” Scaramuzza added.
Last year, the same group of researchers from UZH showed they could successfully send quadcopter drones through narrow gaps at multiple angles using an automated navigation system carried entirely onboard the drone itself.
Using an onboard camera and a computer with processing power roughly equal to that of a smartphone, the team’s drones were able to tilt slightly and pass through a rectangular gap set at various angles, including 30 and 45 degrees. The gaps were about 1.5 times the size of the robot, providing just 10 centimeters of clearance.
After pitching slightly to one side to fly through an angled gap, the drone used an onboard downward-facing sensor and camera to stabilize itself. The Swiss team posted a video of professional drone racers attempting similar maneuvers, which were often unsuccessful.
At the time, Scaramuzza said this maneuvering system could be adapted to let quadcopter drones autonomously navigate other tricky obstacles, such as tree branches or signposts.