A new study finds that people have difficulty using images from unmanned aerial systems (UASs), or drones, to determine the position of objects on the ground. The finding sheds light on challenges facing the use of UAS technology in emergency operations and other applications, while offering direction for future technology and training development.
Because UASs operate at heights that most normal aircraft do not, we are getting new aerial perspectives of our surroundings. We wanted to know how good people are at integrating these perspectives into their perception of the real world environment – which can be relevant in situations such as security or emergency response operations. For example, if we’re using UASs to identify a trouble spot, how good are we at using visual information from UASs to point to the correct spot on a map?
Stephen Cauffman, Ph.D. student at North Carolina State University and lead author of the study
To address this question, the researchers had 18 study participants assess various views of an urban environment that included several objects. In one scenario, participants were shown an aerial view of the environment, followed by a ground-level view of the same environment with one object missing. Participants were then asked to point to where the missing object had been positioned. The study also asked participants to perform similar tasks comparing two ground images, two aerial images, and a ground image followed by an aerial image.
The researchers found that comparing two aerial views produced the best performance, while switching from an aerial view to a ground view posed the greatest challenge for study participants. When shown an aerial view followed by a ground view, participants took at least a second longer to estimate where the missing object was – and their estimates were four times farther from the object's actual location than when they compared two aerial views.
This tells us that incorporating UASs into some situations, such as emergency response, may not necessarily be as useful as one might think.
Doug Gillan, professor of psychology at NC State and co-author of the paper
“It also offers insights into how we can modify training or interface design to improve performance for UAS users,” Cauffman says.
“A lot of work remains to be done in this area,” Cauffman adds. “We’ve already conducted additional work on the role of landmarks and perspective in how people are able to process aerial visual information.”
The paper, “Eye In The Sky: Investigating Spatial Performance Following Perspective Change,” will be presented at the Annual Meeting of the Human Factors and Ergonomics Society, being held October 9–14 in Austin, Texas.