AI Computer Vision Spots Poachers in Near Real-Time

Elephants, rhinos, tigers, gorillas, and other animals are poached every year. Researchers at the USC Center for Artificial Intelligence in Society have long been applying artificial intelligence (AI) to safeguard wildlife. Initially, the computer scientists used AI and game theory to predict poachers' likely haunts; now they have applied AI and deep learning to detect poachers in near real time.

Poachers are mostly active at night. Infrared cameras can reveal living beings in the dark, but because poachers and the animals they hunt both radiate heat, monitoring infrared video streams for poachers all night is time-consuming and error-prone.

Therefore, a team of computer scientists led by USC Viterbi School of Engineering Ph.D. student Elizabeth Bondi, in Professor Milind Tambe's lab, labeled 180,000 humans and animals in infrared videos, using a labeling tool they created to accelerate the process. With these labeled images, the researchers adapted an existing deep learning algorithm, Faster R-CNN, to teach a computer to automatically distinguish infrared images of humans from those of animals.

The challenge then was to deploy this algorithm for near real-time detection on the laptop computers at base stations in the field, where video is streamed from the drones patrolling national parks in Malawi and Zimbabwe. Although accurate, the algorithm took 10 seconds to process each image, far too long when tracking moving targets. The goal then became to further adapt the algorithm so a standard laptop could run it.

The researchers then adapted the algorithm to work with Microsoft Azure, leveraging cloud computing for faster processing. The team also built a fallback for the spotty connectivity in rural areas, so the software could run entirely on a laptop. The algorithm now detects animals and poachers in just over three-tenths of a second.
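The cloud-first, laptop-fallback behavior described above can be sketched as a small dispatch routine. All function names here are illustrative assumptions; the article does not describe SPOT's actual software interfaces.

```python
# Hypothetical sketch of cloud-first detection with a local fallback,
# as the article describes for areas with spotty connectivity.
def detect(frame, cloud_detect, local_detect, cloud_available):
    """Run detection in the cloud when connectivity allows, else locally."""
    if cloud_available():
        try:
            return cloud_detect(frame)
        except ConnectionError:
            pass  # connectivity dropped mid-request: fall back to the laptop
    return local_detect(frame)

# Usage: with no connectivity, frames are handled by the on-laptop model.
result = detect("frame0",
                cloud_detect=lambda f: ("cloud", f),
                local_detect=lambda f: ("local", f),
                cloud_available=lambda: False)
# result == ("local", "frame0")
```

The design choice is that the slower local model is always available as a baseline, so detection degrades gracefully rather than failing when the base station loses its link.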

This algorithm, christened "SPOT" (Systematic POacher deTector), will next be deployed on a large scale across Botswana.

SPOT will ease the burden on those using drones for anti-poaching by automatically detecting people and animals in infrared imagery, and by providing detections in near real time.

Elizabeth Bondi, Lead Author and Ph.D. candidate in Computer Science - USC

Additional information about SPOT is available in the paper "SPOT Poachers in Action: Augmenting Conservation Drones with Automatic Detection in Near Real Time," by USC researchers Elizabeth Bondi, Milind Tambe, Ram Nevatia, Donnabell Dmello, and Jongmoo Choi; Carnegie Mellon's Fei Fang; Microsoft's Mark Hamilton and Lucas Joppa; and Robert Hannaford and Arvind Iyer of the drone non-profit Air Shepherd. The paper was presented at the Association for the Advancement of Artificial Intelligence's 30th Conference on Innovative Applications of Artificial Intelligence.
