
AI Monitors the Health of Coral Reefs by Learning the “Song of the Reef”

New research indicates that by studying the “song of the reef,” artificial intelligence (AI) can assess the health of coral reefs.


Image Credit: Shutterstock.com/ Shirley W images

Because coral reefs have such a complex soundscape, even experts must carry out rigorous analysis to determine reef health from sound recordings.

Scientists at the University of Exeter trained a machine-learning algorithm on multiple recordings of healthy and degraded reefs, allowing the computer to learn the difference between the two.

The algorithm then analyzed a set of new recordings and correctly identified reef health 92% of the time.

The team used this approach to track the progress of reef restoration projects.
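The published study pairs ecoacoustic indices with machine learning. As a rough illustration of how such a classifier can be assembled, the sketch below extracts generic spectral features from labelled reef recordings and trains a standard classifier on them; the file names, labels, and feature choices are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only: separate "healthy" from "degraded" reef soundscapes
# using generic audio features. Paths, labels, and features are hypothetical
# placeholders, not the study's own pipeline.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_features(path, sr=16000):
    """Summarise one recording as a fixed-length feature vector."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)        # timbral shape
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # spectral "brightness"
    zcr = librosa.feature.zero_crossing_rate(y)               # noisiness
    # Average each feature over time so every clip yields one vector.
    return np.concatenate([mfcc.mean(axis=1),
                           centroid.mean(axis=1),
                           zcr.mean(axis=1)])

# Hypothetical labelled clips: 1 = healthy reef, 0 = degraded reef.
# A real dataset would contain many recordings of each class.
clips = [("reef_healthy_01.wav", 1), ("reef_degraded_01.wav", 0)]  # ...etc.

X = np.array([extract_features(path) for path, _ in clips])
y = np.array([label for _, label in clips])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In practice the authors summarise each recording with ecoacoustic indices rather than raw spectral features; the sketch only illustrates the general train-then-classify workflow the article describes.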

Coral reefs are facing multiple threats including climate change, so monitoring their health and the success of conservation projects is vital. One major difficulty is that visual and acoustic surveys of reefs usually rely on labor-intensive methods. Visual surveys are also limited by the fact that many reef creatures conceal themselves, or are active at night, while the complexity of reef sounds has made it difficult to identify reef health using individual recordings.

Ben Williams, Study Lead Author, College of Life and Environmental Sciences, University of Exeter

“Our approach to that problem was to use machine learning—to see whether a computer could learn the song of the reef. Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing,” added Williams.

Coral reef fish and other species produce a wide variety of sounds.

The meaning of many of these calls is unknown, but the new AI system can distinguish the overall sound of healthy reefs from that of degraded ones.

The recordings utilized in the study were made at the Mars Coral Reef Restoration Project in Indonesia, which is working to restore severely damaged reefs.

Dr. Tim Lamont of Lancaster University, a co-author, believes the AI technique will greatly enhance coral reef monitoring.

This is a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs, and discover whether attempts to protect and restore them are working. In many cases it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it—especially in remote locations.

Dr. Tim Lamont, Study Co-Author, Lancaster Environment Centre, Lancaster University

The research was funded by the Natural Environment Research Council and the Swiss National Science Foundation.

The study was published in the journal Ecological Indicators.

Journal Reference:

Williams, B., et al. (2022) Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning. Ecological Indicators. doi.org/10.1016/j.ecolind.2022.108986.
