
Using Artificial Intelligence To Identify Endangered Beluga Whales

March 24, 2020

Scientists partnered with tech industry experts to develop the first machine learning application for acoustic monitoring of Alaskan beluga whales.


An innovative machine learning application will help scientists collect information essential to protect and recover the endangered Cook Inlet beluga whale population. 

In 1979, the Cook Inlet beluga population began a rapid decline. Despite being protected as an endangered species since 2008, the population still shows no sign of recovery and continues to decline. 

Beluga whales are vulnerable to many threats such as pollution, extreme weather, and interactions with fishing activity. A special concern is underwater noise pollution, which interferes with the whales’ ability to communicate, navigate, and find food. This is a particular problem for the Cook Inlet population, which lives in Alaska’s most densely populated region. The area supports heavy vessel traffic, oil and gas exploration, construction, and other noisy human activities.

To effectively support the recovery of the Cook Inlet population, managers need to know how belugas use habitat seasonally and what threats they face. 

Acoustic Monitoring

Passive acoustic monitoring provides vital information on beluga movement and habitat use. It also helps scientists identify where noise may be affecting beluga behavior, and ultimately, survival.

Scientists listen for belugas using a network of moored underwater recorders. These recorders collect enormous volumes of audio data including noise from the natural ocean environment, human activities, and other animals, as well as beluga calls.

To detect potential beluga signals in these sometimes cacophonous recordings, scientists have traditionally used a series of basic algorithms. (An algorithm is a step-by-step procedure for solving a problem.) However, these algorithms don’t work as well in noisy areas. It’s hard to distinguish faint beluga calls from signals like creaking ice, ship propellers, and the calls of other cetaceans such as killer and humpback whales. Until now, it has taken months of labor-intensive analysis by scientists to remove the false detections and correctly classify beluga calls.
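To make the idea concrete, below is a minimal sketch of the kind of basic detector described: it flags time windows whose energy in an assumed beluga call band rises well above the background level. The file name, frequency band, and threshold are illustrative assumptions, not the actual algorithms the scientists use.

```python
# Minimal sketch of a "basic algorithm" detector: flag time windows whose
# energy in an assumed beluga call band rises well above background.
# File name, band limits, and threshold are illustrative assumptions.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("mooring_recording.wav")  # hypothetical recording
if audio.ndim > 1:                                   # keep one channel if stereo
    audio = audio[:, 0]

freqs, times, sxx = spectrogram(audio, fs=rate, nperseg=2048)

# Sum energy in an assumed call band (2-10 kHz here, purely illustrative).
band = (freqs >= 2000) & (freqs <= 10000)
band_energy = sxx[band, :].sum(axis=0)

# Flag windows well above the median background level.
threshold = 5.0 * np.median(band_energy)
detections = times[band_energy > threshold]
print(f"{len(detections)} candidate windows flagged for review")
```

A simple threshold like this is exactly what struggles in noisy areas: loud broadband sources push the band energy up and generate the false detections that analysts then have to weed out by hand.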

Listen to a sound clip of beluga calls from a moored acoustic recorder.

Spectrogram showing the 15 seconds of beluga calls heard in the sound clip.
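A spectrogram like the one described can be produced from a short audio clip in a few lines of code. The sketch below assumes a hypothetical 15-second WAV file and illustrative STFT settings.

```python
# Minimal sketch: render a spectrogram of a short beluga clip.
# The file name and STFT parameters are illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("beluga_clip_15s.wav")  # hypothetical 15-second clip
if audio.ndim > 1:
    audio = audio[:, 0]

freqs, times, sxx = spectrogram(audio, fs=rate, nperseg=1024, noverlap=512)

plt.pcolormesh(times, freqs / 1000, 10 * np.log10(sxx + 1e-12), shading="gouraud")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (kHz)")
plt.title("Beluga calls, 15-second clip")
plt.colorbar(label="Power (dB)")
plt.show()
```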

Deep Learning

This year, for the first time, scientists are working with Microsoft artificial intelligence (AI) experts to train AI computer programs. The programs will perform the most tedious, expensive, and time-consuming part of analyzing acoustic data: classifying detections as beluga calls or false signals. 

The team is working with a type of AI called “deep learning.” Deep learning imitates the way the human brain processes data and uses it for decision making.
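As a rough illustration of what such a model can look like, the sketch below defines a small convolutional neural network that takes a spectrogram patch of a detection and outputs one of two classes, beluga call or false signal. The architecture and input size are assumptions for illustration only, not the model the team built with Microsoft.

```python
# Minimal sketch of a deep-learning classifier for acoustic detections:
# a small convolutional network mapping a spectrogram patch to one of two
# classes, "beluga call" or "false signal". The architecture and 128x128
# input size are illustrative assumptions, not the team's actual model.
import torch
import torch.nn as nn

class DetectionClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, 64), nn.ReLU(),
            nn.Linear(64, 2),  # two outputs: beluga call / false signal
        )

    def forward(self, x):  # x: (batch, 1, 128, 128) spectrogram patches
        return self.classifier(self.features(x))

model = DetectionClassifier()
dummy = torch.randn(4, 1, 128, 128)  # four fake spectrogram patches
print(model(dummy).shape)            # torch.Size([4, 2])
```

In practice, a network like this is trained on many thousands of detections that analysts have already labeled by hand; the learned model is then used to classify new detections automatically.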

“Deep learning is as close as we can get to how the human brain works,” said Manuel Castellote, NOAA Fisheries affiliate with the University of Washington, Joint Institute for the Study of the Atmosphere and Ocean, who led the study. “And so far the results have been beyond expectation. Machine learning is achieving more than 96 percent accuracy in classifying detections compared to a scientist doing the classification. It is even picking up things human analysts missed. We didn’t expect it to work as well as humans. Instead, it works better.”

The machine learning model is not only highly accurate, but can process an enormous amount of data very quickly. “A single mooring dataset, with 6-8 months of sound recordings, would take 10-15 days to manually classify all the detections,” Castellote explains. “With machine learning tools, it is done overnight. Unsupervised.”  
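Unattended, overnight processing of a mooring's detections could look something like the loop below, which reuses the illustrative classifier sketched above. The directory layout and the to_spectrogram_patch helper are hypothetical.

```python
# Sketch of unattended batch classification for one mooring's detections,
# reusing the illustrative DetectionClassifier above. The directory layout
# and the to_spectrogram_patch helper are hypothetical.
import csv
from pathlib import Path

import torch

model.eval()
results = []
with torch.no_grad():
    for clip in sorted(Path("mooring_01/detections").glob("*.wav")):
        patch = to_spectrogram_patch(clip)  # hypothetical: returns a (1, 1, 128, 128) tensor
        label = model(patch).argmax(dim=1).item()
        results.append((clip.name, "beluga" if label == 1 else "false_signal"))

with open("mooring_01_classified.csv", "w", newline="") as f:
    csv.writer(f).writerows([("clip", "label"), *results])
```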

There is a network of 15 moorings in Cook Inlet, which we deploy and retrieve twice a year. The machine learning model will mean a huge savings of time and money in analyzing the acoustic data we retrieve from these moorings.

“Remote sensors, like acoustic moorings, have revolutionized our ability to monitor wildlife populations, but have also created a backlog of raw data that has ecologists spending more time clicking than conducting research. Work like this makes scientists more efficient, so they can get back to doing science instead of labeling data,” said Dan Morris, principal scientist on the Microsoft AI for Earth team.

Most importantly, it means managers get the accurate information they need much more quickly.

“This project, promoting biodiversity by helping beluga whales, speaks to the core of what AI for Good is about—enabling scientists to make the world a better place,” said Rahul Dodhia, senior director of data science for Microsoft’s AI for Good team.

A Major Advance

The use of machine learning for image analysis has been well established for years. But it has only very recently begun to be used in bioacoustics.

“This is definitely the first time machine learning has been applied to acoustic monitoring of belugas, and one of the few efforts to date for any cetaceans,” Castellote said. “Now we are working to optimize it.”

The team is currently using machine learning for only one part of the analysis: classifying detections. The next step is to teach it to both detect and classify beluga calls. They hope to accomplish that by summer 2020.

Castellote points out that this first step in using machine learning to classify beluga signals has much broader applications. The Microsoft AI team has developed a tool that can be customized to many other species.

“Normally when we advance in our analysis methods, it is in little steps,” said Castellote. “This is a major step to a new and very different approach.”

Beluga whale acoustic signal classification using deep learning neural network models


This research is a collaboration between NOAA Fisheries’ Alaska Fisheries Science Center Marine Mammal Laboratory, Microsoft AI for Good, and the University of Washington’s Joint Institute for the Study of the Atmosphere and Ocean.