Whales are among the most mysterious animals on Earth. With some species reaching over 100 feet in length and weighing upwards of 200 tons, their size is something most people cannot even begin to fathom. Despite nearly 50 years of protection efforts, many whale species remain on the endangered list due to centuries of hunting combined with climate change, shipping activity, and pollution. Scientists have been working tirelessly to learn as much as possible about these underwater giants in order to determine the best ways to protect them. However, whales spend very little time at the surface, and this elusive nature makes their behavior difficult to observe. For many scientists, the solution lies in the sound waves under the sea.
Whales make vocalizations known as “calls” that serve a variety of purposes, from communication to navigation during migration. Recordings of whale calls can provide researchers with information about behavior, migration habits, population structure, and more. The Pacific Islands Fisheries Science Center of the U.S. National Oceanic and Atmospheric Administration (NOAA) has placed underwater recorders called High-frequency Acoustic Recording Packages (HARPs) at 13 sites around the Pacific Ocean to continuously record the sounds of the ocean. Some of these HARPs have been in place since 2005, and NOAA now has nearly 170,000 hours of recordings. While this technology has been enormously beneficial to researchers, that much audio would take over 19 years to listen to straight through!
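That "19 years" figure follows from simple arithmetic, playing the archive back around the clock:

```python
# Sanity-check the playback figure: 170,000 hours of audio
# listened to 24 hours a day, 365 days a year
hours_of_audio = 170_000
hours_per_year = 24 * 365  # 8,760 listening hours in a year
years = hours_of_audio / hours_per_year
print(round(years, 1))  # 19.4
```

At roughly 19.4 years of continuous listening, no human team could keep up, which is what motivates the automated approach described next.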
To sort through this enormous amount of data, NOAA has partnered with Google’s Artificial Intelligence for Social Good program. The goal of the collaboration is to develop a neural network that can automatically identify calls and match them to the corresponding whale species. NOAA oceanographer Ann Allen explains that the approach is similar to apps like Shazam, which identify the titles and artists of songs. The artificial intelligence algorithms they developed are able to “learn” what each whale call sounds like and how to differentiate it from other sounds. This matters because the ocean is full of background noise from waves, other creatures, and human activity. In addition, the algorithm must recognize calls that are not always exactly the same: while some whales have consistent, easily recognizable calls, others, like the humpback whale, vary the tone and phrasing of their calls over time and between populations.
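The "learning from examples" idea can be sketched in a few lines. This is a toy illustration, not NOAA's actual neural network: each recording is reduced to a made-up two-number feature vector, and a simple nearest-centroid rule assigns new recordings to the closest labeled class.

```python
import numpy as np

# Hypothetical labeled training data: feature vectors invented for
# illustration, one set per class of sound
labeled_examples = {
    "humpback": np.array([[0.9, 0.1], [0.8, 0.2]]),
    "background noise": np.array([[0.1, 0.9], [0.2, 0.8]]),
}

# "Training" here is just averaging each class's labeled examples
centroids = {label: feats.mean(axis=0)
             for label, feats in labeled_examples.items()}

def classify(features):
    """Return the label whose centroid is nearest to the feature vector."""
    return min(centroids,
               key=lambda label: np.linalg.norm(features - centroids[label]))

print(classify(np.array([0.85, 0.15])))  # prints "humpback"
```

A real system replaces the hand-made features and averaging with a deep neural network trained on thousands of labeled spectrograms, but the principle is the same: labeled examples define what each class "looks like," and new sounds are matched to the closest class.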
NOAA and Google were able to teach the algorithm to identify humpback whale calls by providing it with many examples of calls that had already been labeled by humans. Rather than listening to the audio, the computer detects variations in sound from spectrograms, or visual representations of sound waves. While this system hasn’t been perfected yet, the program improves with every example call it is given and will continue to improve over time.
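A spectrogram can be built with a short NumPy sketch. The one-second clip and 16 kHz sample rate below are assumptions for illustration, not details of NOAA's pipeline, and the "call" is a synthetic tone whose pitch rises over the clip:

```python
import numpy as np

fs = 16_000                # assumed sample rate (Hz)
t = np.arange(fs) / fs     # one second of time stamps
# Synthetic "call": a tone whose pitch rises steadily over the clip
signal = np.sin(2 * np.pi * (300 + 600 * t) * t)

# Slice the clip into short overlapping frames and take the FFT of each
frame, hop = 256, 128
n_frames = 1 + (len(signal) - frame) // hop
frames = np.stack([signal[i * hop : i * hop + frame]
                   for i in range(n_frames)])
window = np.hanning(frame)
spec = np.abs(np.fft.rfft(frames * window, axis=1)) ** 2

# Each row is one moment in time, each column one frequency band: an
# image whose bright pixels trace the rising shape of the call
print(spec.shape)                                # (124, 129)
print(np.argmax(spec[0]) < np.argmax(spec[-1]))  # True: pitch rises
```

Turning audio into this image-like grid is what lets image-recognition-style neural networks spot the characteristic shapes of whale calls.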
The results of this collaboration will allow scientists to answer some of the most pressing questions about endangered whales. Additionally, because they have nearly 14 years of audio, researchers will be able to understand how whale populations have changed over time. In the future, this technology may be used to identify other calls, such as those of dolphins, and could be deployed all over the world.
What are some other ways that artificial intelligence technologies could be used to benefit the environment?
Latest posts by Ashley Reaume
- Saltwater Brewery: Turning an Environmental Crisis into a Sea Turtle’s Snack - February 10, 2019
- Google Under the Sea - January 23, 2019
- Success in South Africa: Twenty New and Expanded Marine Protected Areas - December 9, 2018