An artificial intelligence model trained to recognize birdsong is now also able to identify whale calls. Developed by Google DeepMind, Perch 2.0 is a bioacoustic model initially trained on a vast archive of sounds from birds and other land-based animals.
Transfer Learning in Bioacoustics
The ability of Perch 2.0 to adapt to marine sounds is an example of transfer learning, a technique in which knowledge gained in one context is applied to a related one. In this case, the ability to classify bird songs proved useful for analyzing whale vocalizations.
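The core idea of transfer learning described above can be sketched in a few lines: a pretrained model is kept frozen as a feature extractor, and only a small classifier is trained on the new task. The sketch below is a toy stand-in, not the actual Perch 2.0 model; the random-projection "embedding", the synthetic tone clips, and the nearest-centroid classifier are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a frozen pretrained embedding model (hypothetical, not
# Perch's real architecture): it maps an audio waveform to a fixed-length
# feature vector via a fixed random projection.
PROJECTION = rng.normal(size=(1000, 32))

def embed(waveform: np.ndarray) -> np.ndarray:
    """Frozen 'feature extractor': project the first 1000 samples to 32-d."""
    return waveform[:1000] @ PROJECTION

# Synthetic stand-ins for whale clips: tones at two different frequencies,
# with varying phases, labeled 'humpback' and 'orca' for illustration only.
t = np.linspace(0, 5, 5 * 2000, endpoint=False)  # 5 s at 2 kHz (assumed)
humpback = [np.sin(2 * np.pi * 300 * t + p) for p in (0.0, 0.5, 1.0)]
orca = [np.sin(2 * np.pi * 900 * t + p) for p in (0.0, 0.5, 1.0)]

# Transfer-learning step: the extractor stays frozen; only a tiny classifier
# (here, a nearest-centroid rule over the embeddings) is fit on the new data.
centroids = {
    "humpback": np.mean([embed(w) for w in humpback], axis=0),
    "orca": np.mean([embed(w) for w in orca], axis=0),
}

def classify(waveform: np.ndarray) -> str:
    e = embed(waveform)
    return min(centroids, key=lambda k: np.linalg.norm(e - centroids[k]))
```

An unseen clip at 900 Hz (with a new phase) lands closest to the "orca" centroid, even though the feature extractor itself was never retrained, which is the essence of the technique.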
The researchers converted five-second audio segments into spectrograms, visual representations of sound intensity across time and frequency. These spectrograms were then processed by the model to extract salient features and distinguish, for example, between the whistles of a humpback whale and those of an orca.
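Converting an audio segment into a spectrogram of this kind is a standard operation. The minimal sketch below, using SciPy, shows the idea on a synthetic five-second clip; the sample rate, window length, and test frequency are assumptions for illustration, not Perch's actual preprocessing parameters.

```python
import numpy as np
from scipy.signal import spectrogram

SAMPLE_RATE = 32_000  # assumed sample rate, not necessarily Perch's

# Synthetic 5-second clip: a pure 440 Hz tone standing in for a whale whistle.
t = np.linspace(0, 5, 5 * SAMPLE_RATE, endpoint=False)
audio = np.sin(2 * np.pi * 440 * t)

# Spectrogram: sound energy laid out over frequency (rows) and time (columns).
freqs, times, power = spectrogram(audio, fs=SAMPLE_RATE,
                                  nperseg=1024, noverlap=512)

# The frequency bin with the most average energy should sit near 440 Hz.
peak_freq = freqs[np.argmax(power.mean(axis=1))]
```

A model then consumes the `power` array like an image, which is what lets vision-style feature extraction distinguish one call type from another.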
Evolutionary Parallels
The success of Perch 2.0 in identifying whale calls may derive from evolutionary parallels between birds and marine mammals, which may have developed similar mechanisms of vocal production. Furthermore, large models trained on diverse datasets tend to generalize well even to specialized tasks. Finally, the complexity of bird song classification may have pushed the model to recognize subtle acoustic features that also prove useful for analyzing underwater sounds.
This approach could significantly accelerate marine bioacoustic research and contribute to whale conservation, allowing scientists to passively monitor populations and better understand the behavior of these animals.