Google wants to talk to dolphins thanks to AI: is a science-fiction dream about to become reality?

By: Elora Bain

They are among the most intelligent animals on the planet, and for good reason: dolphins can cooperate, learn new skills, and recognize themselves in a mirror. These marine mammals even have a complex language of their own, based on whistles and clicks, one that scientists still struggle to understand.

A challenge that could soon be met, if an article by Ars Technica is to be believed. Researchers may soon be able to decipher the language of cetaceans thanks to an artificial intelligence (AI) developed by Google. The objective is ambitious: create an AI capable of communicating with them. But is it even possible?

To achieve this unprecedented feat, Google is collaborating with the Wild Dolphin Project (WDP), which has been studying Atlantic spotted dolphins since 1985. The group of specialists records and films these underwater creatures and analyzes their behavior. While researchers have so far managed to associate certain activities and behaviors with specific sounds the dolphins emit – notably signature whistles, similar to names, used to find one another – much of this language remains a mystery to the team.

WDP researchers believe that understanding the structure and patterns of dolphin vocalizations is necessary to determine whether their communication rises to the level of a language. “We do not know if animals have words,” explains Denise Herzing of WDP. By developing an AI with Google, called DolphinGemma, the long-term objective is to communicate with dolphins – if a real language exists between them.

Adapting this AI to other species

This AI model is based on an audio technology developed by Google, called SoundStream, which encodes dolphin vocalizations so that sounds can be fed into the model as they are recorded. Like any AI model, it must be trained on as much data as possible in order to analyze and reproduce those sounds – and perhaps, in the long run, uncover the structure of a language.
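To give a rough idea of what "encoding vocalizations" means, here is a toy sketch in Python. SoundStream is actually a learned neural audio codec; the quantizer below is a hypothetical stand-in that just bins each audio frame by its energy, purely to illustrate how a continuous recording becomes a sequence of discrete tokens a model can be trained on.

```python
import numpy as np

def tokenize_audio(waveform, frame_size=320, n_tokens=256):
    """Toy stand-in for a neural audio codec like SoundStream:
    split the waveform into frames and map each frame to a
    discrete token ID. A real codec learns this mapping; here
    we simply quantize each frame's RMS energy into bins."""
    n_frames = len(waveform) // frame_size
    frames = waveform[:n_frames * frame_size].reshape(n_frames, frame_size)
    energy = np.sqrt((frames ** 2).mean(axis=1))          # per-frame loudness
    # map energies onto n_tokens evenly spaced bins
    bins = np.linspace(energy.min(), energy.max() + 1e-9, n_tokens)
    return np.digitize(energy, bins) - 1                  # IDs in 0..n_tokens-1

# simulate one second of 16 kHz "whistle" audio: a rising chirp
t = np.linspace(0, 1, 16000)
whistle = np.sin(2 * np.pi * (5000 + 3000 * t) * t)
tokens = tokenize_audio(whistle)
print(len(tokens))  # 16000 samples / 320 per frame -> 50 tokens
```

The point is only the shape of the pipeline: continuous sound in, a stream of token IDs out, which can then be consumed by a sequence model like any text.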

Google says it trained its model on the acoustic archives of the Wild Dolphin Project. The large language model should be able to take a dolphin whistle as input, predict what sound should follow it, and model it – much like a search engine or your smartphone keyboard guessing the end of the sentence you are typing. If all goes well, this could give humans the means to make themselves understood by the animal.
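The keyboard-autocomplete analogy can be made concrete with a minimal sketch. DolphinGemma is a full neural language model; the toy bigram counter below, with made-up token names standing in for encoded whistles, only illustrates the underlying idea of "given this sound, which sound usually comes next?"

```python
from collections import Counter, defaultdict

def train_bigram(sequences):
    """Count which token follows which across the training
    sequences - a statistical toy version of what a language
    model learns at vastly greater scale."""
    follows = defaultdict(Counter)
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            follows[cur][nxt] += 1
    return follows

def predict_next(follows, token):
    """Return the token most often observed after `token`."""
    return follows[token].most_common(1)[0][0]

# hypothetical token sequences standing in for encoded recordings
recordings = [
    ["whistle_A", "click", "whistle_B"],
    ["whistle_A", "click", "whistle_B"],
    ["whistle_A", "buzz", "whistle_C"],
]
model = train_bigram(recordings)
print(predict_next(model, "whistle_A"))  # -> "click" (seen twice vs. once)
```

Autocomplete on your phone works on the same predict-the-next-token principle, just with a far richer model and far more data.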

Although DolphinGemma has been trained on the sounds of Atlantic spotted dolphins, Google believes it should be possible to fine-tune it to attempt communication with other cetacean species. Bring on the great debate between porpoises and humpback whales.

Elora Bain

I'm the editor-in-chief here at News Maven, and a proud Charlotte native with a deep love for local stories that carry national weight. I believe great journalism starts with listening — to people, to communities, to nuance. Whether I’m editing a political deep dive or writing about food culture in the South, I’m always chasing clarity, not clicks.