AI could soon help us talk to animals
Two neurobiologists explore that question in a new essay published in Current Biology. In homage to the Turing test – which sets a benchmark for human-like intelligence in machines – Yossi Yovel and Oded Rechavi from Tel Aviv University in Israel introduce what they call the "Doctor Dolittle challenge".
The challenge requires an AI-based, large 'language' model to overcome three main obstacles when communicating with an animal.
- It has to use the animal's own communicative signals. The animal must not have to learn new signals, the way a dog learns to respond to human commands.
- It has to use these signals in a variety of behavioral contexts, not just during courtship or threatening situations, like scientists playing a known avian alarm call back to birds.
- And it has to produce a measurable response in the animal "as if it were communicating with a conspecific [an animal similar to itself] and not a machine."
Take the honey bee, for instance. It performs a waggle dance to communicate with the colony about the location of food. Scientists have managed to 'hack' this knowledge and create a robotic bee that can recruit other bees with its moves and lead them to a specific spot.
This satisfies the first and third criteria of the Doctor Dolittle challenge, but the dance only works for this one context. Scientists still can't ask a bee what it wants or how it's feeling. Plus, even if all three of the above boxes are ticked, we may never be able to communicate with animals at the level many pet owners or animal lovers would want.
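The directional code those robotic-bee experiments exploit is well documented: the angle of the waggle run relative to vertical on the comb corresponds to the food's bearing relative to the sun's azimuth, and the duration of the run scales with distance. A minimal sketch of that decoding is below; the function name and the ~1,000 metres per second of waggling calibration are illustrative assumptions, not values from the essay.

```python
def decode_waggle_dance(run_angle_deg, run_duration_s,
                        sun_azimuth_deg, metres_per_second=1000.0):
    """Estimate a food source location from waggle-dance parameters.

    The waggle run's angle relative to vertical on the comb maps to
    the food's compass bearing relative to the sun's azimuth, and the
    run's duration scales roughly with distance. The calibration of
    ~1000 m per second of waggling is an illustrative assumption;
    real colonies vary.
    """
    # Rotate the dance angle by the sun's current azimuth to get a
    # compass bearing, wrapped into the 0-360 degree range.
    bearing_deg = (sun_azimuth_deg + run_angle_deg) % 360
    # Longer waggle runs advertise more distant food sources.
    distance_m = run_duration_s * metres_per_second
    return bearing_deg, distance_m

# A dance angled 30 degrees right of vertical while the sun sits at
# an azimuth of 120 degrees points foragers toward a bearing of 150
# degrees; a 1.5-second run suggests roughly 1.5 km.
bearing, distance = decode_waggle_dance(30, 1.5, 120)
```

Even this tidy picture understates the signal, as the authors note below: the dance likely carries extra tactile and acoustic information that a decoder like this one ignores entirely.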
While an algorithm might one day be able to tell us that our pet cat is expressing love or frustration, there might be no way to ask how it is feeling. Human language may simply be unique in ways that do not extend to other animals. Our 'self-centered world', or umwelt, limits all that we comprehend.
As the philosopher Ludwig Wittgenstein argued, "even if a lion could speak, we could not understand it." If Wittgenstein is right, "we will never be able to ask [a cat] 'how they feel' or explain that ChatGPT already means CatGPT in French (and that it might be funny)," write Yovel and Rechavi.
That bee dance we thought we'd mastered? It probably contains way more information than we've noticed, Yovel and Rechavi note, "including subtle tactile and acoustic signals about the quality of the resource". "These data would also need to be collected and fed into the AI algorithm if it were asked to crack the code, but we are not even sure which other types of data would need to be recorded," the authors add.
Would we need to record electric fields, too?
Mastering primate communication, on the other hand, might be easier, as it's closest to our own. But AI models would still need to be trained on a huge amount of data that would require long-term surveillance of primates in the wild. Where would that information come from?
Even if it could be gathered and utilized, scientists would need to measure a 'natural response' from primates, indicating that they had heard and understood a machine's attempt to communicate with them.
Neural recordings might help with this, but in some cases, proving objective understanding may be downright impossible. In the future, Yovel and Rechavi think AI can be harnessed to better understand animal communication, but they admit it may not be able to help us communicate with animals like Doctor Dolittle.
The neurobiologists say that even if the power of AI increases "a million fold", some of the obstacles that currently stop us from talking to animals will remain. "Even if we will never be able to talk to animals in the human way, understanding how complex animal communication is and attempting to tap into it and mimic it is a fascinating scientific endeavor," the researchers conclude.
"We thus call on scientists to apply AI to decipher animal communication according to the Doctor Dolittle challenge criteria… "
Carly Cassella