Foundation Offers $10m For ‘Cracking The Code’ Of Animal Language

Think you can decipher the rhythmic clicks and whistles of dolphins or the grunts and alarm calls of monkeys? A foundation is offering big prizes for progress in communicating with animals.

Looking to prompt renewed efforts at decoding animal communication, a non-profit founded by an investor has teamed up with a university to offer prizes, including a hefty $10 million, to teams that can figure out what animals are “saying.”

The Coller Dolittle Challenge for Interspecies Two-Way Communication is a collaboration between the Jeremy Coller Foundation and Tel Aviv University. (Yes, it’s named after that Dr. Dolittle.)

Entrants aren’t asked to come up with a Star Trek-like “universal translator” for animals. Rather, the people behind the Coller Dolittle Challenge want to see methods that allow for two-way communication between humans and individual species.

“We are open to any organism and any modality from acoustic communication in whales to chemical communication in worms,” said Yossi Yovel, a professor at Tel Aviv University and co-chairman of the challenge.

The grand prize is either a $10 million grant or $500,000 in cash, at the winner’s choice. In the meantime, the foundation will award $100,000 prizes each year to the best entries, assessed “for significant contributions to decipher, interface or mimic non-human organism communication.”

While it may seem far-fetched — and there are those who believe humans will never be able to fully understand animal communication in proper context — there have been efforts to communicate with and decode the communications of bats, dolphins, whales and some primate species. Scientists have also pushed the boundaries on understanding group communication, such as the coordination involved in avian murmurations.

Orangutans have demonstrated an understanding of abstract concepts like money, have picked up rudimentary sign language, and have even deceived humans. One orangutan in the 1960s repeatedly escaped his zoo enclosure by hiding a small strip of metal in his mouth and using it to pick a lock. Credit: Klub Boks/Pexels

The organizers believe artificial intelligence will be the tool that ultimately cracks the code, but entrants aren’t required to use AI. The technology is incredibly useful for pattern recognition and for sorting large amounts of data, both of which matter when researchers must analyze thousands of audio samples or hundreds of hours of footage.

Alas, we don’t think the foundation will be interested in the Buddinese language, which boasts 327 different ways of demanding food and features a timekeeping and calendar system based on meals and naps. A short trill followed by a series of staccato meows, for example, means “I expect prompt service at salmon o’clock,” while a truncated meow ending with a scoff is used to indicate displeasure when a human napping substrate tosses too much during sleep.

Still, maybe we’ll dress it up to make it look properly academic and give the challenge a try. Those prizes could buy a lot of Roombas!

The Dividing Line Between Human And Animal Has Been Blurred Again As AI Reveals Startlingly Complex Whale Language

By unlocking the mysteries of how sperm whales communicate and demonstrating their impressive cognitive abilities, researchers hope to get people invested in the fate of these endangered animals.

Sperm whales are chatty.

Their language is markedly different from the deep cetacean moans associated with other whales, taking the form of Morse code-like clicks that boom through the ocean at decibel levels almost twice those of a jet engine.

And while we’ve long known animals like monkeys assign specific meaning to short vocalizations ranging from alarm calls to affirmations of social rank, sperm whale conversations can last an hour or more, with participants exchanging complex strings of clicks that vary depending on context, environment and even which pod family is speaking.

An aerial view of a sperm whale near the ocean surface. Credit: Wikimedia Commons

While artificial intelligence has been maligned over the past few years as people grapple with its rapid progress and potential for abuse, it remains the best tool we have for teasing out patterns that our human minds can’t discern, especially from large quantities of data.

With more than 9,000 recordings of sperm whales, Project CETI — Cetacean Translation Initiative, a non-profit effort to decode and translate sperm whale communication — had precisely the kind of huge data cache that AI excels at analyzing.

By feeding the recordings into specially trained machine learning algorithms, the research team was able to identify a wealth of new language patterns. While human languages are composed of quantized morphemes — prefixes, suffixes and root words — whale communication is broken down into sequences of clicks and pauses called “codas.”

Like the dots and dashes of Morse code, codas draw a distinction between short and long gaps between clicks. Sperm whales also vary the tempo of the clicks, which could represent inflection, “dialects” or concepts completely alien to the human mind.
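
For a rough sense of what that structure looks like computationally, here is a minimal, illustrative sketch (not Project CETI’s actual pipeline) that represents a single coda as click timestamps and reduces it to the gaps between clicks and an overall tempo. All names and numbers are invented for illustration.

```python
# Illustrative sketch only: a toy representation of a sperm whale coda.
# The class name, field names and example values are assumptions, not CETI's code.

from dataclasses import dataclass

@dataclass
class Coda:
    click_times: list[float]  # seconds from the start of the coda

    def inter_click_intervals(self) -> list[float]:
        """Gaps between consecutive clicks (the 'pauses' in the pattern)."""
        t = self.click_times
        return [b - a for a, b in zip(t, t[1:])]

    def tempo(self) -> float:
        """Total duration of the coda, one coarse measure of tempo."""
        return self.click_times[-1] - self.click_times[0]

    def rhythm(self) -> list[float]:
        """Intervals normalised by duration: the relative pattern of clicks
        and gaps, independent of how quickly the coda is produced."""
        duration = self.tempo()
        return [ici / duration for ici in self.inter_click_intervals()]

# A hypothetical five-click coda: four evenly spaced clicks, then a longer final gap.
coda = Coda(click_times=[0.00, 0.15, 0.30, 0.45, 0.90])
print(coda.inter_click_intervals(), coda.rhythm(), coda.tempo())
```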

“Some of what they’re doing might be totally different from our way of communicating and we’re probably never going to be able to fully grasp those differences,” Taylor Hersh, a postdoctoral marine researcher at Oregon State University, told NPR.

A sperm whale fluke visible above the surface of the ocean. Credit: Wikimedia Commons

Researchers believe the “inter-click intervals” — akin to ghost notes in music — may be as significant as the clicks themselves. Importantly, while human ears were able to identify and catalog some of the codas, the machine learning algorithms found many that human analysis missed.
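
To get an intuition for why an algorithm can spot coda types that manual cataloging misses, consider a toy clustering example. This is a hedged sketch using scikit-learn’s KMeans on synthetic inter-click-interval vectors; the real analysis involves far larger datasets and more sophisticated models, and the two “coda types” below are invented.

```python
# Toy example: unsupervised clustering of synthetic inter-click-interval vectors.
# KMeans is a deliberately simple stand-in for whatever models the CETI team uses.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def synthetic_coda(base_pattern, jitter=0.01):
    """A fake coda: a shared interval pattern plus small timing noise."""
    return np.array(base_pattern) + rng.normal(0.0, jitter, len(base_pattern))

# Two hypothetical coda types, each a vector of four inter-click intervals (seconds).
type_a = [0.15, 0.15, 0.15, 0.45]   # regular spacing with a long final gap
type_b = [0.10, 0.20, 0.10, 0.20]   # alternating short and long gaps

codas = np.stack([synthetic_coda(type_a) for _ in range(50)] +
                 [synthetic_coda(type_b) for _ in range(50)])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(codas)
print("cluster sizes:", np.bincount(labels))  # expect two clusters of roughly 50 each
```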

That’s not surprising considering sperm whales, the loudest animals on Earth and capable of generating sounds up to 230 dB, took a very different evolutionary course and, as ocean-dwelling creatures weighing up to 90,000 pounds (40,800 kg), likely have a radically different sensorium from our own.

The comparisons to music go further than ghost notes.

“This study shows that coda types are not arbitrary, but rather that they form a newly discovered combinatorial coding system in which the musical concepts of rubato and ornamentation combine with two categorical, context-independent features known as rhythm and tempo, by analogy to musical terminology,” CETI’s team wrote on May 7 while unveiling the most recent study.
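
Read loosely, the quote describes a small feature set per coda: two categorical, context-independent features (rhythm and tempo) and two context-dependent, musically inspired ones (rubato and ornamentation). The sketch below is one hedged way to imagine reading those features off a coda; only the feature names come from the study, while the reference values and thresholds are invented for illustration.

```python
# Hypothetical feature read-out for a single coda, given its click timestamps.
# Only the four feature names come from the CETI description; everything else is assumed.

def describe(click_times, previous_duration=None, expected_clicks=5):
    icis = [b - a for a, b in zip(click_times, click_times[1:])]
    duration = click_times[-1] - click_times[0]

    # Categorical, context-independent features:
    rhythm = tuple(round(ici / duration, 2) for ici in icis)  # relative spacing pattern
    tempo = duration                                          # how quickly the coda is produced

    # Context-dependent features:
    # rubato: gradual stretching or shrinking relative to the previous coda in the exchange
    rubato = None if previous_duration is None else round(duration - previous_duration, 3)
    # ornamentation: an "extra" click beyond the count expected for this coda type
    ornamentation = len(click_times) > expected_clicks

    return {"rhythm": rhythm, "tempo": tempo, "rubato": rubato, "ornamentation": ornamentation}

print(describe([0.00, 0.15, 0.30, 0.45, 0.90]))
print(describe([0.00, 0.17, 0.34, 0.51, 0.98, 1.02], previous_duration=0.90))
```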

Sperm whale distribution based on human sightings. Sperm whales freely travel the oceans except in cold, ice-packed environs. Credit: Wikimedia Commons

People have pointed to many abilities over the years to mark the dividing line between humans and animals, including tool use, emotion and self-awareness. As new research toppled those dividers one by one, showing that animals do indeed use tools, experience rich emotions and have complex inner mental lives, the human capacity for authentic language, with syntax and context-dependent meaning, remained one of the last stalwarts.

With this research, scientists are assembling a “sperm whale phonetic alphabet” that will make it easier to discern and catalog whale codas.

To be clear, there’s still a lot of work ahead before scientists can prove sperm whale codas qualify as language in the human sense, but whether they strictly meet that definition may not matter. After all, it’s clear the clicks and pauses of whale codas are imbued with meaning, even if that meaning remains elusive to us for the moment.

Indeed, “sperm whale communication has both contextual and combinatorial structure not previously observed in whale communication,” the team wrote.

Proving sperm whale codas are tantamount to human language isn’t the goal anyway. The team has two overriding priorities — decode the meanings behind the codas, and get the wider public invested in the fate of these endangered animals by showing they’re not so different from us.

“Our results show there is much more complexity than previously believed,” MIT AI lab director Daniela Rus told NPR, “and this is challenging the current state of the art or state of beliefs about the animal world.”