For centuries, the idea of speaking with animals has belonged to the realm of myth and fiction. But now, it’s fast becoming reality. Fueled by artificial intelligence and ambitious research, we’re closer than ever to decoding the languages of whales, dolphins, and perhaps many more species. So what happens when we finally understand what animals are saying?
The New Linguistic Frontier
The race to unlock animal communication is intensifying. With $10 million on the table from the Jeremy Coller Foundation for whoever succeeds first, research teams around the world are working to crack the code—and the technology behind them is evolving at breakneck speed.

Generative AI and large language models are the latest tools enabling scientists to sift through millions of animal vocalizations, seeking patterns that hint at grammar, meaning, and even identity. Most efforts are concentrated on cetaceans—like dolphins and sperm whales—whose vocal learning and complex communication systems mirror our own in uncanny ways.
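What does that pattern-seeking look like in practice? Here is a minimal sketch, assuming nothing more than a folder of short, single-call recordings and off-the-shelf tools (librosa for audio features, scikit-learn for clustering); the file paths and cluster count are placeholders, and this is an illustration of the general idea, not Project CETI’s or Google’s actual pipeline.

import glob
import librosa
import numpy as np
from sklearn.cluster import KMeans

def call_features(path, n_mfcc=20):
    # Summarize one recording as the mean of its MFCC frames.
    y, sr = librosa.load(path, sr=None)  # keep the file's native sample rate
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

# Hypothetical folder of clips, one call (for example, one coda) per file.
paths = sorted(glob.glob("calls/*.wav"))
features = np.stack([call_features(p) for p in paths])

# Group acoustically similar calls; each cluster is a candidate "call type"
# that a researcher would then check by ear and against behavioural context.
labels = KMeans(n_clusters=8, random_state=0).fit_predict(features)
for path, label in zip(paths, labels):
    print(label, path)

The real work begins after the clustering: asking whether those machine-found categories line up with who was calling, to whom, and in what situation.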
Project CETI (Cetacean Translation Initiative), for instance, is decoding the rapid-fire “codas” of sperm whales—rhythmic sequences of clicks, each as brief as one-thousandth of a second. Already, the team has identified potential markers for individual names, turn-taking in conversations, and even a “punctuation” click. Their goal: to speak whale by 2026.
Dolphins, Dialects, and AI Breakthroughs
In parallel, Google’s DolphinGemma is diving deep into dolphin chatter, trained on 40 years of acoustic data. These studies aren’t just theoretical—real-world success stories are emerging. In one remarkable case, dolphins learned to associate a specific synthetic whistle with sargassum seaweed, a sound they later used independently. This may be the first recorded instance of a word crossing the species boundary.

In Florida, a dolphin named Zeus learned to mimic the human vowels A, E, O, and U. In Alaska, a humpback whale named Twain engaged in a 20-minute acoustic back-and-forth with scientists—an exchange that felt more like a conversation than a coincidence.
A Planet That Speaks—And Screams
But while our ability to listen is growing, we’ve long ignored what nature has been telling us.
Oceans were once rich in acoustic life. Healthy coral reefs crackle with fish and shrimp. Whale songs travel for miles. Yet today, our technology is silencing their voices. Industrial noise—shipping, drilling, deep-sea mining—has raised ocean sound levels by roughly three decibels per decade since the 1960s. Humpback whales, whose haunting songs can last for 24 hours and evolve through “song revolutions,” are being drowned out by human noise. Some stop singing altogether when a ship passes as far as 1.2 km away.
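A quick aside on scale: decibels are logarithmic. Under the standard definition of sound level (where I1 and I2 stand for the acoustic intensities before and after a change of ΔL decibels), a three-decibel rise is roughly a doubling, so each decade of that trend doubles the sound energy whales must sing over.

\[
\Delta L = 10\,\log_{10}\!\frac{I_2}{I_1}
\quad\Longrightarrow\quad
\frac{I_2}{I_1} = 10^{\Delta L/10},
\qquad 10^{3/10} \approx 2.
\]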
This isn’t just noise pollution—it’s cultural erasure. Whale songs carry vital information about migration, mating, and identity. Silencing them means silencing entire societies.
Translation Isn’t Just About Sound

Even as we race toward speaking with animals, there’s a deeper challenge: truly understanding them. Communication isn’t only acoustic. Animals use gestures, body language, chemical signals, and senses we barely comprehend—like echolocation. To “speak” with another species means entering their umwelt—their unique sensory world.
The Baltic German biologist Jakob von Uexküll coined the term to describe how every species experiences a different reality. AI may help us mimic sounds, but true translation demands empathy, imagination, and a shift in how we see ourselves.
As the writer Stephen Budiansky put it, rephrasing Wittgenstein: “If a lion could talk, we probably could understand him. He just would not be a lion any more.” Talking to animals may ultimately change us more than them.
Lessons from the Stars—and the Seas
It’s no accident that Project CETI’s name echoes SETI, the Search for Extraterrestrial Intelligence. Researchers from the SETI Institute even helped record the conversation with Twain the whale, believing that interspecies translation on Earth might teach us how to speak with aliens.

In the film Arrival, a linguist learns to communicate with the heptapods, aliens whose deep, whale-like calls carry a language that reshapes her perception of time. The concept draws from the Sapir-Whorf hypothesis: that language doesn’t just reflect thought—it shapes it.
Indigenous languages like Kuuk Thaayorre, spoken in Pormpuraaw, Australia, reflect this beautifully. There, people describe time spatially—east to west—embedding their environment into every sentence. Whale songs, similarly, are rooted in the rhythms of the sea, in cycles of migration and deep time.
Imagine how our thinking might change if we adopted that sense of time and space. What if we truly listened to whale song—not just to translate, but to understand?
What Happens When We Understand?
We are close to breaking a barrier that has separated species for millennia. But the real question isn’t whether we can speak with animals—it’s whether we’ll choose to listen.

Nature is already speaking to us, in clicks and whistles, in silence and in song. The tragedy is that even now, as the chorus grows clearer, we continue to silence it.
Understanding animal languages won’t just give us new knowledge—it could fundamentally change our ethics, our behavior, and our place in the world. It might teach us not just how to talk, but how to be quiet. How to listen. How to care.
We’re close to translating animal languages—what happens then? Maybe we finally learn to be human again.
