What if we could understand the language of another species—one with its own culture, dialects, and deep intergenerational bonds? David Gruber, founder and President of Project CETI, shares how his team is using advanced machine learning and state-of-the-art gentle robotics to translate the clicks and codas of sperm whales.
The following is an edited transcript from David’s presentation at Bioneers 2025.

Project CETI—the Cetacean Translation Initiative—is the largest interdisciplinary effort ever undertaken to translate the language of sperm whales. For the past five years, we’ve been working in the Eastern Caribbean, off the coast of Dominica.
But first, a bit about how I got here. I originally trained as a marine microbiologist, and I’ve always loved life in all its forms. As a kid, I was obsessed with ants—watching their little societies for hours. I remember learning about E.O. Wilson and his massive book The Ants, and thinking, Wait, you can actually make a living doing this? It was a total lightbulb moment.
Before whales, I spent years trying to get people to see how strange and amazing different animals are—jellyfish, for example. When you swim with them, you realize they’re incredibly sentient. I’ve spent a lot of time with jellyfish; my background includes extensive work in coral reef ecology, and jellyfish are cnidarians, related to corals and sea anemones.
I took that curiosity to an intense level. But as I progressed as a marine scientist, I started to realize how disconnected the work could feel. On expeditions, we’d pull animals out of the ocean and watch them gasp for their final breaths on the deck of the boat—all while excitedly identifying new species. For a sensitive kid, it was jarring. The idea that studying an animal often meant killing it never sat right with me.
So one of the core themes of our work became: How can we study animals without harming them? That question sparked a long-standing collaboration with Rob Wood at the Harvard Microrobotics Lab—now over a decade strong. Together, we developed the gentlest robot ever created, capable of interacting with jellyfish using just one-tenth the pressure your eyelid applies to your eye.
I became increasingly obsessed with designing tools that could study these delicate animals without harming them. One example was an origami-inspired, rotary-actuated dodecahedron—a robotic structure we used to gently encase jellyfish in the deep sea for observation.
Now, we’re taking it even further. In collaboration with the Schmidt Ocean Institute, we’re preparing for our next expedition, Designing the Future 3. This project allows us to study jellyfish-like creatures, including siphonophores, using cutting-edge tools: 3D scanning, gentle robotic swabs to collect genomic data, and the creation of what we call digital holotypes—comprehensive, non-destructive records of individual specimens.
This approach stands in stark contrast to how the deep sea is being treated elsewhere. On one hand, we see efforts to mine the ocean floor, mowing down fragile ecosystems in pursuit of minerals like manganese. On the other hand, a growing group of scientists is going to extraordinary lengths to study gelatinous life without causing harm.

Another theme that’s been central to this work is learning to see from the perspective of the other. That “other” might be sharks, or biofluorescent animals—creatures that absorb the blue light of the ocean and emit it in brilliant, unexpected colors. My earlier research focused on them. It helped me realize just how much we share this planet with a more-than-human world, and how little we really understand about how other animals perceive, feel, and experience their environment.
As part of that journey, we encountered the swell shark—not exactly the most charismatic species at first glance. But when viewed under blue light, through a lens designed to mimic a shark’s eye, something incredible appeared: intricate patterns across its body. Even more fascinating, the patterns differed between males and females.
That discovery launched us into several years of work, designing a “shark-eye” camera to see the world the way a shark might. Everyone in my lab became obsessed with this project—and with this unassuming little shark. We used every tool we had, combining Western technology and creative design to try to see the ocean from the shark’s perspective.
Now, I’m honored to serve as a steward of the Cetacean Translation Initiative. With CETI, we’re focusing on sperm whales—fellow mammals, yet vastly different from us. They’re often called the poster species of macroevolution, and for good reason.

Sperm whales are deeply social animals, living in close-knit family groups made up of grandmothers, mothers, and calves. Off the coast of Dominica, there’s a matrilineal population of about 200 sperm whales that remain in the region nearly year-round.
Shane Gero, one of our collaborators, knows these whales so well that he can identify individuals by just a glimpse of their tail. He’ll say, “That’s this whale, and it’s related to that one.” It’s incredible. This kind of deep, long-term human observation and care is absolutely essential to the work.
Editor’s note: Check out this video of Project CETI Biology Lead, Dr. Shane Gero, discussing his research on sperm whale communication and culture.

Project CETI officially launched in 2020 with catalytic support from the TED Audacious Project. We raised $33 million to get it off the ground, and today the initiative includes a team of 50 scientists. We’ve built a 20-kilometer by 20-kilometer underwater listening and recording studio off the coast of Dominica. Of course, there’s no store where you can pick up “whale listening tech”—we had to design and build everything from scratch.
The spark for the project actually began in 2018, when I was a Radcliffe Fellow at Harvard. I was sharing space with 50 other fellows from a wide range of disciplines. One of them was Shafi Goldwasser, a Turing Award winner and professor at Berkeley. At the time, her team was working on aspects of Google Translate. They were discussing how it could learn to translate between human languages—not by using a Rosetta Stone, but by analyzing the mathematical shapes of languages in multi-dimensional space. So I played them these recordings of the sounds made by sperm whales.
In sperm whales, one of their two nostrils has evolved into a blowhole, but if you examine a whale skeleton, you’ll still see the second, much smaller nostril. When whales vocalize, they move air back and forth through the structures in their head. That air travels through several hundred liters of spermaceti oil. (The name sperm whale unfortunately comes from whalers who mistakenly believed the oil was part of the reproductive system. One of our long-term goals is to rename the species through this collaboration.)
The sound then passes through a series of waxy structures that allow the whale to focus it very precisely. They use this sound in two main ways. One is echolocation—essentially seeing with sound in the deep sea. As a whale dives, you’ll hear a steady pattern: click, click, click, click. As it approaches prey, the clicks speed up—faster, faster—until there’s a final gulp. They’re particularly good at hunting squid.
At the surface, though, sperm whales use a different kind of sound called codas. These are rhythmic click patterns used to communicate. One of the most common in Dominica is a three-part pattern: click, click, click-click-click—we call it “1-1-3.” Remarkably, we didn’t even know sperm whales made sounds until the 1950s. Shane Gero’s research revealed that they actually have regional dialects. Each clan in Dominica, for example, has its own unique dialect—kind of like different accents, say English and Scottish—even though they live in the same waters.
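To make the idea of a coda type like “1-1-3” concrete, here is a minimal sketch of how one might label a coda from its click timestamps: group clicks into bursts wherever the gap between clicks is much longer than the shortest gap. This is an illustrative heuristic, not Project CETI’s actual method; the function name, the gap-ratio threshold, and the example timings are all assumptions.

```python
def coda_pattern(click_times, gap_ratio=2.0):
    """Label a coda by its burst structure, e.g. "1-1-3".

    A gap noticeably longer than the shortest inter-click interval
    (here, gap_ratio times longer) starts a new burst. Toy heuristic.
    """
    intervals = [b - a for a, b in zip(click_times, click_times[1:])]
    if not intervals:
        return "1"  # a single click is one burst of one
    threshold = min(intervals) * gap_ratio
    bursts = [1]
    for gap in intervals:
        if gap > threshold:
            bursts.append(1)   # long pause -> start a new burst
        else:
            bursts[-1] += 1    # short gap -> same burst continues
    return "-".join(str(n) for n in bursts)

# A stylized "1-1-3" coda: click ... click ... click-click-click
times = [0.0, 0.4, 0.8, 0.9, 1.0]
print(coda_pattern(times))  # -> "1-1-3"
```

Real codas are messier—click intervals drift with tempo and rubato—which is part of why large-scale statistical analysis is needed to tell pattern from noise.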

This project is deeply inspired by past efforts—and by the human imagination itself. That sense of possibility fuels us. We often think back to the words of Carl Sagan and others who, while looking out into distant galaxies, also wondered about the mysteries right here on Earth. One idea that stays with us is the question: Could the intelligence of cetaceans be expressed in something like epic poetry, oral history, or intricate codes of social interaction?
Are whales and dolphins the equivalent of human Homers before the invention of writing, recounting great deeds from the far reaches and deep depths of the sea? Who knows? But it’s a beautiful idea—and one that motivates our work.
Dominica is the heart of this project, in part because Shane Gero has been working there for over 20 years. But also, the geography is uniquely suited to this kind of research. It’s like a volcanic Jurassic Park rising from the ocean, with waters that become incredibly deep just offshore. That means whales can swim close to land, unlike in most places where you’d need to go far out to sea to find them. And the population here is remarkably stable—many of the whales remain year-round.
Around 2019, just before we received funding from TED Audacious, we had a breakthrough realization: this underwater recording studio we’d dreamed of? It was actually possible. Humanity had already invented all the necessary technology. We could do this. We could translate the language of sperm whales. It suddenly felt within reach, like the moment people first looked at the moon and thought, Could we really go there?
And here we are—on a planet with sperm whales. There are still a few hundred thousand of them alive today, communicating in extraordinary ways we’re only just beginning to understand. We’re barely scratching the surface.

Project CETI brings together eight different disciplines to tackle this monumental challenge. We have teams specializing in machine learning, robotics, natural language processing, network science, marine biology, and underwater acoustics. And we also have a legal team, which, honestly, might be the most important of them all. As this work unfolds, every piece—every discipline—matters.
One of our biggest breakthroughs this past year was identifying what we believe to be the sperm whale’s phonetic alphabet. That discovery is largely thanks to Jacob Andreas, a professor of natural language processing at MIT, and Pratyusha Sharma, a graduate student in his lab. Their work builds on years of collaboration with our trusted advisor, the late Roger Payne, and on tens of thousands of click recordings collected by Shane Gero.
What we’ve found is remarkable: the whales’ vocalizations appear to contain structured elements—almost musical in nature. Tempo, rhythm, even something we call rubato—subtle changes in timing. One of the most fascinating discoveries is a feature we’re calling ornamentation: small variations, like the addition of an extra click—click, click, click-click-click, click. At first, you might think it’s just noise or a mistake. But when you analyze tens of thousands—or even millions—of these codas, you begin to see patterns. Those subtle differences matter. They’re part of a complex system—perhaps even a language.
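One way to picture how ornamentation emerges from large-scale analysis: if you tally the burst patterns of many codas, a rare variant that is a common base pattern plus one extra click starts to look systematic rather than accidental. The sketch below is a toy illustration under that assumption—the corpus, counts, and thresholds are invented, and this is not the actual method used by the CETI team.

```python
from collections import Counter

# Hypothetical corpus of coda patterns (burst structure per coda).
codas = ["1-1-3"] * 50 + ["1-1-3-1"] * 4 + ["5"] * 30

counts = Counter(codas)

def is_ornamented(pattern, counts, min_base=10):
    """Flag a pattern as an ornamented variant if dropping its final
    burst yields a much more common base pattern (toy heuristic)."""
    parts = pattern.split("-")
    if len(parts) < 2:
        return False
    base = "-".join(parts[:-1])
    return counts[base] >= min_base and counts[pattern] < counts[base]

print(is_ornamented("1-1-3-1", counts))  # -> True
print(is_ornamented("1-1-3", counts))    # -> False
```

The point of the sketch is the reasoning in the paragraph above: a single extra click could be noise, but when the same “base plus ornament” relationship recurs across thousands of codas, it becomes evidence of structure.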
This coming year, we have so many exciting developments on the horizon. At a recent event at the Simons Institute for the Theory of Computing, we attended a talk on elephant communication by Joyce Poole and Mickey Pardo. They shared research suggesting that elephants may use names—and even more astonishing, that one elephant might say something and receive a response 15 minutes later.
That kind of delay would be considered rude in human conversation. If I sat silently for 15 minutes before replying to you, half the audience would probably walk out. We’re so accustomed to rapid-fire, back-and-forth banter, where interrupting is rude and pauses are awkward. But that’s not how communication works for elephants—and likely not for whales either. It requires a completely different frame of reference.
In our work with sperm whales, we’ve started analyzing the “negative space” between clicks—the silences—and we’re finding vowel-like features that may represent a whole new layer of their communication system. It’s just the beginning, but it’s incredibly promising.

At CETI, we hold ourselves to a strict ethical philosophy: We never draw a drop of blood. While other researchers may collect DNA samples by taking small plugs of skin, we’ve made a deliberate choice not to. We take the extra time to care for the whales and always ask ourselves one question before moving forward: Is this work in service of the whale?
What’s so exciting—and sobering—is that these new technologies, like AI, are beginning to be applied to the study of animals. And they hold extraordinary potential. This moment could be as transformative as the invention of the telescope or the microscope.
Karen Bakker, author of The Sounds of Life, describes this beautifully: She likens the combination of AI and bioacoustics to a new kind of scientific instrument—one that can help us perceive what our unaided, Old World primate ears cannot. Just as telescopes opened up the cosmos and microscopes revealed the hidden world inside cells, these tools may allow us to hear and interpret the voices of other species.
A world of wonder, connection, and possibility awaits. But how we move into that world matters.
At CETI, we’re working from the hypothesis that technology can deepen our connection to the natural world. It’s still a question mark. But it’s one we’re pursuing with care and humility.
Curious about the ethical side of this work?
If decoding the language of sperm whales and other animals is now within reach, what responsibilities come with that power? In a companion article, legal scholar César Rodríguez-Garavito explores the ethical and legal questions raised by this emerging science—and what it means to protect the more-than-human world in the age of AI.
More on this Work from Project CETI:
- What If We Understood What Animals Are Saying? The Legal Impact Of AI-assisted Studies Of Animal Communication (SSRN) | What happens when we understand what other species are saying? Published on April 23, 2025, this paper explores how advances in artificial intelligence and bioacoustics could reshape nonhuman animal law, offering a provocative look at how decoding animal communication might challenge the boundaries of rights, protections, and personhood. Read an exploration of this paper from National Geographic here.
- Researchers are Using AI to Understand what Animals are Saying (TIME) | This article, written by David Gruber and César Rodríguez-Garavito, explores how artificial intelligence and new technologies are unlocking the hidden languages of nonhuman animals—from whales to elephants—and the profound legal, ethical, and ecological questions that follow. It asks whether these breakthroughs will lead to deeper empathy and protection for other species—or risk new forms of exploitation if not guided by care and precaution.