ML suggests all that relaxing whale song might just be human-esque gossiping
Now this is our kind of click bait
A study into whale language using machine learning has uncovered a complex phonetic system, implying the cetaceans may speak to each other much like humans do.
The study, published this week in Nature Communications and titled "Contextual and combinatorial structure in sperm whale vocalisations," was undertaken by MIT Computer Science and Artificial Intelligence Lab (CSAIL) researchers and Project CETI. The team used a relatively simple machine-learning algorithm to analyze the clicking sounds sperm whales use to communicate, previously thought to be a fixed set of messages.
The algorithm analyzed recordings of a sperm whale clan in the Caribbean Sea and unearthed a combinatorial sound system. The study found that codas – the patterns of clicks that form the basis of whale communication – vary along more dimensions than previously recognized.
We don't know what the whales are saying exactly, just that software has helped show that their chatter is more detailed than some have assumed and appears to be built on an alphabet of some kind. It may mean supposedly relaxing CDs of sperm whale song are in fact just hours of cetaceans moaning about the weather, or so we guess.
"Like the International Phonetic Alphabet for human languages, this 'Sperm Whale Phonetic Alphabet' shows how a small set of axes of variation give rise to the diverse set of observed phonemes (in humans) or codas (in sperm whales)," the paper observes.
While human sounds are categorized based on where they're made in the mouth, how they're made, and whether the vocal cords are vibrating, whales use a combination of rhythm and tempo. Other variables include rubato (the fine-grained variation of intervals between clicks) and ornamentation (the addition of an extra click to a coda).
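To make those axes concrete, here's a toy sketch of how one might pull tempo, rhythm, and an ornamentation flag out of a coda given as a list of click timestamps. To be clear, this is our illustration, not the paper's code: the feature names follow the paper's terminology, but the exact computations and the assumed five-click base coda are our own simplifications.

```python
# Illustrative only: toy feature extraction for a single coda, given its
# click timestamps in seconds. Definitions are our assumptions, loosely
# following the paper's terminology, not its actual method.

def coda_features(clicks, base_click_count=5):
    """Return (tempo, rhythm, ornamented) for one coda.

    tempo:      total coda duration in seconds.
    rhythm:     inter-click intervals normalized by total duration, so
                codas with the same click pattern at different speeds
                share a rhythm.
    ornamented: True if the coda carries more clicks than the assumed
                base type (the paper's "extra click").
    """
    intervals = [b - a for a, b in zip(clicks, clicks[1:])]
    tempo = clicks[-1] - clicks[0]
    rhythm = tuple(round(i / tempo, 3) for i in intervals)
    ornamented = len(clicks) > base_click_count
    return tempo, rhythm, ornamented

# Same rhythm at two tempos: the second coda is the first, slowed down.
fast = [0.0, 0.1, 0.2, 0.3, 0.5]
slow = [t * 2 for t in fast]
print(coda_features(fast))  # (0.5, (0.2, 0.2, 0.2, 0.4), False)
print(coda_features(slow))  # same rhythm tuple, tempo of 1.0
```

The point of the normalization is the one the paper makes: rhythm and tempo are independent axes, so a whale can produce the same click pattern quickly or slowly and a listener can still recognize it.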
Here's a key part from the paper:
We show that codas exhibit contextual and combinatorial structure. First, we report previously undescribed features of codas that are sensitive to the conversational context in which they occur, and systematically controlled and imitated across whales. We call these rubato and ornamentation.
Second, we show that codas form a combinatorial coding system in which rubato and ornamentation combine with two context-independent features we call rhythm and tempo to produce a large inventory of distinguishable codas.
Sperm whale vocalisations are more expressive and structured than previously believed, and built from a repertoire comprising nearly an order of magnitude more distinguishable codas.
One thing to bear in mind is that human sounds (also called phonemes) aren't wholly analogous to whale utterances. MIT researcher Jacob Andreas explained to The Register that "the physiological processes underlying them are quite different – a coda is much longer than a typical phoneme, much of this structure is temporal, and I don't think we can draw clear analogs between any of these features."
"The number of codas is in the same ballpark of the number of phonemes in some languages," Andreas observed, "but we don't actually know whether a coda is like a phoneme, or a word, or a sentence, or something else we don't have a name for."
Machine learning could be used to further analyze languages, animal and human
As noted, uncovering the phonetics of whale vocalizations doesn't on its own reveal what codas actually mean.
"There's a lot that goes on in these vocalizations (whales chorusing together) that's quite different from anything in human language," Andreas pointed out. "But definitively answering that question will require characterizing what information is carried by these vocalizations, which is the next big direction we're pushing on."
While Andreas ascribed the conclusions of the study more to "really good visualizations" made by his co-author Pratyusha Sharma than to the "very simple" algorithm used in the paper, future studies of both animal and human language may be boosted by AI. One such use case is a general toolkit for analyzing the structure of unknown animal communication systems, which MIT researchers and Project CETI are actively investigating.
Although human communication systems are much better understood than animal ones, whales included, there are still open questions that machine learning could help answer – for instance, whether there was one original human language, or whether our brains have a universal grammar, the biological basis for language theorized by linguist Noam Chomsky. ®