Brainwaves rock! Scientists decode Pink Floyd tune straight from the noggin
First up: Another Brick in the Wall, Pt. 1
A group of scientists say they are the first to reconstruct a recognizable song from data collected directly from the brain, by monitoring electrical activity and modeling the resulting patterns with regression-based decoding models.
That song is Pink Floyd's "Another Brick in the Wall, Pt. 1" and the question on the mind of the researchers was:
What information about the outside world can be extracted from examining the activity elicited in a sensory circuit and which features are represented by different neural populations?
The research appeared on Tuesday in PLOS Biology. Between 2009 and 2015, the 190.72 seconds of the song were delivered through in-ear monitor headphones, at a volume individually comfortable for each of the 29 patients.
Because all patients had pharmacoresistant epilepsy, they were each already equipped with electrodes placed on the surface of their brain to look for seizures before the team's research even began. Collectively the group had 2,668 electrodes.
As the music played, the electrodes recorded an intracranial electroencephalography (iEEG) data set, which the boffins examined to determine the parts of the brain stimulated by certain frequencies within the song.
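The paper only describes this encoding analysis at a high level, but the basic idea can be sketched with synthetic data: correlate each electrode's recorded activity with the energy of one of the song's frequency bands, and flag the electrodes whose activity tracks that band. All numbers and data here are illustrative stand-ins, not the authors' actual pipeline.

```python
# Illustrative sketch of an iEEG encoding analysis: find electrodes
# whose activity correlates with one frequency band of the song.
# All data below is synthetic; shapes echo the article's figures.
import numpy as np

rng = np.random.default_rng(1)

n_timepoints, n_electrodes = 5000, 2668

# Stand-in for one frequency band's energy over time in the song.
band_energy = rng.standard_normal(n_timepoints)

# Stand-in iEEG: most electrodes are noise; 347 of them are made to
# weakly track the band, mirroring the count reported in the study.
ieeg = rng.standard_normal((n_timepoints, n_electrodes))
ieeg[:, :347] += 0.5 * band_energy[:, None]

# Pearson correlation between the band and every electrode at once.
z_band = (band_energy - band_energy.mean()) / band_energy.std()
z_ieeg = (ieeg - ieeg.mean(axis=0)) / ieeg.std(axis=0)
corr = z_ieeg.T @ z_band / n_timepoints

# Electrodes whose correlation clears a (toy) significance threshold.
significant = int(np.sum(np.abs(corr) > 0.2))
print(significant)  # 347
```

In the real study, significance was established per electrode with proper statistics rather than a fixed correlation cutoff, and the neural feature of interest was band-limited activity rather than the raw trace.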
Of the electrodes, 347 ended up being significant in encoding the song's acoustics. They were concentrated in three distinct regions, with a higher proportion skewed toward the right side of the brain. The 16th-note rhythm guitar pattern present in the song particularly activated a certain region of the brain in the temporal lobe – a section linked with processing auditory information and encoding memory.
"These results are in accord with prior research, showing that music perception relies on a bilateral network, with a relative right lateralization," wrote the study authors.
The reconstruction of the song involved training 128 models, each restoring a single frequency band, which were eventually assembled into a whole. Previous studies had done something similar with speech but didn't capture enough nuance to piece together music.
"Music is core to human experience, yet the precise neural dynamics underlying music perception remain unknown," said the researchers. Indeed, music is layered with complexity. Notes, rhythm, harmony, and chords all play a role in evoking an emotional response to music in addition to the phonemes, syllables, words, semantics, and syntax that speech offers.
"Another Brick in the Wall, Pt. 1" was a particularly apt song choice. Its crafted instrumentals complemented by a comparatively sparse 41 seconds of lyrics make it what the boffins described as a "rich and complex auditory stimulus."
Plus, it was reportedly tolerable to the oldies in the group.
The reconstructed bits are available to listen to for anyone who doesn't find it creepy to listen to classic rock as interpreted in someone else's head.
Once that hurdle is passed, those with an ear for Pink Floyd's 11th album can pick up a distorted version of the melody, an appropriate rhythm and even some lyrics.
Had there been more electrodes, or had they been more precisely placed, the researchers believe the reconstructed song would likely be even clearer. Further work could achieve this, as well as vary the decoding models' features and targets or add a behavioral dimension.
Applications of such research could help those struggling to communicate due to paralysis from stroke, diseases like ALS, or other causes, by artificially replicating elements of speech more faithfully than current technologies allow. ®