
Brain Activity Translated Into Snippets Of Classic Rock Song In Breakthrough Study

The sound quality might not be quite ready for The Great Gig in the Sky, but you should still take some Time to listen.

Laura Simmons - Editor and Staff Writer

Laura is an editor and staff writer at IFLScience. She obtained her Master's in Experimental Neuroscience from Imperial College London.

3D rendering of human brain with musical notation coming out

We were so excited when we first heard these recordings, we had A Momentary Lapse of Reason.

Image credit: goa novi/Shutterstock.com

Music is such a central part of what it means to be human, yet there’s so much scientists don’t know about what goes on in our brains when we listen to our favorite tunes. Now, a study has broken new ground by showing that it is possible to reconstruct a song that someone was hearing from only their brain activity patterns – and if you think this sounds like sci-fi, you can take a listen for yourself.

Beyond a greater understanding of how the brain perceives music, there’s another strand to this research. Brain-computer interfaces are advancing all the time. For people who have lost the ability to speak due to a brain injury or illness, there are devices that can help them to communicate, such as the one used by the late Stephen Hawking.


Versions of these devices, sometimes referred to as neuroprostheses, have been developed to allow people with paralysis to type text by imagining writing it by hand, or to spell out sentences using just their thoughts. But when it comes to speech, one thing that’s been notoriously hard to capture is the rhythm and emotion behind the words, known as prosody. The best we’ve been able to do comes out sounding distinctly robotic.

“Right now, the technology is more like a keyboard for the mind,” said lead author Ludovic Bellier in a statement. “You can't read your thoughts from a keyboard. You need to push the buttons. And it makes kind of a robotic voice; for sure there's less of what I call expressive freedom.”

The team behind the new study looked to music, which naturally includes rhythmic and harmonic components, to try to create a model for decoding and reconstructing a more prosodic sound. And luckily, there was a perfect dataset just waiting to be analyzed.

Over a decade ago, 29 patients with treatment-resistant epilepsy took part in a study in which recordings of their brain activity were taken – using electrodes inside their brains – while they listened to a three-minute segment of the Pink Floyd classic Another Brick in the Wall, Part 1.


At that time, in 2012, UC Berkeley professor Robert Knight was part of a team that was the first to reconstruct words a person was hearing from their brain activity alone. The field has moved on apace since then, and Knight led the new study with Bellier, this time tackling the problem of music perception.

Bellier reanalyzed the recordings and used artificial intelligence to come up with a model that could decode the brain activity recorded from the auditory cortex, and use it to reconstruct a sound waveform that aimed to reproduce the music the person had been listening to at the time.

spectrogram of original song (left), brain showing representative activity pattern as colored dots (center), reconstructed spectrogram (right)
The left panel shows the spectrogram of the original song the patients listened to, and the center demonstrates a typical neural activity pattern. The researchers used only these patterns to decode and reconstruct a spectrogram like that on the right, which is recognizable as the original song.
Image credit: Ludovic Bellier, PhD (CC BY 4.0)
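
For the curious, here is a minimal sketch of what this general kind of decoder can look like: a regularized linear model that predicts each frequency bin of the song's spectrogram from a short window of preceding brain activity. The data shapes, lag window, and choice of ridge regression here are illustrative assumptions, not the study's actual pipeline.

```python
# Toy sketch of spectrogram decoding from neural activity.
# All shapes and the random stand-in data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_electrodes, n_freq_bins, lag = 2000, 64, 32, 10

# Stand-ins for real data: neural activity over time, and the
# spectrogram of the song the listener heard.
neural = rng.standard_normal((n_samples, n_electrodes))
spectrogram = rng.standard_normal((n_samples, n_freq_bins))

# Build lagged features so each prediction sees a short window of
# preceding brain activity rather than a single time point.
X = np.hstack([np.roll(neural, k, axis=0) for k in range(lag)])[lag:]
y = spectrogram[lag:]

X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)

# One regularized linear model maps the neural window to all
# frequency bins of the spectrogram at once.
model = Ridge(alpha=1.0).fit(X_train, y_train)
reconstruction = model.predict(X_test)

# Correlating reconstructed and original spectrograms is a common
# way to score this kind of decoder.
r = np.corrcoef(reconstruction.ravel(), y_test.ravel())[0, 1]
print(f"reconstruction correlation: {r:.3f}")
```

With real recordings, the reconstructed spectrogram can then be inverted back into an audible waveform, which is what produces recognizable snippets of the song.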


For Bellier, a lifelong musician himself, the prospect was compelling: “You bet I was excited when I got the proposal.”

And the results are impressive.


In the reconstructed audio, the rhythm and tune are recognizable, and even the words, “All in all it was just a brick in the wall,” can just be made out.

The research also allowed the team to identify new areas of the brain involved in detecting rhythm – in this case, the thrumming of the guitar. The most important seemed to be part of the right superior temporal gyrus, which sits in the auditory cortex just behind and above the ear.

They also discovered that, while language perception happens more on the left side of the brain, music perception has a bias towards the right. 


Bellier and Knight, along with their co-authors, are hopeful the project could lead to an improvement in brain-computer interface technology.

“As this whole field of brain machine interfaces progresses, this gives you a way to add musicality to future brain implants for people who need it,” explained Knight. “It gives you an ability to decode not only the linguistic content, but some of the prosodic content of speech, some of the affect. I think that's what we've really begun to crack the code on.”

It would be particularly useful to be able to make the brain recordings noninvasively, but Bellier explained that we’re not there yet: “Noninvasive techniques are just not accurate enough today. Let's hope, for patients, that in the future we could, from just electrodes placed outside on the skull, read activity from deeper regions of the brain with a good signal quality. But we are far from there.”

One of These Days, that might be possible. But hearing music decoded only from brain activity still left us Lost for Words. And, as the authors concluded in their paper, they have certainly added “another brick in the wall of our understanding of music processing in the human brain.” 


The study is published in PLOS Biology.


ARTICLE POSTED IN

healthHealth and Medicinehealthneuroscience
  • tag
  • brain,

  • music,

  • brain activity,

  • neuroscience,

  • sound,

  • electrodes,

  • perception,

  • language processing,

  • brain-computer interface

FOLLOW ONNEWSGoogele News