Brain Signals Reconstruct Pink Floyd Song


A recognizable Pink Floyd song was reconstructed from epilepsy patients' brain recordings.

From the recorded activity of intracranial electroencephalography (iEEG) electrodes placed on the brains of 29 epilepsy surgery patients, the phrase “all in all, it was just a brick in the wall” from Pink Floyd came through, its rhythm intact and its lyrics muffled yet decipherable, reported Ludovic Bellier, PhD, of the University of California, Berkeley, and co-authors in PLoS Biology.

“We found it was feasible to reconstruct a recognizable song from neural activity with a relatively small dataset — a short duration of only 3 minutes, even in single patients with as little as 10 or 20 properly located electrodes — which is promising for future BCI [brain-computer interface] applications,” Bellier told MedPage Today.

“Also, we identified a novel superior temporal gyrus subregion tuned to musical rhythm that, upon further confirmatory research, could constitute a target of choice for music decoding,” he added.

The findings show it’s possible to decode neural activity that captures musical elements such as rhythm, stress, accent, and intonation, which carry meaning that lyrics alone don’t convey, noted co-author Robert Knight, MD, also of the University of California, Berkeley.

“One of the things for me about music is it has prosody and emotional content,” Knight said.

“As this whole field of brain-machine interfaces progresses, this gives you a way to add musicality to future brain implants for people who need it, someone who’s got ALS or some other disabling neurological or developmental disorder compromising speech output,” Knight continued.

“It gives you an ability to decode not only the linguistic content, but some of the prosodic content of speech, some of the affect,” he said.

A number of studies have decoded speech from brain signals, and functional MRI studies have shown that neural decoding of speech and melodies depends on activity patterns in left and right auditory cortex regions, respectively. While music and speech are both complex acoustic signals relying on a hierarchical information structure, music has been more challenging to reconstruct, Bellier noted.

“Another Brick in the Wall, Part 1” filled the surgery suites at Albany Medical Center in New York when the epilepsy patients underwent intracranial surgery. The Pink Floyd song was used because it constituted a rich and complex auditory stimulus. Passive listening was chosen so researchers could observe how patients detected and perceived musical information.

Overall, neural activity was recorded from 2,668 electrodes laid directly on the cortical surface of the 29 patients. Activity at 347 electrodes was related to the music, mostly in the superior temporal gyrus, the sensorimotor cortex, and the inferior frontal gyrus.

Encoding and decoding signals showed a right-hemisphere dominance for music perception, with a primary role of the superior temporal gyrus. “Language is more left brain,” Knight noted. “Music is more distributed, with a bias toward right.”

The researchers also identified a unique superior temporal gyrus subregion that was attuned to the guitar rhythm in the Pink Floyd song. To further study how the brain perceived music, they removed data from different groups of electrodes and compared the resulting reconstructions with the actual song. Reconstructions suffered most when electrodes from the right superior temporal gyrus were removed.
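For readers who want a concrete picture of this decode-and-ablate analysis, here is a minimal sketch, not the study's code: it assumes hypothetical arrays neural_hfa (time x electrodes of high-frequency activity) and song_spec (time x frequency bins of the song's auditory spectrogram), uses scikit-learn ridge regression as a simple stand-in for the study's decoding models, and scores each reconstruction by its correlation with the actual spectrogram, with and without a chosen electrode subset.

# Minimal sketch (not the study's code): reconstruct a song spectrogram from
# electrode activity and measure how much an electrode subset contributes.
# Hypothetical inputs:
#   neural_hfa : (n_timepoints, n_electrodes) high-frequency activity
#   song_spec  : (n_timepoints, n_freq_bins) auditory spectrogram of the song
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_time, n_elec, n_freq = 3000, 347, 32          # toy sizes, not the real data
neural_hfa = rng.standard_normal((n_time, n_elec))
song_spec = (neural_hfa[:, :n_freq] @ rng.standard_normal((n_freq, n_freq))
             + 0.5 * rng.standard_normal((n_time, n_freq)))

def reconstruction_score(X, y):
    """Fit a ridge decoder and return the mean correlation between the
    predicted and actual spectrogram across frequency bins."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)
    pred = Ridge(alpha=1.0).fit(X_tr, y_tr).predict(X_te)
    corrs = [np.corrcoef(pred[:, f], y_te[:, f])[0, 1] for f in range(y.shape[1])]
    return float(np.mean(corrs))

full_score = reconstruction_score(neural_hfa, song_spec)

# "Ablate" a hypothetical electrode subset (e.g., right superior temporal gyrus)
# by dropping its columns and refitting; a larger score drop means those
# electrodes carried more of the musical information.
stg_idx = np.arange(0, 40)                       # placeholder electrode indices
kept = np.setdiff1d(np.arange(n_elec), stg_idx)
ablated_score = reconstruction_score(neural_hfa[:, kept], song_spec)

print(f"full model r = {full_score:.2f}, without subset r = {ablated_score:.2f}")

In the study itself, the models described in the paper include nonlinear decoders, and the real inputs are preprocessed iEEG recordings rather than the synthetic arrays used in this sketch.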

Recordings like this may one day help reproduce the musicality of speech that’s missing from today’s robot-like reconstructions, Bellier noted.

“We provide insights both on the fundamental side, by furthering our understanding of how the human brain supports music perception, and on the applied side, by paving the way for including prosodic elements into speech decoders,” he said.

  • Judy George covers neurology and neuroscience news for MedPage Today, writing about brain aging, Alzheimer’s, dementia, MS, rare diseases, epilepsy, autism, headache, stroke, Parkinson’s, ALS, concussion, CTE, sleep, pain, and more.

Disclosures

This work was supported by the Fondation Pour l’Audition, the NIH’s National Institute of Biomedical Imaging and Bioengineering, and the NIH’s National Institute of Neurological Disorders and Stroke.

The researchers reported no competing interests.

Primary Source

PLoS Biology

Source Reference: Bellier L, et al “Music can be reconstructed from human auditory cortex activity using nonlinear decoding models” PLoS Biol 2023; DOI: 10.1371/journal.pbio.3002176.
