Scientists have for the first time reconstructed a recognizable song from recordings of people’s brain waves. The team played a three-minute segment of Pink Floyd’s “Another Brick in the Wall” to volunteer patients who had electrodes placed directly on the surface of their brains while undergoing surgery for epilepsy.
The electrodes monitored and recorded the patients’ brain waves as they listened to the song, and the team then used artificial intelligence to decode those recordings into a reproduction of its sounds and words. The audio quality is rough, but the lyrics, rhythm and melody of the song are still discernible. “It sounds a bit like they’re speaking underwater, but it’s our first shot at this,” said Robert Knight, a neurologist at the University of California, Berkeley, who conducted the study with postdoctoral fellow Ludovic Bellier.
The goal is to improve methods of decoding communication from brain recordings, helping people who have lost the ability to speak communicate in their own voice. “Music, by its very nature, is emotional and prosodic — it has rhythm, stress, accent and intonation. It contains a much bigger spectrum of things than limited phonemes in whatever language, that could add another dimension to an implantable speech decoder,” Knight said.