AI recreates clip of Pink Floyd song from recordings of brain activity
An artificial intelligence can guess what a song sounds like based on patterns of brain activity recorded while people were listening to it
By Carissa Wong
15 August 2023
An AI recreated a clip of Pink Floyd’s song Another Brick in the Wall, Part 1 from brain recordings
Smith Collection/Gado/Alamy
An artificial intelligence has created a passable cover of a Pink Floyd song by analysing brain activity recorded while people listened to the original. The findings further our understanding of how we perceive sound and could eventually improve devices for people with speech difficulties.
Robert Knight at the University of California, Berkeley, and his colleagues studied recordings from electrodes that had been surgically implanted onto the surface of 29 people’s brains to treat epilepsy.
The participants’ brain activity was recorded while they listened to Another Brick in the Wall, Part 1 by Pink Floyd. By comparing the brain signals with the song, the researchers identified recordings from a subset of electrodes that were strongly linked to the pitch, melody, harmony and rhythm of the song.
They then trained an AI to learn links between brain activity and these musical components, excluding a 15-second segment of the song from the training data. The trained AI generated a prediction of the unseen song snippet based on the participants’ brain signals. The spectrogram – a visualisation of the audio waves – of the AI-generated clip was 43 per cent similar to the real song clip.
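The decoding approach described above can be sketched very loosely as a regression problem: fit a model mapping neural features to spectrogram frames on most of the song, hold out a contiguous segment, then predict and score that segment. The sketch below uses synthetic data and plain linear least squares; the electrode count, frame rate and frequency-bin count are illustrative assumptions, not figures from the study.

```python
# Minimal sketch (not the study's actual pipeline): predict a held-out
# spectrogram segment from neural features with linear regression.
# All data is synthetic; dimensions are stand-ins for the setup above.
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)

n_frames = 1900       # ~190 s of song at 10 spectrogram frames/s (assumed)
n_electrodes = 32     # music-responsive electrode subset (assumed)
n_freq_bins = 16      # spectrogram frequency bins (assumed)

# Synthetic "ground truth": neural features linearly drive the spectrogram.
X = rng.standard_normal((n_frames, n_electrodes))       # neural features
W_true = rng.standard_normal((n_electrodes, n_freq_bins))
Y = X @ W_true + 0.5 * rng.standard_normal((n_frames, n_freq_bins))

# Hold out a contiguous "15-second" segment (150 frames) from training.
hold = slice(1000, 1150)
train = np.ones(n_frames, dtype=bool)
train[hold] = False

# Fit the decoder on the remaining frames, predict the held-out segment.
W, *_ = lstsq(X[train], Y[train], rcond=None)
Y_pred = X[hold] @ W

# Score: correlation between predicted and true held-out spectrograms.
r = np.corrcoef(Y_pred.ravel(), Y[hold].ravel())[0, 1]
print(f"held-out spectrogram correlation: r = {r:.2f}")
```

The study's reported 43 per cent figure is a spectrogram-similarity measure on real neural data; on this clean synthetic data a linear fit recovers the signal far more closely, which is why the sketch should be read only as an illustration of the train/hold-out/predict structure.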
Here is the original song clip, lightly processed to allow a fair comparison with the AI-generated clip, which loses some quality when converted from a spectrogram back into audio:
https://images.newscientist.com/wp-content/uploads/2023/08/14171701/audio-file-1.wav