
XP News: Google researchers use AI to turn brain scans into music

New research by Google and Osaka University demonstrates the potential of AI to create streamable music from human thoughts.

Researchers from Google and Osaka University in Japan have published a study showing that streamable music can be created from human thoughts.

Five volunteers underwent fMRI scans while listening to music from 10 different genres, and images of their brain activity were recorded. Using Brain2Music, an AI model developed for the study, the researchers fed that brain activity into Google’s music generation tool MusicLM to create songs resembling the ones the volunteers were listening to.
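
For readers curious about what such a decoding pipeline might look like, below is a minimal, purely illustrative Python sketch: a linear decoder that maps fMRI features to a music-embedding space, which a generative model like MusicLM could then use as a conditioning signal. All variable names, data shapes and the ridge-regression choice are assumptions made for illustration, not details taken from the study.

```python
# Illustrative sketch only, not the study's actual code: a linear decoder that
# maps fMRI voxel features to a music-embedding space. The shapes and the use
# of ridge regression are assumptions chosen for demonstration.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical training data: fMRI responses (clips x voxels) paired with
# embeddings of the music clips heard during each scan (clips x dimensions).
fmri_responses = rng.standard_normal((240, 5000))   # e.g. 240 clips, 5000 voxels
music_embeddings = rng.standard_normal((240, 128))  # e.g. 128-dim audio embeddings

# Fit a regularised linear map from brain activity to the embedding space.
decoder = Ridge(alpha=10.0)
decoder.fit(fmri_responses, music_embeddings)

# Decode an unseen scan into a predicted music embedding; a generative model
# such as MusicLM could then be conditioned on this prediction.
new_scan = rng.standard_normal((1, 5000))
predicted_embedding = decoder.predict(new_scan)
print(predicted_embedding.shape)  # (1, 128)
```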

The AI-generated music reflects the stimuli the volunteers experienced, resembling the original clips in semantic features such as mood, genre and instrumentation.

While spotlighting Google’s AI, the study also contributes to our understanding of how musical stimuli interact with brain activity. It could help pave the way for software capable of translating human thoughts into musical compositions.
