The process of composing and mixing music is opaque to most people. Though the rise of programs such as Audacity has made things a little more understandable, it is still confusing to many. A collaboration between the podcast Song Exploder and Google hopes to change that using virtual reality (VR).
The new collaboration, called Inside Music, takes a detailed look at the musical output of artists such as Phoenix, Perfume Genius, Natalia Lafourcade, Ibeyi, Alarm Will Sound and Clipping. The project aims to bring music to life, using spatial audio to show users the layers that make up a song.
Users can load up Inside Music and select a song to see its various elements laid out around them, with things such as the piano, guitar and vocals separated out. Individual layers can even be turned on or off, allowing the user to appreciate each element in isolation.
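The layer-toggle mechanic can be illustrated with a small sketch. The snippet below models a song as a set of named stems, each with a gain that a mute toggle flips between audible and silent; in the real project those gains would feed spatialised Web Audio nodes, but the mixer object, stem names and `toggle` function here are hypothetical illustrations, not Inside Music's actual code.

```javascript
// Minimal model of a stem mixer: each layer of a song gets a gain
// value that a toggle flips between audible (1) and muted (0).
// The API shape and stem names are illustrative assumptions.
function createMixer(stemNames) {
  const stems = new Map(stemNames.map((name) => [name, { gain: 1 }]));
  return {
    // Flip a single layer on or off, as a listener would in the VR scene.
    toggle(name) {
      const stem = stems.get(name);
      if (!stem) throw new Error(`unknown stem: ${name}`);
      stem.gain = stem.gain === 0 ? 1 : 0;
      return stem.gain;
    },
    // List the layers that are currently audible.
    audible() {
      return [...stems].filter(([, s]) => s.gain > 0).map(([n]) => n);
    },
  };
}

// Example: isolate the vocals by muting every other layer.
const mixer = createMixer(['vocals', 'guitar', 'piano', 'drums']);
mixer.toggle('guitar');
mixer.toggle('piano');
mixer.toggle('drums');
console.log(mixer.audible()); // → ['vocals']
```

In a browser, each stem's gain would drive a `GainNode` connected to a `PannerNode`, so muting a layer silences its position in the 3D mix.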
Google have used WebVR, the open browser-based VR standard, for the project, meaning that all that is required to experience Inside Music is a web browser. It can be viewed with a VR headset, or simply on a PC or smartphone.
The code for Inside Music has been published on GitHub, so anyone who makes music and can do a little JavaScript programming, or can find someone who can, will be able to experience their own music in VR.
You can view a video on how Inside Music works below. Further information can be found at the Inside Music website.
VRFocus will continue to report on new innovations in VR technology.