Soundscapes, VR, & 360 Video

Over the past few months we’ve been busy presenting our work at the AES and ISSTA conferences, through masterclasses, workshops, and public demonstrations such as Dublin Science Gallery’s event, ‘Probe: Research Uncovered’, at Trinity College Dublin last month. In our research, we are continuing to evaluate different recording techniques and microphones, and we have also recently acquired a new 360 camera system (a GoPro Omni, to be exact) with a much simpler and faster capture and video stitching process than our experimental rig (although it is monoscopic only, of course). We’ll have more information on that camera system in the coming weeks.

As a composer, one of the things I find most fascinating about VR and 360 video is its relationship to soundscape composition. Composers have been making music from field recordings for many decades: from the electroacoustic nature photographs of Luc Ferrari, to the acoustic ecology of the World Soundscape Project (WSP) and composers such as Murray Schafer, Barry Truax, and Hildegard Westerkamp, to the music and documentary work of Chris Watson, to give just a few examples (the Ableton Blog has a nice article on the art of field recording here).

Of course, in the world of VR and 360 video the soundscape serves an important functional role, increasing the sense of immersion and presence in the reproduced environment. In addition, the location of sounds can be used to direct visual attention towards notable elements in the scene. It has been said of cinema that “sound is half the picture”, but this is perhaps even more true in VR!
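To illustrate how a sound’s position might be used to draw the eye towards part of a scene, here is a minimal sketch using the browser’s Web Audio API (not a tool discussed in this post); the file name, coordinates, and function name are purely illustrative.

```typescript
// A minimal sketch (browser Web Audio API): place a mono recording at a 3D
// position relative to the listener so that, over headphones, it appears to
// come from a particular direction in the scene.
async function playSpatialCue(ctx: AudioContext, url: string): Promise<void> {
  const response = await fetch(url);
  const buffer = await ctx.decodeAudioData(await response.arrayBuffer());

  const source = new AudioBufferSourceNode(ctx, { buffer });

  // HRTF panning gives a binaural rendering suited to headphone playback.
  const panner = new PannerNode(ctx, {
    panningModel: "HRTF",
    distanceModel: "inverse",
    positionX: 2,  // two metres to the listener's right...
    positionY: 0,
    positionZ: -3, // ...and three metres ahead (the listener faces -z by default)
  });

  source.connect(panner).connect(ctx.destination);
  source.start();
}

// e.g. cue a sound just outside the current field of view to draw attention:
// playSpatialCue(new AudioContext(), "birdsong.wav");
```

In a real 360 player the panner (or listener) position would be updated as the viewer turns their head, so the cue stays anchored to the scene rather than to the headphones.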

The combination of these two areas is therefore deeply interesting to me, both in terms of how we might create musical soundtracks for 360 videos and VR games from the natural recorded soundscape and sound design, and in terms of how we might use 360 video to present soundscape compositions.

Although it may seem somewhat counter-intuitive, this ability to control, and perhaps also remove, the visual component can be used to focus attention on the audible soundscape in a potentially interesting way. While loudspeakers or headphones can provide an effective sense of envelopment within an audio scene, there is inevitably a conflict between the visual perception of the loudspeakers or reproduction environment and the recorded soundscape. In the context of 360 video, the composer instead has complete control over both the visual and audible scene, which opens up some interesting creative possibilities.

This type of environmental composition, which makes use of both 360 video and spatial soundscapes, is the next focus of this project, and we should have new work online in the coming months. In the meantime, however, I’d like to recommend an award-winning VR experience which has inspired my work in this area. Notes on Blindness is a documentary film based on the audio diaries of John Hull and his emotive descriptions of the sensory and psychological experience of losing his sight. The accompanying VR presentation uses spatial audio and sparse, dimly lit 3D animations to represent the experience of blindness in a highly evocative manner. Released for the Samsung platform earlier this year, the VR experience is now available as a free app for iOS or Android and is highly recommended.
