Zoom H2n Conversion Plugin

My colleague Brian Fallon has recently created a First-Order Ambisonic Encoder Reaper plugin for the Zoom H2n portable recorder, which you can download from the link below:

H2n-FOA-Encoder-Package.zip

While the H2n is far from the best Ambisonic microphone available, it is certainly one of the most affordable and produces surprisingly usable results given its cost (although, due to the geometry of the H2n’s microphone capsules, it is horizontal only). Zoom released a firmware update for the H2n earlier this year which allows horizontal-only Ambi-X audio to be recorded directly on the recorder. However, it can sometimes be useful to record in the original 4-channel mode (so you have access to the original stereo tracks) and convert to Ambisonics later. In addition, if you made 4-channel recordings with the H2n prior to the release of this firmware update, then this plugin can also be used to convert these into Ambisonics.
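To make "horizontal-only" a little more concrete, here is a minimal Python sketch (not the plugin's code; the function name and parameters are purely illustrative) of how a mono source is encoded into horizontal-only first-order Ambi-X, i.e. ACN channel order with SN3D normalization and a silent height channel:

```python
import numpy as np

def encode_horizontal_ambix(mono, azimuth_deg):
    """Encode a mono signal as horizontal-only first-order Ambi-X.

    Output channels are in ACN order (W, Y, Z, X) with SN3D normalization.
    Z is silent because the encoding carries no height information.
    Azimuth is measured anticlockwise from the front, in degrees.
    """
    az = np.deg2rad(azimuth_deg)
    w = mono                      # omnidirectional component
    y = mono * np.sin(az)         # left/right figure-of-eight
    z = np.zeros_like(mono)       # no height in a horizontal-only encoding
    x = mono * np.cos(az)         # front/back figure-of-eight
    return np.stack([w, y, z, x])

# Example: place a 1 kHz tone 90 degrees to the left
sr = 48000
t = np.arange(sr) / sr
bformat = encode_horizontal_ambix(np.sin(2 * np.pi * 1000 * t), 90.0)
```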

Brian’s plugin is for the DAW Reaper and can be used to convert these H2n 4-channel recordings into horizontal B-format Ambisonics; it also allows you to choose from various output channel orders and normalization schemes (Furse-Malham, Ambi-X, etc.). The package includes a sample Reaper project and a manual with details on the recording and plugin setup.
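As a rough illustration of what those output options mean (and assuming nothing about the plugin's internals), at first order the two conventions differ only in channel ordering and in the gain of the omnidirectional channel, so converting between them is just a reshuffle and a scale. A hedged Python sketch:

```python
import numpy as np

def fuma_to_ambix(b_fuma):
    """Convert first-order FuMa B-format (W, X, Y, Z) to Ambi-X (ACN/SN3D).

    b_fuma: array of shape (4, n_samples) in FuMa channel order.
    At first order the directional gains are identical in both schemes;
    only the channel order changes (W, X, Y, Z -> W, Y, Z, X) and FuMa's
    -3 dB W channel is scaled back up by sqrt(2).
    """
    w, x, y, z = b_fuma
    return np.stack([w * np.sqrt(2.0), y, z, x])

def ambix_to_fuma(b_ambix):
    """Inverse conversion: Ambi-X (W, Y, Z, X) back to FuMa (W, X, Y, Z)."""
    w, y, z, x = b_ambix
    return np.stack([w / np.sqrt(2.0), x, y, z])
```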

Note that if you own the older H2 recorder, which has a slightly different microphone arrangement, Daniel Courville’s VST and AU plugins can be used for conversion to B-format in a similar fashion.

 


Soundscapes, VR, & 360 Video

 

Over the past few months we’ve been busy presenting our work at the AES and ISSTA conferences, through masterclasses, workshops, and public demonstrations such as Dublin Science Gallery’s event, ‘Probe: Research Uncovered’, at Trinity College Dublin last month. In our research, we are continuing to evaluate different recording techniques and microphones, and we have also recently acquired a new 360 camera system (a GoPro Omni, to be exact) with a much simpler and faster capture and video-stitching process compared to our experimental rig (although monoscopic only, of course). We’ll have more information on that camera system in the coming weeks.

As a composer, one of the things I find most fascinating about VR and 360 video is its relationship to soundscape composition. Composers have been making music from field recordings for many decades: from the electroacoustic nature photographs of Luc Ferrari, to the acoustic ecology of The World Soundscape Project (WSP) and composers such as Murray Schafer, Barry Truax, and Hildegard Westerkamp, to the music and documentary work of Chris Watson, to give just a few examples (the Ableton Blog has a nice article on the Art of Field Recording here).

Of course, in the world of VR and 360 video the soundscape serves an important functional role as a means to increase the sense of immersion and presence in the reproduced environment. In addition, the location of sounds can be used to direct visual attention towards notable elements in the scene. It has been said of cinema that “sound is half the picture” but this is perhaps even more true in VR!

The combination of these two areas is therefore deeply interesting to me, both in terms of how we might create music soundtracks for 360 videos and VR games from the recorded natural soundscape and sound design, and in terms of how we might use 360 video for the presentation of soundscape compositions.

Although it may seem somewhat counter-intuitive, this ability to control, and perhaps also remove, the visual component can be used to focus attention on the audible soundscape in a potentially interesting way. While loudspeakers or headphones can provide an effective sense of envelopment within an audio scene, there is inevitably a conflict between the visual perception of the loudspeakers and/or reproduction environment and the recorded soundscape. In the context of 360 video, the composer has, in contrast, complete control over both the visual and audible scene, which opens up some interesting creative possibilities.

This type of environmental composition, which makes use of both 360 video and spatial soundscapes is the next focus of this project and we should have new work online in the coming months. However, in the meantime I’d like to recommend an award winning VR experience which has inspired my work in this area. Notes on Blindness is a documentary film based on the audio diaries of John Hull and his emotive descriptions of the sensory and psychological experience of losing sight and blindness. The accompanying VR presentation utilizes spatial audio and sparse, dimly lit 3D animations to represent this experience of blindness in a highly evocative manner. Released for the Samsung platform earlier this year, the VR experience is now available as a free app for iOS or Android and is highly recommended.