The second work composed specifically for the Trinity 360 project has now been released on YouTube and spatial audio should now be supported on most browsers and mobile devices (including iOS). The piece consists of an acoustic quartet (guitar, cello, flute and saxophone) arranged symmetrically around the central recording position in the debating chamber of Trinity College Dublin. Many, many thanks to Kate Ellis, Nick Roth, and Lina Andonovska for their fine performances.
This performance actually took place in June of last year, but as the recordings were used as part of our Ambisonic Microphone study (discussed in the last blog post here), we decided to complete that research before releasing the video of the performance.
The compositional aesthetic here follows a more traditional contrapuntal approach in the form of a modified round: a strict canon in which each part performs the same melody, but starting at different times. The title, A Round, Around, reflects both this approach and the spatial arrangement of the players, with rotations and other spatial effects created by passing musical material between consecutive instruments, as can be seen in the score excerpt below.
The recording presented here is based around a Core Sound TetraMic, together with monophonic spot microphones (AKG C414s) and room microphones (AKG C314s and Neumann TLMs) arranged in a circle of 1 m radius at 90-degree intervals, as shown below.
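To mix those mono spot mics into the Ambisonic scene, each one has to be encoded at the azimuth of its player. As a rough illustration (a minimal sketch, not the exact processing chain used for this recording), here is the standard FuMa first-order (B-format) encoding of a mono sample at a given direction:

```python
import math

def encode_first_order(sample, azimuth_deg, elevation_deg=0.0):
    """Encode a mono sample into FuMa first-order B-format (W, X, Y, Z).

    Azimuth is measured anticlockwise from the front; elevation upwards.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2.0)               # omni component (FuMa -3 dB weighting)
    x = sample * math.cos(az) * math.cos(el)  # front-back
    y = sample * math.sin(az) * math.cos(el)  # left-right
    z = sample * math.sin(el)                 # up-down
    return w, x, y, z

# With the four players at 90-degree intervals around the central mic,
# the spot mic of a player to the left might be encoded at 90 degrees:
w, x, y, z = encode_first_order(1.0, 90.0)
```

Note that YouTube's spatial audio expects the AmbiX convention (ACN channel order, SN3D normalisation) rather than FuMa, so a conversion step would be needed before upload.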
As always with this type of production, it is critically important to apply additional processing to the instrument spot mics so that the perceived distance of the audio broadly matches the visible distance of the performers in the video. This has been one of the biggest differences I’ve encountered between preparing a mix for 360 video and a traditional audio-only mix. It matters particularly if you are not mixing directly to picture, or are only previewing the video on a desktop display: in those scenarios, it is all too easy to underestimate the auditory distance required when the content is viewed on a VR headset.
The precise way in which we synthetically alter the perceived distance of these close, spot-mic recordings remains an under-explored topic, particularly in the context of 360 video and VR, and it is an area we are currently investigating. For this video, the instrument spot mic recordings were processed using a method suggested by Peter Craven and Michael Gerzon, one of the primary inventors of Ambisonics. This method uses a fixed pattern of early reflections as the primary means of altering the perceived distance, which makes it highly efficient and also allows the distance processing to be customized for different types of sounds. Here, the spot mics were processed using an implementation of Gerzon’s Distance Panpot created by one of our students on the MMT programme at Trinity College, Eoghan Tyrrel, and you can read more about his work here. While the results are pretty good for certain instruments (the guitar, for example), others are still perhaps perceived as sounding a little too close. This suggests that some form of source-specific distance processing might be worth investigating, particularly in terms of customizing the degree of amplitude attenuation and the number and pattern of early reflections for different types of instruments and sounds. This is an area of research we will be looking at in some detail over the coming years, particularly in the context of Augmented Reality (AR) applications.
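The general idea behind this family of distance-processing methods can be sketched in a few lines: attenuate the direct sound with distance, and mix in a small, fixed pattern of delayed early reflections. This is only an illustrative sketch, not Eoghan Tyrrel's implementation, and the delay/gain values are placeholders rather than Gerzon's published pattern:

```python
import numpy as np

def distance_process(dry, sr, distance_m,
                     reflections=((0.012, 0.35), (0.021, 0.25), (0.033, 0.18))):
    """Crude distance-processing sketch: inverse-distance attenuation of the
    direct sound plus a fixed pattern of early reflections (delay_s, gain).

    The reflection delays and gains here are illustrative placeholders.
    """
    direct_gain = 1.0 / max(distance_m, 1.0)  # 1/r law, clamped at 1 m
    out = direct_gain * dry.copy()
    for delay_s, gain in reflections:
        d = int(round(delay_s * sr))          # delays assumed > 0 samples
        delayed = np.zeros_like(dry)
        delayed[d:] = dry[:-d]
        out += gain * delayed
    return out

# Push a close-miked impulse back to a perceived distance of 3 m:
sr = 48000
dry = np.zeros(sr // 10)
dry[0] = 1.0
wet = distance_process(dry, sr, 3.0)
```

Source-specific tuning of the kind discussed above would then amount to varying the `reflections` pattern and the attenuation law per instrument.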
The video was captured with our original experimental camera system based around 12 GoPros, and stitched using the Autopano Video application, which is also used with the GoPro Omni camera system. As is pretty clear from the video, the GoPros performed reasonably well for the exterior shots, but far less so inside the venue, and the overall picture quality for the performance itself is relatively poor. This is perhaps to be expected from an action camera with low dynamic range like the GoPro, but once again it emphasizes the importance of lighting for 360 video shoots with these types of camera rigs. The production company Visualise have recently written an excellent blog post on different lighting strategies for 360 video shoots which is well worth a read.
Adobe Premiere Pro CC was used for the video edit itself, and Premiere now includes extensive support for 360 video monitoring, editing, and processing. We can now preview both Ambisonic audio and 360 video from within Premiere, and on export we can also add the metadata tags needed to tell YouTube that this is a 360 video with spatial audio (for more information on 360 video support in Premiere, see this article).
In addition, Adobe have recently acquired the Mettle SkyBox 360/VR tools which add a number of very useful features to Premiere when working with 360 video. While these plugins will eventually be incorporated directly into Premiere, for now anyone with a subscription to Adobe Creative Cloud can get the plugins for free by emailing Adobe (full details here).
The SkyBox suite includes a number of plugins specifically designed to work with 360 video, whether monoscopic or stereoscopic. This includes some practical post effects such as Blur, Denoise, Sharpen, and Glow, as well as more creative effects such as Colour Gradients and Fractal Noise which can be applied directly to 360 footage without any distortion along the seams. There are also plugins to enable the rotation of the 360 footage, and a number of transitions designed specifically for 360 content.
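One point worth flagging with those rotation tools: if you re-orient the 360 footage in post, the Ambisonic bed generally needs to be rotated by a matching yaw angle to keep audio and picture aligned (sign conventions vary between tools). As a minimal sketch of what that rotation does at first order, only the horizontal X/Y components mix; W and Z are untouched:

```python
import math

def rotate_bformat_yaw(w, x, y, z, yaw_deg):
    """Rotate a first-order B-format frame about the vertical axis.

    A positive yaw moves sources anticlockwise; the omni (W) and
    height (Z) components are unaffected by a pure yaw rotation.
    """
    t = math.radians(yaw_deg)
    x_r = x * math.cos(t) - y * math.sin(t)
    y_r = x * math.sin(t) + y * math.cos(t)
    return w, x_r, y_r, z

# A source dead ahead (X=1, Y=0) rotated 90 degrees lands at the side:
w, x, y, z = rotate_bformat_yaw(0.707, 1.0, 0.0, 0.0, 90.0)
```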
However, on a practical level, perhaps the most useful plugin in this suite is the one that warps standard text titles created in Premiere so that they display correctly when played back on a 360 video platform such as YouTube. Mettle have a number of tutorial videos covering all of these plugins; here’s one which outlines the relatively simple workflow for creating 360 video text titles in Premiere.
So, if you are working with 360 video in Adobe Premiere, I strongly recommend you pick up a copy of these plugins.
M. A. Gerzon, “The Design of Distance Panpots,” presented at the 92nd AES Convention, Vienna, 1992, Preprint 3308.