“The scope of this project is huge”: Inside the Royal Shakespeare Company’s interactive Dream machine
Hailed as next-generation virtual theatre, the Royal Shakespeare Company’s production of Dream, which streams March 12-20, is a live performance that combines motion capture with an interactive symphonic score.
Inspired by Shakespeare’s A Midsummer Night’s Dream, Dream promises not only to be a technical tour de force, but a pointer to how live virtual performances could evolve moving forward.
Pre-pandemic, the idea was to create Dream as a location-based immersive installation, Luke Ritchie, Head of Digital Innovation for the London-based Philharmonia Orchestra, told Audio Media International. “We were going to use the d&b Soundscape system inside an abandoned department store in Stratford-upon-Avon.”
“The scope of this project is huge,” says Ritchie. “Dream is the culmination of nearly two and a half years of R&D. Ultimately it was the £4m grant from Innovate UK that allowed us to formalise partnerships that had been developing over years into a consortium of 13 organisations. That funding has allowed us all to take the risk required to genuinely innovate, and there is a wealth of R&D that we have each pursued that will not even come to light in the final show, but we hope to share more widely across the sector.”
These innovations include spatial audio recording techniques for immersive formats (augmented, virtual and mixed reality) that create mixes giving an audience full six degrees of freedom (6DOF).
“For example, we recorded a string quartet in such a way that you could walk around the quartet in VR or MR and get close to each individual instrument, testing different microphone capture and spatialisation techniques.”
This in turn led to the development of audio playback technologies: “During this project we’ve experimented with mapping audio to volumetric image capture with Intel, and to motion capture with Portsmouth University, as a key component of Dream, for VR, AR and MR. Throughout all of this we’ve had to learn how to build musical experiences in real-time game engines (Epic Games’ Unreal Engine). For Dream we’ve brought on board Anastasia Devana as Audio Director on the project, who we first met in that post at Magic Leap, and she’s been such a fantastic person to work with.”
“Finally, a core aim for us was to try to reimagine composition for immersive formats. As well as being the Philharmonia’s Principal Conductor & Artistic Advisor, Esa-Pekka Salonen is a fantastic composer in his own right, and he’s genuinely curious about how we compose music for immersive worlds where the audience and performers can interact with and change the story being told.”
Salonen brought Jesper Nordin to the project: not only an orchestral composer, but also a coder, and the creator of the interactive music engine Gestrument. It’s Gestrument that adds interactivity to the Dream production, explains Ritchie.
“Ultimately, Gestrument can allow a performer or audience member to play along to the music and remain in tune and in time. It can take any input, in this case motion data from the actors, which then generates a particular live melodic line via MIDI. Creatively, we’ve worked with the cast to try to give each character a musical motif, so that their movements at key scenes can carry echoes of the music that is to come further on in the story.”
“Technically, we had the challenge of integrating this into a real time virtual production environment, porting Gestrument into Unreal Engine so that it could integrate with the environments, scenes and characters that we’re building. That also means that going forwards we could create another experience – a game, a VR experience or a show – in which audiences could interact with and alter a score or a live performance.”
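The principle Ritchie describes — arbitrary motion input producing music that stays “in tune and in time” — can be illustrated with a minimal sketch. All names and numbers below are hypothetical (this is not Gestrument’s actual API or Dream’s motif data): continuous motion values are snapped to a character’s motif pitches, and event timestamps are snapped to a beat grid, yielding MIDI-style note events.

```python
# Illustrative sketch only: quantising motion data to a motif (in tune)
# and to a beat grid (in time). Not Gestrument's real implementation.

PUCK_MOTIF = [60, 62, 64, 67, 69, 72]  # example motif as MIDI note numbers

def motion_to_note(height: float, motif=PUCK_MOTIF) -> int:
    """Map a normalised motion value (0.0-1.0) to the nearest motif pitch."""
    h = min(max(height, 0.0), 1.0)          # clamp sensor noise to range
    return motif[round(h * (len(motif) - 1))]

def quantise_time(t: float, beat: float = 0.5) -> float:
    """Snap an event timestamp (seconds) to the nearest beat-grid point."""
    return round(t / beat) * beat

def motion_stream_to_events(samples):
    """Turn (timestamp, height) motion samples into (time, midi_note)
    events, dropping samples that land on an already-used beat."""
    events, last_beat = [], None
    for t, height in samples:
        qt = quantise_time(t)
        if qt != last_beat:
            events.append((qt, motion_to_note(height)))
            last_beat = qt
    return events
```

A real engine would stream such events to a synthesiser or, as in Dream, into Unreal Engine; the point is only that quantising both axes is what keeps free movement musically coherent.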
This is the second time the RSC has motion captured a live performance in Unreal Engine, the first being its 2016 production of The Tempest. Dream will be performed with seven actors in a 7×7 metre motion capture volume, created at the Guildhall in Portsmouth.
So with live events still some way off a return to normality, does the experimental nature of Dream offer a glimmer of fresh hope for musicians and technicians?
“I hope that it does, but I think we’ve also got to be honest about the deep, lasting damage the pandemic will leave across the music industry. I have dear, talented friends who’ve had their lives completely up-ended over the last year. In fact the dates of Dream’s launch will be an anniversary for many of them of the last concert or recording they made. However, I do think that there is definitely hope. I think that this landscape needs musicians and music technicians to engage with it and start to make some amazing new experiences! I’d definitely recommend they try downloading Unreal and having a play.”
So at what point in this extraordinary process did Ritchie think: “What on Earth have I gotten myself into?”
“Several times!” he replies. “The thing that keeps me sane is the fact that we have an incredible team – which I’d love if you could credit. We’ve built a hybrid team comprised of Music Director Esa-Pekka Salonen, Composer and Interactivity Designer Jesper Nordin, Audio Director Anastasia Devana, Sound Designer Alessandro Coronas, Immersive Producer Dan Munslow, Live Audio Producers Phil Jones and Carlotta Piccini, Audio Systems Tech and Playback Operator Dan Halford, Technical Producer Ash Green and of course our incredible Orchestra and its staff.”
Tickets to the 50-minute live online event are available here.