The 2022 City of Culture festival concluded with a performance created by Nitin Sawhney CBE. Filmed on URSA Broadcast G2 cameras, Ghosts In The Ruins aired on the BBC as an edited broadcast.
For its 60th anniversary, Coventry Cathedral’s historic grounds hosted the final performance. The poignant landmark holds centuries of history within its walls, history that shaped the cathedral’s core values: a determination to build justice and peace, and to learn to live with difference. The performance explored these themes.
oXyFire Media Creative produced the performance coverage, which featured two distinct parts. The first, an intimate performance with musicians emerging from the audience, was backlit by projections of ghosts wandering the grounds. The second, bolder and more extravagant, took place in the ruins.
Nitin Sawhney CBE, recipient of the Ivor Novello Lifetime Achievement Award, was commissioned to create the performance in response to Benjamin Britten’s War Requiem.
oXyFire producer Jay Rozanski worked with Director Rhodri Huw and Camera Supervisor Tony Freeman to navigate the camera plan and the logistics of moving an audience between spaces.
“That was the big question we asked ourselves. How do you take an audience from the cathedral into the ruins, but on a budget? How do you physically move cameras and jibs? We settled on a hybrid model consisting of 12 camera positions. Moving the audience took half an hour, whereas with this setup the crew could move much faster. The sheer size of it all was challenging,” Rozanski explained.
The setup used ten URSA Broadcast G2 cameras with B4 mount Canon CJ lenses, connected by 62km of fibre core and supported by nine OB vehicles.
“Many parts of the show were almost pitch black, so a camera with good low light capability was vital for the production, not only creatively, but also to ensure that it passed the BBC’s technical specifications. We worked with technical manager David Hutchinson and his team, who ensured the cameras worked as we needed them to.”
The performance was recorded in 4K p50, with the final output delivered as 50i for TV. The Blackmagic 12G optical fibre converter was used for SDI output. Although the performance aired as an edited broadcast, the OB setup was geared for a live stream.
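The relationship between the p50 acquisition format and the 50i delivery format can be sketched in a few lines: two progressive frames contribute alternating lines to one interlaced frame, so 50 progressive frames per second become 50 fields, i.e. 25 interlaced frames, per second. This is a toy illustration of the weave, not the production's actual conversion chain.

```python
def weave_fields(frame_a, frame_b):
    """Combine two progressive frames into one interlaced frame:
    frame_a supplies the even lines, frame_b the odd lines."""
    assert len(frame_a) == len(frame_b)
    return [frame_a[i] if i % 2 == 0 else frame_b[i]
            for i in range(len(frame_a))]

# Two toy four-line "frames" captured 1/50 s apart.
f1 = ["a0", "a1", "a2", "a3"]
f2 = ["b0", "b1", "b2", "b3"]

print(weave_fields(f1, f2))  # ['a0', 'b1', 'a2', 'b3']

# 50 progressive frames/s -> 25 interlaced frames/s (still 50 fields/s).
print(50 // 2)  # 25
```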
One kilometre of fibre was custom built on location and breakout boxes were pre-built for efficiency. Redundancies were put in place where possible. However, this wasn’t possible throughout the location, which emphasised the importance of having a reliable workflow that could adapt to any problems mid-shoot.
The crew had three days to rig, which coincided with rehearsals and blocking. The plan involved one camera stationed at the top of the bell tower for a wide angle shot, which required fibre core and rigging rope to be hauled up. “From the top of the tower we could see a void of empty blackness. The light technicians put in a nice purple and white light in that space which looked great through the camera,” Rozanski shared.
“We acquired in 4K using Blackmagic RAW 8:1 variable bitrate, primarily because we could apply a LUT while still future-proofing the project.”
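A back-of-envelope calculation shows why an 8:1 ratio matters at this scale. The figures below are illustrative assumptions (UHD resolution, 12 bits per photosite of Bayer sensor data, no container overhead), not Blackmagic's published data rates.

```python
# Rough estimate of a Blackmagic RAW 8:1 data rate for UHD p50.
# Assumptions (illustrative, not official figures): 12-bit Bayer data,
# no container overhead, nominal 8:1 ratio throughout.
width, height = 3840, 2160
bits_per_photosite = 12
fps = 50
compression_ratio = 8

uncompressed_bps = width * height * bits_per_photosite * fps
compressed_bps = uncompressed_bps / compression_ratio

print(f"uncompressed: {uncompressed_bps / 1e9:.2f} Gb/s")   # ~4.98 Gb/s
print(f"8:1 estimate: {compressed_bps / 1e6:.0f} Mb/s "
      f"(~{compressed_bps / 8 / 1e6:.0f} MB/s per camera)")
```

In practice variable bitrate means the ratio fluctuates with scene complexity, so this is an upper-bound sketch rather than a measured figure.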
Timecode reference was sent from an ATEM Constellation 8K to sync all of the camera feeds, with Rozanski describing the production switcher as the ‘brains’ of the operation. “We chose the 8K due to its integrated talkback and camera control ability,” he explained.
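The timecode that keeps those feeds in sync is simple arithmetic. The sketch below (not the ATEM's implementation) converts a frame count into the 25 fps non-drop-frame timecode used for 50i European broadcast.

```python
def frames_to_timecode(total_frames, fps=25):
    """Convert a frame count to HH:MM:SS:FF timecode.
    Assumes 25 fps non-drop-frame, as used for 50i delivery."""
    frames = total_frames % fps
    seconds = (total_frames // fps) % 60
    minutes = (total_frames // (fps * 60)) % 60
    hours = total_frames // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

print(frames_to_timecode(0))            # 00:00:00:00
print(frames_to_timecode(25 * 90 + 7))  # 00:01:30:07
```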
Teamed with a Smart Videohub 40x40 12G, the OB used a StreamDeck with BitFocus Companion for control and HyperDeck recorders. “Rhodri wanted to use a switcher with as few buttons as possible so we opted for the ATEM 1 M/E Advanced Panel and created a Companion profile for the indoor location and one for outdoor.”
“This changed everything from switcher inputs to HyperDecks, to multiviews and of course the button mapping on the advanced panel. This ensured the director only had the 6 or 7 cameras per location he needed to direct at any given time.”
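The idea behind the two profiles can be illustrated with a small sketch. The camera names and mapping below are hypothetical, and this is not the actual Bitfocus Companion profile format: it only shows how the same panel buttons can resolve to different switcher sources depending on the active location profile.

```python
# Hypothetical per-location control mapping: one physical panel,
# two profiles, so the director only sees the cameras for the
# location currently in use. (Illustrative data, not a real
# Companion profile.)
PROFILES = {
    "indoor":  {1: "CAM 1 (nave)", 2: "CAM 2 (altar)", 3: "CAM 3 (jib)"},
    "outdoor": {1: "CAM 7 (ruins wide)", 2: "CAM 8 (bell tower)", 3: "CAM 9 (handheld)"},
}

def source_for_button(profile, button):
    """Resolve a panel button to a switcher source for the active profile."""
    return PROFILES[profile][button]

print(source_for_button("indoor", 1))   # CAM 1 (nave)
print(source_for_button("outdoor", 1))  # CAM 7 (ruins wide)
```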
A Teranex Mini SDI to HDMI 12G unit was used to deliver LTC to the audio truck.
Rozanski continued, “We could record and sync the audio, which was important for the mix-out given that it was for Nitin Sawhney. Having a circular sound system worked well, and we were happy with the sound mix.”
“Working with Sawhney was quite unique – he worked with the local community using local talent, so the piece was built quite differently. It was special,” he concluded.
The performance was broadcast on BBC Four and received an RTS Midlands Award nomination in the Specialist Factual category.