Audio For Broadcast: Outside Broadcast Workflows

Outside broadcast adds layers of complexity to audio workflows. We discuss the many approaches to hybrid remote production and the challenges of integrating temporarily or permanently distributed production teams.

Outside broadcasting is almost as old as inside broadcasting. The first acknowledged BBC Outside Broadcast (OB) covered the coronation procession of King George VI in 1937. In May 2023, 86 years later, the BBC captured similar amounts of flag waving, gold carriages, pomp, crowns and national pride for the coronation of King Charles III using a fleet of OB units. It was a big deal then, and it is a big deal now.

It turns out that escaping the confines of a television studio to cover a live event where it is actually taking place is very liberating. And an outside broadcast unit, which contains everything a television studio uses for live broadcasting, does exactly that.

As live event coverage – especially sports – became bigger and more immersive, traditional OBs also got bigger and more complex, and stuffed with more kit.

Squeezed Out

Sometimes, this is not ideal. In an OB, audio is often a challenge. For starters, a traditional OB truck focuses on video production, with expanding sides creating tiered galleries and banks of VT screens for entire teams of people. In contrast, the sound room is often crammed into a much smaller space, with as many as one audio engineer.

The space has never been ideal for mixing in immersive formats like 5.1 surround or Dolby Atmos, while squeezing a physical console into the available width presents its own challenge.

And audio in an OB is not just about mixing. The audio department is also responsible for managing all the broadcast audio feeds from microphones and line-level sources, which means connectivity is a big part of an OB setup. An event like golf can cover 200 acres of land and use up to 30km of cabling, and setup takes days, if not weeks. The OB is also responsible for establishing communications between onsite personnel and personnel at the broadcast center, as well as backhauling content between the OB and a centralized broadcast hub.

But these things are changing. As connectivity evolves with more efficient protocols and the adoption of technologies like private 5G and mesh networks to take care of transport, audio engineers and network designers are already benefitting from shorter setup times and lower costs.

Moreover, connectivity is not only creating efficiencies onsite; it is away from the venue where the biggest changes are happening.

Connectivity Is Outside Broadcasting’s Best Friend

For many live broadcasts, the traditional OB model just isn’t practical.

No two ways about it, OBs are expensive. It’s not just the electricity needed to run the production, or the time it takes to set up and pack down, or the expense of getting the truck with all its equipment to the venue, or the downtime in between shows when the truck is travelling; it’s also the cost of shipping dozens of crew onsite to manage the production, and the environmental impact of all that. It means that only the biggest events can justify these levels of resource.

For a while in 2020 and 2021, it wasn’t especially safe either. Or legal. Global social distancing meant that broadcasters were unable to staff outside broadcast events, and as world events forced the issue, entirely new ways of working emerged. Long discussed, remote broadcast and distributed working became a necessity rather than a concept, and as more powerful backhaul technologies continue to enable these ways of working, every OB is now assessed on its individual merits.

While many high-profile events, such as the Open Golf or the World Cup, can justify having skilled personnel and equipment onsite, many broadcasters no longer commit to doing things the way they have always done them, and hybrid models where audio processing is distributed across sites are much more common.

But as usual, audio needs to be very carefully handled. And once again, it’s all about the latency.

Remote Production

However you define remote production, it is here to stay, with more Remote Operation Centers (ROCs) being built and OB trucks being designed to support them.

Flexibility is a powerful selling point; there are many ways to approach remote production, and no single solution that works for every scenario. And that’s kind of the point; remote production empowers broadcasters to use whatever resources are necessary to do the job.

Remote production (sometimes referred to as REMI, or At-Home Production) is where content is captured onsite at an event and backhauled to a centralized hub like a ROC for mixing. From an audio perspective there are some immediate and clear benefits. It means that content being mixed in an immersive format can be monitored and mixed in a studio which is acoustically designed to do the job rather than a cramped audio room on the end of a truck.

It means that audio operators can be utilized more efficiently; not only do they no longer have to travel to wherever the event is, but they can work on multiple broadcasts one after another without even leaving their seat. Travel costs are reduced, the impact on the environment is positive, and quality is consistent because broadcasters always have their brightest talents on the job.

Latency

Although the operator isn’t travelling, the signals still have to. That takes time, and that’s why latency is such an issue with all remote production workflows. The challenges are latency for in-ear monitoring, latency for comms, and latency for control.

Onsite talent needs to hear themselves in their earpieces; they also need interruptible foldback (IFB) mix-minus feeds from the studio, not to mention production comms and two-way communication with other hosts on the broadcast. Distance adds latency to all these things.
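To make the mix-minus idea concrete, here is a minimal sketch (in Python, with illustrative source names that are not taken from any real production) of how an IFB feed can be derived: the full programme mix with the host’s own microphone subtracted, so their voice never returns to their earpiece with added delay.

```python
# Minimal sketch of deriving mix-minus (IFB) feeds: each onsite host hears the
# full programme mix minus their own microphone. Source names and levels are
# illustrative only.
import numpy as np

fs = 48000                      # sample rate
n = fs                          # one second of audio per source

sources = {
    "host_mic":      np.random.randn(n) * 0.1,
    "guest_mic":     np.random.randn(n) * 0.1,
    "crowd_effects": np.random.randn(n) * 0.05,
    "studio_vo":     np.random.randn(n) * 0.1,
}

full_mix = sum(sources.values())

def mix_minus(exclude: str) -> np.ndarray:
    """Return the full programme mix with one named source removed."""
    return full_mix - sources[exclude]

host_ifb = mix_minus("host_mic")    # what the host hears in their earpiece
```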

Onsite, the priority is always latency for in-ear monitoring, with latencies over ten milliseconds generally understood to compromise performance. But if those signals are processed onsite, in the same location, the geography (and the issue) goes away.
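As a rough illustration of why that ten millisecond figure pushes IEM processing to the venue, the sketch below adds round-trip fibre propagation (roughly ten microseconds per kilometre) to an assumed few milliseconds of conversion, codec and buffering delay; the buffer figure is an assumption for the example, not a measured value.

```python
# Back-of-envelope sketch of why IEM processing stays onsite. Round-trip
# propagation in fibre is roughly 10 microseconds per kilometre (light travels
# at about two-thirds of c in glass); the codec/buffer figure is illustrative.
FIBRE_RTT_US_PER_KM = 10.0       # ~5 us/km each way
CODEC_AND_BUFFER_MS = 4.0        # assumed conversion, codec and packet buffering
IEM_BUDGET_MS = 10.0             # commonly cited comfort threshold for in-ears

def iem_round_trip_ms(distance_km: float) -> float:
    """Mic -> remote console -> back to the earpiece, in milliseconds."""
    propagation_ms = distance_km * FIBRE_RTT_US_PER_KM / 1000.0
    return propagation_ms + CODEC_AND_BUFFER_MS

for km in (0, 50, 500, 2000):
    rtt = iem_round_trip_ms(km)
    verdict = "OK" if rtt <= IEM_BUDGET_MS else "over budget"
    print(f"{km:>5} km: {rtt:5.1f} ms ({verdict})")
```

Processed locally (the 0 km case), the budget is comfortably met; route the same signals to a distant hub and back, and propagation alone can blow it.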

Living On The Edge

This is called Edge Processing, where processing takes place at the edge of the network. IEM feeds are never intended to go to broadcast, so it makes sense to process that audio on location. It can be done in a variety of ways.

One way is to use a dedicated DSP processing unit which is remotely controlled by the console in the ROC; another is to remotely control the DSP processing engine of a console already onsite in a truck, which is exactly how many companies got around Covid restrictions back in the day.

This approach is so efficient that broadcasters are still doing it, and smaller OB units designed purely for remote connectivity are being built with an audio desk installed to provide disaster recovery control should connectivity go down. It ensures that while all audio processing remains local, embedded audio stems can still be backhauled for manipulation in a centralized production facility.

Control latency – the time it takes to manipulate the signals from the central console – remains an issue, but because we are moving far less data it is much more forgiving than in-ear monitoring. The location of the processing core can have a big impact on it too, and we’ll look at that in more detail when we discuss cloud workflows.
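As a hedged sketch of why control traffic is so forgiving, the example below sends a hypothetical fader-move message from a remote console position to an onsite DSP core. The message format, channel name, address and port are invented for illustration and do not represent any vendor’s protocol; the point is simply that only a handful of bytes cross the WAN while the audio itself stays at the edge.

```python
# Sketch of the edge-processing control path: the ROC console sends only small
# control messages over the WAN while the audio is processed onsite. The JSON
# message format and the core's address are hypothetical, not a vendor API.
import json
import socket

def send_fader_move(sock: socket.socket, addr, channel: str, level_db: float) -> int:
    """Serialize a fader move and send it to the onsite DSP core; returns bytes sent."""
    msg = json.dumps({"type": "fader", "channel": channel, "level_db": level_db})
    return sock.sendto(msg.encode("utf-8"), addr)

if __name__ == "__main__":
    dsp_core = ("192.0.2.10", 9000)          # illustrative onsite core address
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sent = send_fader_move(sock, dsp_core, "host_mic", -3.0)
    # A control message is tens of bytes; uncompressed 48 kHz / 24-bit audio is
    # roughly 144,000 bytes per second per channel, which is why control traffic
    # tolerates distance so much better than audio backhaul.
    print(f"sent {sent} bytes of control data")
```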

Distributed Production

The same principles can be applied to distributed production, another stop-gap workflow which allowed people to continue to produce content while remaining socially distanced in the early 2020s.

Distributed workflows differ from remote production in that they provide control over processing cores from anywhere with a stable internet connection – a kitchen or a garage, a coffee shop or a shoe shop.

Not to be confused with cloud production, these workflows provide control of a processing core over the internet, which is not the same as using the cloud to run microservices or take care of any actual processing. Using physical hardware or virtual consoles which can run on a PC or laptop (also known as “headless mixers”), distributed controllers are able to tap into edge or on-prem processing cores (where “on-prem” refers to processing hardware located at the centralized facility).

The same latency challenges apply, but these working models also enable monitor mixes to be set up wherever the user is, and most web apps also provide assistive features such as automixers that can reduce the burden on the operator.
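As an illustration of the kind of assistive processing involved, below is a minimal gain-sharing automixer sketch: each microphone’s gain is set to its share of the total input energy over a short window, so active talkers dominate the mix and idle mics are pulled down automatically. The window length, signals and constants are illustrative assumptions, not drawn from any particular product.

```python
# Minimal gain-sharing automixer sketch of the sort a web app might offer as an
# assistive feature. Each mic's gain is its share of the total input energy, so
# the gains always sum to unity. Constants and test signals are illustrative.
import numpy as np

def gain_sharing_automix(mics: np.ndarray, fs: int, window_ms: float = 20.0) -> np.ndarray:
    """mics: array of shape (n_mics, n_samples); returns the automixed mono sum."""
    n_mics, n_samples = mics.shape
    hop = int(fs * window_ms / 1000.0)
    out = np.zeros(n_samples)
    for start in range(0, n_samples, hop):
        block = mics[:, start:start + hop]
        power = np.mean(block ** 2, axis=1) + 1e-12      # per-mic energy estimate
        gains = power / power.sum()                       # shares sum to unity
        out[start:start + hop] = (gains[:, None] * block).sum(axis=0)
    return out

if __name__ == "__main__":
    fs = 48000
    talker = np.sin(2 * np.pi * 220 * np.arange(fs) / fs) * 0.5   # active mic
    idle = np.random.randn(fs) * 0.01                              # spill and noise
    mixed = gain_sharing_automix(np.vstack([talker, idle]), fs)
```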

It means operations like mic gain, fader level and signal routing can be managed remotely, while headless mixers can also be used by audio personnel at an event to check whether mics are properly connected prior to broadcast. In this way they also play into remote workflows.

The OB Is Not Dead Yet

Some fear the death of the OB, but even now it isn’t looking likely. Although remote infrastructures have matured hugely in a relatively short space of time, big productions still demand personnel and kit in the shadow of the stadium.

That said, top tier sporting extravaganzas like Formula E and Sail GP are also proving that remote production is more than capable. With more channels to fill, and content providers looking for more online and streamed content to build fan engagement, it has also opened the door to cost-effective coverage of second and third tier events, while keeping production standards high.

Not to mention the positive effect it is having on the environment.

These hybrid models are empowering production. Today, some OB units provide remote connectivity back to a ROC; elsewhere, traditional on-prem hardware still does the heavy lifting; and hybrid models do both, with some production onsite and some remote.

There’s no simple picture, but one thing is certain. Technology is empowering broadcasters to choose the workflow to match the production, rather than the other way around. 
