Remote Management And Listening

So far in this series of articles the focus has been on the underlying requirements for successful audio over IP connectivity within a broadcast facility. This connectivity can be considered local: distances are short, so latency is low and high signal bandwidth is not a problem. In this situation the broadcaster has complete control over the network, including switch configuration and the routing of traffic across it, so signal bandwidth and latency can be guaranteed.

Studio-based production is just one area that can benefit from IP connectivity, and this article looks at the audio monitoring requirements of the burgeoning world of remote production.

Live coverage of events outside the television studio – typically news and sport – has been in place for as long as television broadcasting itself. Outside broadcasts (OBs) were based around a mobile facility (the ubiquitous OB truck) providing all of the familiar equipment that would be found in a traditional studio, just with less room to move. As new connection technologies drive up the speed and bandwidth available over IP and cellular links, the ways in which live event productions are created have broadened, and remote production has become increasingly viable. There are a few different approaches to remote production, each with its own technical and cost considerations. But whichever approach is chosen, the operational staff involved in the production still need a way to choose and control what they are listening to.

Remote Production Models

Key remote production models include centralized (where the capturing equipment is at the remote venue and processing equipment is in the studio), remote controlled (where the capturing and processing equipment is at the remote venue but is controlled remotely from a studio) and distributed (where the capturing, processing and control could be spread across a number of physical locations).

Audio monitors typically receive many audio signals simultaneously and present them to the operator, who selects just those they need to hear. The way in which the signals are delivered to an audio monitor will differ depending on the remote production model being used.

For centralized remote productions, all of the audio signals will be transported to the studio and can be presented to the audio monitor in the ‘usual manner’. Even when using compressed signal formats, there will be significant bandwidth requirements to transport these individual signals between the outside location and the studio, in addition to the latency introduced by the distance and by the compression itself. But if these challenges are surmountable, the audio monitoring requirements can be managed just as they are in a traditional studio, with individual hardware units fed multiple signals from the centralized, high-bandwidth network within the studio facility that receives the outside signals.
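As a rough illustration of the scale involved, the short sketch below estimates the uncompressed bandwidth needed to carry every individual signal back to the studio. The channel count, sample rate and bit depth are assumptions chosen for the example, not figures from any particular production.

```python
# Back-of-the-envelope estimate of the uncompressed bandwidth needed to carry
# every individual audio signal from the venue to the studio.
# Channel count, sample rate and bit depth are illustrative assumptions only.

def pcm_bandwidth_mbps(channels: int, sample_rate_hz: int = 48_000, bit_depth: int = 24) -> float:
    """Raw PCM payload bandwidth in megabits per second (no transport overhead)."""
    return channels * sample_rate_hz * bit_depth / 1_000_000

# e.g. a 64-channel contribution feed at 48 kHz / 24-bit
print(f"{pcm_bandwidth_mbps(64):.1f} Mbit/s")  # ~73.7 Mbit/s before IP headers, FEC or redundancy
```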

For the remote controlled and distributed models there will be operational staff who are not in the same location as the bulk of the signal processing. The connection between the operator’s position and the signal processing is likely to be lower bandwidth and higher latency, meaning that it’s not so easy to send all the individual audio streams to an audio monitor in the operational position.

There are solutions that enable the transmission of high channel count audio across the internet, such as Unity Connect from Unity Intercom. This means it is possible for a remote operator to work from home with an audio monitor that has suitable connectivity (such as Dante or MADI) receiving all the necessary production audio signals over the internet. This approach works and is in use by several broadcasters today. But is this the most efficient way to provide remote audio monitoring?

Controlling The Process

It is important to consider what the operator needs to hear when using an audio monitor: they only need to hear the output of the unit. Typically, this will be stereo, delivered through the internal speakers, connected headphones or external loudspeakers. If only the output of the audio monitor needs to be delivered over the low-bandwidth, high-latency connection, implementing a remote position becomes much easier. An audio codec could deliver the audio at a very low bitrate, with relatively low latency and audio quality that is perfectly good enough for monitoring purposes. Given the modest bandwidth of a stereo digital audio signal, it is also entirely feasible for this to be a lossless transmission.
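To put numbers on that, the sketch below compares a single stereo monitor output – uncompressed and at an assumed low-latency codec bitrate – against the full multichannel feed from the earlier example. All figures are illustrative assumptions.

```python
# Comparing the connection needed for one stereo monitor output against the
# full contribution feed from the earlier example. All figures are illustrative.

STEREO_PCM_MBPS = 2 * 48_000 * 24 / 1_000_000   # ~2.3 Mbit/s uncompressed stereo at 48 kHz / 24-bit
CODEC_KBPS = 128                                 # an assumed bitrate for a coded stereo return feed
FULL_FEED_MBPS = 64 * 48_000 * 24 / 1_000_000    # the 64-channel example used earlier (~73.7 Mbit/s)

print(f"Uncompressed stereo output: {STEREO_PCM_MBPS:.1f} Mbit/s")
print(f"Coded stereo output:        {CODEC_KBPS / 1000:.3f} Mbit/s")
print(f"Full multichannel feed:     {FULL_FEED_MBPS:.1f} Mbit/s")
```

Even lossless stereo is a small fraction of the full feed, which is why an output-only return sits comfortably on a domestic internet connection.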

If only the output of the audio monitor is being sent to the remote position, the unit providing the processing can be located with the bulk of the processing equipment, where it can easily receive the full gamut of high-quality, low-latency signals. It then becomes a question of control – how is the operator going to control the audio monitor remotely?

Audio monitors such as TSL’s MPA1 range allow remote control of all operational aspects, either through a web interface or an external control protocol. In the distributed production example, an operator could be at home and just use a pair of headphones or speakers to listen to the low-bandwidth output of the audio monitor delivered via a codec, with the audio monitor physically located with the rest of the processing equipment. The operator can use the web interface to control the selection of what they are hearing and to view the audio metering data.
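As a purely illustrative sketch of what external control can look like, the example below selects a monitoring source over a simple HTTP/JSON interface. The host, path and field names are hypothetical and do not describe the MPA1's actual control protocol.

```python
# Hypothetical example only: routing a source to the stereo output of a
# remotely located audio monitor over an assumed HTTP/JSON control interface.
# The host, path and JSON fields are invented for illustration.

import json
from urllib.request import Request, urlopen

MONITOR_HOST = "http://audio-monitor.example.local"  # assumed address of the unit in the central facility

def select_source(channel_pair: int, label: str) -> None:
    """Ask the monitor to route the given input pair to its stereo output."""
    payload = json.dumps({"output": "stereo_mix", "source": channel_pair, "label": label}).encode()
    req = Request(f"{MONITOR_HOST}/api/v1/select", data=payload,
                  headers={"Content-Type": "application/json"}, method="POST")
    with urlopen(req, timeout=2) as resp:
        print(f"Monitor replied: {resp.status}")

# An operator at home, hearing only the codec return, switches to the commentary feed:
select_source(channel_pair=3, label="Commentary")
```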

Taking this a step further, it would be possible to provide the remote operator with a physical control interface. This could be a dedicated interface – identical to that found on the front panel of an audio monitor – delivering the key functions of speakers, audio metering and dedicated hardware controls, without requiring the full signal processing back-end. Or, if the operational requirements are simple enough, basic control functionality such as source selection could be mapped to a shared control panel.

With separated processing and control it becomes possible to pool the processing for multiple audio monitors being used on the same production, reducing the physical installation requirements and centralizing management of units. The control of the audio monitors could still be tailored to every position – some with dedicated physical hardware, others with shared hardware or software control. The control requirements could even be changed on a production-by-production basis.

Conclusion

As technology develops, this processing is likely to move to shared resources, whether dedicated on-premises servers or the cloud. As broadcasters continue to look for flexibility and value from their investments, we will see increased interest in the pooling of resources and in design flexibility.

IP connectivity has provided the technical backbone for remote production to become a viable solution for many, and the flexibility and efficiencies it offers mean that it is unlikely to go away. These different approaches allow broadcasters to scale productions and facilities as and when appropriate. Regardless of the route taken, audio monitoring remains a vital function in broadcast production.