Audio For Broadcast: I/O & Recording Devices
We explore the basics of physical connectivity and signal management in broadcast audio systems, alongside the recording devices those signals feed.
Television studios are terribly exciting places, full of cool stuff. It’s all very glamorous, all this tech with flashing lights and sleek lines, but none of it functions on its own and the technologies which tie it all together onto the same massive network are fundamental to effective broadcast production.
The cables, the wireless RF and the bundles of copper which keep all these disparate parts in sync may be underappreciated, but they are integral elements of the broadcast infrastructure. It’s no good having a microphone which will capture every breathy nuance if there’s nothing to plug it into.
What’s more, networks are becoming increasingly complicated as broadcasters demand more flexibility while transitioning from traditional SDI (Serial Digital Interface) networks to more convenient IP infrastructures.
In live broadcast environments mic sources just need plugging into something to transport those signals to a mixing console. For other television productions, those signals might need to be recorded as isolated inputs for editing and manipulation further down the line in a DAW (Digital Audio Workstation), or much further down the line in a MAM (Media Asset Management) system. In both cases, those signals need to be managed, catalogued and labelled correctly.
But we’re getting ahead of ourselves.
Let’s Get Physical
In a typical live studio environment there are I/O boxes distributed throughout the building, both permanently installed into walls and in portable racks. They have physical input and output sockets and come in a range of connectivity flavors such as XLR, BNC, D-type, EDAC and Cat5. They may be of a fixed format or made up of modular cards of different formats to allow broadcasters to build exactly what they need.
Inputs and outputs are usually managed on a mixing console where they are labelled and routed appropriately, and the I/O box can be thought of as an extension of the console; for example, if a mic requires phantom power, the I/O box will deliver that power through the XLR cable, so if a condenser mic isn’t working it may be a quick fix to switch on the 48V button on the console.
On a digital console the I/O is traditionally managed using a straightforward matrix which connects inputs and outputs, but modern broadcast facilities have much more crossover between IT and traditional broadcast networks; signal management is changing, and that is influencing how, or even whether, I/O is managed by the console.
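As a rough illustration only, that routing layer can be thought of as a simple crosspoint map of labelled sources assigned to destinations. The Python sketch below is a minimal model of the idea; the socket and channel names are entirely hypothetical and no real console works exactly like this.

```python
# Minimal sketch of a console-style I/O routing matrix.
# All socket IDs, labels and channel names are hypothetical examples.

class RoutingMatrix:
    """Maps physical I/O box sockets to console channels."""

    def __init__(self):
        self.routes = {}   # destination -> source
        self.labels = {}   # socket id -> human-readable label

    def label(self, socket_id, name):
        self.labels[socket_id] = name

    def connect(self, source, destination):
        # A destination listens to one source at a time;
        # a source may feed many destinations.
        self.routes[destination] = source

    def source_of(self, destination):
        return self.routes.get(destination)


matrix = RoutingMatrix()
matrix.label("stagebox1/mic-03", "Presenter lav")
matrix.connect("stagebox1/mic-03", "console/ch-01")
print(matrix.source_of("console/ch-01"))   # stagebox1/mic-03
```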
As more broadcasters shift to IP environments, broadcast engineers need to appreciate how requirements are changing; even where IP isn’t the primary focus there is often some kind of IP connectivity in the mix, such as a Dante network.
While many broadcasters are focused on ST2110 and the encoding and synchronisation of media streams, device discovery and connection management are also key considerations. This sort of thing used to be handled by proprietary I/O systems, often from the console manufacturer, but in this interoperable era of IP there are new requirements.
The Advanced Media Workflow Association (AMWA), working alongside the Joint Task Force on Networked Media (JT-NM), develops the NMOS specifications to address this very challenge: IS-04 allows devices to register and advertise their streams to a controller, while IS-05 manages the connections between devices. As more IP equipment converges and interoperability continues to expand, I/O is increasingly likely to be managed by agnostic stream managers which cover the entire broadcast workflow, irrespective of manufacturer and IP endpoint.
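To make the discovery and connection steps concrete, here is a minimal, hedged sketch of how a controller might talk to an IS-04 registry and an IS-05 connection endpoint. The registry and receiver URLs, the receiver UUID and the API versions are placeholder assumptions, and a real controller would normally also supply the sender’s SDP transport file when staging the connection.

```python
# Hedged sketch: list senders from an IS-04 registry, then stage one onto a
# receiver via IS-05. URLs, IDs and API versions are placeholder assumptions.
import requests

REGISTRY = "http://registry.example.local"        # hypothetical IS-04 registry
RECEIVER_NODE = "http://receiver.example.local"   # hypothetical receiving device
RECEIVER_ID = "d3c9e2f0-0000-0000-0000-000000000000"  # hypothetical receiver UUID

# IS-04 Query API: ask the registry which senders it knows about.
senders = requests.get(f"{REGISTRY}/x-nmos/query/v1.3/senders", timeout=5).json()
sender = senders[0]   # a real controller would filter by flow format, label, etc.

# IS-05 Connection API: stage the chosen sender onto the receiver
# and activate the connection immediately.
patch = {
    "sender_id": sender["id"],
    "master_enable": True,
    "activation": {"mode": "activate_immediate"},
}
resp = requests.patch(
    f"{RECEIVER_NODE}/x-nmos/connection/v1.1/single/receivers/{RECEIVER_ID}/staged",
    json=patch,
    timeout=5,
)
resp.raise_for_status()
print("Connection staged:", resp.json().get("activation"))
```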
The subject of system wide routing and asset sharing is covered later in this series in a dedicated article.
On Location
Outside of a studio environment, there are even more factors to consider. Outside broadcasts have the same audio requirements as studio-based networks. Signals still need to be transported, intercoms still need to connect people and networks still need to be managed in real time.
Away from the studio, geography and weather can get in the way, and distance and power take on even more significance.
Golf is a good example; a major tournament has more than its share of challenges when it comes to I/O. The average UK golf course covers 111 acres, and in the US they are even bigger. With upwards of 50 effects mics distributed across an entire course, cabling can be a challenge, especially as analogue mic-level signals suffer from noise and interference on longer cable runs.
Temporary, ad hoc networks like these also require complex management for RF, especially as more and more sports - including golf - are miking up players with wireless mics to expand the narrative of the event. Tools like mic gain and phantom power, signal management and discovery are still essential, and there are additional considerations for power, security, ruggedness, size and weight.
Down And DAWty
Away from live television, capturing audio for future manipulation, such as in a DAW, introduces additional requirements.
The days when scenes were simply captured by a boom operator onto mono or stereo tracks are long gone. As technology has advanced, television directors have become more creative and audio has become more challenging. Today, programme makers might record a scene using multiple cameras to give them more creative scope in the edit. That’s an issue for the poor sound op, who has to keep a boom out of shot, and so wireless lavalier mics might be more commonplace, or mics might be hidden in the scene to capture the audio.
These all need to be isolated (ISO) as individual tracks, because even though you could mix them down into a single mix for post production, what if one has an issue, falls over, or rubs against some clothing? Reshoots can be costly to reschedule, but ISO tracks can be remixed after the event to cover up any shortcomings.
Much of reality TV works in this way. Reality shows are incredibly popular and are cheap to produce, but the stories are all created in the edit. Because it’s unscripted the production team can’t know what content they will be able to use, which makes recording ISO tracks of every contestant an essential part of the workflow.
Multitrack field mixers and multitrack recorders make this possible and do so in very compact packages, but audio capture is just a small part of what they do. Metadata and timecode are essential to recording television sound of this nature as all the audio must be married to video down the line to ensure tight lip syncing.
Audio recorders and cameras are synced to a common timecode and are timestamped so that when they are imported into a DAW or a non-linear editor the audio and the picture can be snapped together.
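For illustration only, the arithmetic behind that snapping is simple: a timecode stamp maps to a sample offset once the frame rate and sample rate are known. The Python sketch below assumes non-drop-frame timecode and example rates of 25 fps and 48 kHz; real editors and DAWs handle many more cases.

```python
# Hedged sketch: convert a SMPTE timecode stamp into a sample offset so a
# recording can be snapped to picture. Frame rate and sample rate are
# example values; drop-frame timecode is deliberately not handled here.

def timecode_to_samples(tc: str, fps: float = 25.0, sample_rate: int = 48000) -> int:
    hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
    total_seconds = hours * 3600 + minutes * 60 + seconds + frames / fps
    return round(total_seconds * sample_rate)

# An audio file stamped 10:00:30:12 sits this many samples after a
# hypothetical project start of 10:00:00:00.
offset = timecode_to_samples("10:00:30:12") - timecode_to_samples("10:00:00:00")
print(offset)   # 1463040 samples at 25 fps / 48 kHz
```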
Meanwhile, the audio files also contain metadata which the production sound operator adds to each recording so that post-production editors can identify what it actually is, along with descriptive data and workflow information.
The standard audio files which broadcasters use are Broadcast Wave Files (BWF), an EBU-specified format dating back to 1996 which allows files to be exchanged between DAWs in radio and television production (if you are terribly interested, you can read all about it in EBU Tech 3285).
The BWF is a development of a standard WAV format and includes additional extension “chunks” of data which provide context for each recording. A broadcast extension chunk (known as “bext”) contains information on aspects like the title, origination, date and time, while an iXML chunk is an open standard for embedded metadata in production media files which includes scene, take and notes information.
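As an illustration of how little magic is involved, the hedged Python sketch below walks the RIFF chunks of a BWF and pulls out a few of the core bext fields defined in EBU Tech 3285. The field offsets follow the published spec; the file you feed it is, of course, your own, and production tools expose far more than this.

```python
# Hedged sketch: walk the RIFF chunks of a Broadcast Wave File and read core
# "bext" fields (description, originator, date/time, and the 64-bit time
# reference in samples since midnight) as laid out in EBU Tech 3285.
import struct

def read_bext(path: str) -> dict:
    with open(path, "rb") as f:
        riff, _size, wave = struct.unpack("<4sI4s", f.read(12))
        if riff != b"RIFF" or wave != b"WAVE":
            raise ValueError("not a RIFF/WAVE file")
        while True:
            header = f.read(8)
            if len(header) < 8:
                raise ValueError("no bext chunk found")
            chunk_id, chunk_size = struct.unpack("<4sI", header)
            if chunk_id == b"bext":
                data = f.read(chunk_size)
                return {
                    "description": data[0:256].rstrip(b"\x00").decode("ascii", "replace"),
                    "originator": data[256:288].rstrip(b"\x00").decode("ascii", "replace"),
                    "origination_date": data[320:330].decode("ascii", "replace"),
                    "origination_time": data[330:338].decode("ascii", "replace"),
                    "time_reference": struct.unpack("<Q", data[338:346])[0],
                }
            # Chunks are word-aligned: skip the payload plus any pad byte.
            f.seek(chunk_size + (chunk_size & 1), 1)
```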
Irrespective of where content is ingested into a workstation, without this extra information it is largely unusable. Once a template is configured on a DAW such as Pro Tools, it will reliably pull the metadata labels from the tracks, and while this isn’t a particularly creative process, configuration of the workstation is a specialist job which must be done.
In The Field
In addition to black box recorders, mics on set may also feed into a field mixer. Field mixers are small, rugged and lightweight mixers used for bag work, where sound operators either strap all the equipment to themselves for complete mobility, or mount it on a cart to create their own movable workstation.
These portable mixers can share many of the same features as the main broadcast mixer, with comprehensive bussing, automix facilities and top-of-the-range mic preamps. In addition to using internal SSDs and/or SD cards/USB for recording, some can also tap into cellular connectivity to upload deliverables directly to the cloud using a service like Adobe’s Frame.io for even faster collaboration with creative teams.
They often make use of outputs as well as inputs; a sound mixer on location may also have to feed monitor mixes and IFB mixes to various people on set, so multitrack mixers will often have comms and IFB feeds built in. And of course you can be anywhere, so resistance to heat, dust, cold and humidity is important, as is battery efficiency. Remote equipment doesn’t always have the luxury of unlimited hard-wired power like its studio-based counterparts.
Many Ways In And Out
Whether on set or on location, there are many ways a production might design its programme infrastructure, but just as mic choice is totally application-led, connectivity will also be defined by the environment.
But those tielines are no less vital to the entire production. They are the central nervous system of the whole thing. Even if they’re not especially cool.