Monitoring For Next Gen Audio Under Examination At NAB 2019 BEIT Conference

Monitoring is already recognised as a crucial issue in the implementation and operation of Next Gen Television systems. While the vision stream poses several major challenges for checking and quality control (QC) technology, sound could potentially be even trickier due to the different aspects of Next Generation Audio (NGA), which forms a major part of ATSC 3.0.

Immersive sound is the feature of NGA that most people would think of immediately, but the format also covers alternative languages and commentaries plus a high degree of personalisation and interactivity. All of these need to be monitored, which is already a challenge because the emerging systems are object-based rather than purely channel-based like stereo and 5.1.

These issues will be discussed at the 2019 NAB Show during the Broadcast Engineering and Information Technology (BEIT) conference session An Objective Guide to Audio Monitoring for Next Gen TV. The speakers, John Schur, president of the TV Solutions Group at the Telos Alliance, and Jim Starzynski, director and principal audio engineer at NBCUniversal, will present a brief overview of the latest developments before going on to examine specifics such as the space and acoustic limitations that can affect monitoring in OB trucks, channel restrictions on mixing consoles and different consumer platforms, including listening on headphones.

The ATSC 3.0 standard includes two audio codecs: MPEG-H 3D Audio and Dolby AC-4. Both are capable of delivering immersive sound plus a range of personalisation and interactive features. South Korea was the first country to launch an Ultra HD broadcast service conforming to ATSC 3.0, with MPEG-H 3D Audio. John Schur comments that trials of these technologies are also taking place in the US, where consumers and broadcasters alike are recognising their potential. "Some of the features of NGA are ones consumers are interested in," he says. "So there is motivation for broadcasters to adopt the technologies."

Interactivity is a feature of particular interest for a great many people, especially when it offers greater accessibility and practical benefits. "A specific use case is allowing hearing impaired or aging listeners and viewers to dial up the dialogue on a programme and bring down the effects," Schur says. "There is also the ability to select different commentaries for a sporting event. Broadcasters are able to monitor different playback devices and listening environments, such as headphone virtualisation."
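To make the idea concrete, the sketch below shows how such a preset might work in an object-based system: every audio object carries its own gain, so "dialogue up, effects down" is just a per-object gain offset applied before the final render. The classes and preset names are illustrative assumptions for this article, not part of the MPEG-H or AC-4 toolchains.

```python
# Illustrative sketch only: a toy object-based personalisation preset.
# None of these names come from MPEG-H or AC-4; they exist to show the idea.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class AudioObject:
    name: str              # e.g. "dialogue", "effects", "commentary_alt"
    samples: List[float]   # PCM samples for this object
    gain_db: float = 0.0   # gain carried in the object's metadata

# A personalisation preset maps object names to gain offsets in dB.
DIALOGUE_BOOST: Dict[str, float] = {"dialogue": +6.0, "effects": -6.0}

def apply_preset(objects: List[AudioObject], preset: Dict[str, float]) -> None:
    """Adjust each object's gain in place according to the listener's preset."""
    for obj in objects:
        obj.gain_db += preset.get(obj.name, 0.0)

def render_mono(objects: List[AudioObject]) -> List[float]:
    """Naive mono render: sum every object with its gain applied."""
    length = max(len(o.samples) for o in objects)
    mix = [0.0] * length
    for obj in objects:
        linear = 10 ** (obj.gain_db / 20.0)
        for i, s in enumerate(obj.samples):
            mix[i] += s * linear
    return mix
```

Calling apply_preset(objects, DIALOGUE_BOOST) before render_mono is all the personalisation amounts to at this level of abstraction; the difficulty the session addresses is verifying every combination a viewer could choose.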

Jim Starzynski, director and principal audio engineer at NBCUniversal.

Much of this is facilitated by the use of object-based audio and metadata, which are both key parts of the new breed of codecs and standards for Next Gen TV and NGA. In addition to checking the MPEG-H and AC-4 streams, Schur says operators also have to accommodate established audio systems. "We need to work out how to deal with monitoring and QC for traditional stereo and 5.1, which have been the primary broadcast formats for some time," he explains.
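One familiar piece of that legacy path can be sketched directly: folding a 5.1 feed down to stereo for a two-channel monitoring or QC leg. The coefficients below follow common ITU-R BS.775-style practice (centre and surrounds attenuated by about 3 dB, LFE discarded), but individual plants may use other values, so treat the numbers as assumptions rather than a mandated downmix.

```python
# A minimal, assumed 5.1-to-stereo fold-down for a monitoring/QC path.
import math

ATT = 1.0 / math.sqrt(2.0)  # roughly -3 dB

def downmix_51_to_stereo(left, right, centre, lfe, ls, rs):
    """Fold a 5.1 feed (one list of samples per channel) down to a stereo pair.

    Centre and surrounds are attenuated by ~3 dB; the LFE channel is discarded,
    which is common practice for a two-channel monitor leg.
    """
    lo = [l + ATT * c + ATT * s for l, c, s in zip(left, centre, ls)]
    ro = [r + ATT * c + ATT * s for r, c, s in zip(right, centre, rs)]
    return lo, ro
```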

Schur observes that broadcasters have been limited in how they put together monitoring systems and tools for QC, with a long-standing reliance on technicians using their ears, alongside the available technology, to check outputs for specific faults. "But now there are alternative languages, audio description and a need for tools to test for loudness compliance, silence detection and clicks and pops," he says. "NGA adds several new dimensions for monitoring and QC; just listening is not going to be enough any more. There are too many different ways the consumer can experience audio and it won't be possible [for technicians] to listen to all of those."
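One of those automated tests is simple enough to picture in code. The sketch below is a minimal silence detector over a block of PCM samples; the threshold and minimum duration are illustrative assumptions, and a real QC tool would run this alongside BS.1770-style loudness measurement and click/pop detection rather than instead of them.

```python
# A minimal, assumed silence-detection pass over a mono PCM buffer.
def detect_silence(samples, sample_rate, threshold_db=-60.0, min_duration_s=3.0):
    """Return (start_s, end_s) pairs for runs quieter than threshold_db."""
    threshold = 10 ** (threshold_db / 20.0)
    min_run = int(min_duration_s * sample_rate)
    runs, run_start = [], None
    for i, s in enumerate(samples):
        if abs(s) < threshold:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_run:
                runs.append((run_start / sample_rate, i / sample_rate))
            run_start = None
    if run_start is not None and len(samples) - run_start >= min_run:
        runs.append((run_start / sample_rate, len(samples) / sample_rate))
    return runs
```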

This, Schur says, is creating a need for "new and smarter" monitoring and QC tools to compensate for the reality that a human operator is now "not able to cover all the bases NGA has to offer". The different aspects involved in broadcasting true object-based audio, he continues, can take people by surprise and have not always been thought through. Because of this, techniques are being "borrowed" from other technology areas that already demand a high degree of analysis due to their make-up.

"What we're talking about is AI, rule-based QC for any area that has a large amount of customisation and user interactivity, such as gaming," Schur explains. Modern video games involve multiple story paths and a variety of resolutions, all of which are available for selection as play continues. Broadcasters and programme producers are now considering this approach for non-linear TV platforms, as recently demonstrated by Netflix's Black Mirror: Bandersnatch interactive episode. Schur says broadcasting can learn from the monitoring and checking methodologies used to create a video game and apply them to TV production and distribution.

Not that Schur thinks a software-only approach will fully supplant trained human operators carrying out QC processes: "There will probably be a combination of an operator and operator-assist tools. It's still very early days for NGA but ATSC streams are due to come on air later this year. Consumer NGA devices will be available next year, though only with a limited set of functionality. We do have to start thinking about monitoring because there does seem to be some momentum building and we'll probably start seeing more NGA features within three to five years."

An Objective Guide to Audio Monitoring for Next Gen TV takes place during the 2019 NAB Show on Tuesday 9 April from 2.30 to 2.50pm in Room N260.

John Schur, president of the TV Solutions Group at the Telos Alliance.

Want to know more about this year's BEIT Conference? Click the link here to see the official schedule along with a snippet of information about each presentation.

Would a free exhibit pass help? Click this link and enter the code MP01 when requested.
