Creative Audio - Dolby Atmos With John Hunter

John Hunter holds a unique distinction among broadcast audio mixers: he was the A1 for the Raptors vs. Pelicans NBA game in November 2018, one of the very first sports events to be produced in immersive Dolby Atmos sound with 4K HDR picture for live distribution to homes across North America.


At the time, Hunter was full-time with Maple Leafs Sports and Entertainment (MLSE), the company that owns the NBA’s only Canadian team, the Toronto Raptors, as well as the NHL’s Toronto Maple Leafs ice hockey team and MLS’s Toronto FC soccer team. After 10 years with MLSE, eager to expand his horizons, he went freelance in 2019 to work as an A1 with Dome Productions.

“Since then I’ve been exposed to so many different shows,” says Hunter. In addition to still working on MLSE games, he handled the immersive audio mix for the final third of the equestrian cross-country course at the 2020 Tokyo Olympics and has also mixed tennis. “They mix tennis differently than a lot of other sports,” he says.

The 2018 Toronto Raptors-New Orleans Pelicans game at Toronto’s Scotiabank Arena leveraged the technological capabilities of Dome Productions, which had been involved in the first live 4K sports broadcast two years earlier. “It was definitely a learning curve for me, because all of our broadcasts had been in stereo before,” Hunter says. “But I’d done some training on 5.1 and some studio work in 5.1, so I wasn’t completely unfamiliar with it.”

Mike Babbitt, solutions engineering director for Dolby Laboratories, and Andrew Roundy, a Dolby staff live audio production engineer at the time, were on hand to help, Hunter reports. “They helped me get the console set up so that we could pass 5.1.4. We had to do some workarounds just to get the 5.1.4 monitoring set up correctly. With stereo, it’s pretty easy to just have two speakers at the same level. When you have 10 speakers it’s a bit more of a challenge. Mike was great, helping to align all the speakers so that they were at a good level.”
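
For context, 5.1.4 denotes a 5.1 speaker bed plus four overhead speakers. Below is a minimal sketch of that monitoring layout using common channel labels; the labels are assumptions for illustration rather than the truck’s actual naming.

```python
# Illustrative 5.1.4 monitoring layout: a 5.1 bed plus four height speakers.
# Channel labels follow common practice; they are not taken from the show itself.
ATMOS_5_1_4 = {
    "bed":     ["L", "R", "C", "LFE", "Ls", "Rs"],  # floor layer (5.1)
    "heights": ["Ltf", "Rtf", "Ltr", "Rtr"],        # top front / top rear pairs
}

# The "10 speakers" Hunter mentions: five full-range bed speakers, the LFE/sub,
# and four overheads, all of which have to be level-aligned for monitoring.
assert sum(len(chs) for chs in ATMOS_5_1_4.values()) == 10
```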

Creatively, Hunter had to think outside the box to deliver a new immersive experience to the subscribers tuning into the game. For instance, “We put ambience or crowd mics in new locations,” he recalls, adding four independent microphones set up as two stereo pairs to send to the overhead speakers. Those supplemented his usual four-mic setup — two crowd mics on the back of the baskets and two crowd mics from the game camera perspective — providing the center and width of the image.

“I like to take the approach that one microphone should try to capture one area and not have too many microphones overlapping, otherwise you get a lot of phasing,” Hunter says. “I’m really aggressive with one microphone at a time. I have several microphones on the court and I almost have all the other ones turned down, narrowly following the play. I find it cuts down on the echoing and phasing.

“You can get some really great sounds,” he continues. “If it’s a dunk, for example, and you just have the net mic open with the crowd, the dunk is so loud, and the crowd is erupting based on the sound. But of course, that’s all an illusion. If you’re at the game, you don’t hear the ball get dunked. So it’s sort of augmented or enhanced reality.”

At Babbitt’s suggestion, he says, he inserted some of the direct PA sound into the overhead speakers. “It has two effects,” Hunter explains. “First, it sounds like you are in the arena and you’re hearing the announcer in the PA above you.”

It also makes use of the precedence effect, or law of the first wavefront, whereby a sound is perceived as coming from the direction of the first wavefront to reach the ear, even when echoes or reverberation follow. “It has a psychoacoustic effect. It draws your ear away from all the reflections that are being picked up in the ambient and court mics and it cleans up your mix. So that was a really good tip,” he says. Hunter adds that he also rolls off his mics below 80 Hz to reduce interference from the PA.
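
A minimal sketch of that kind of low-frequency roll-off, assuming an ordinary high-pass filter at 80 Hz on a single mic channel; the filter order and topology are illustrative guesses, since the article doesn’t specify the console’s own filter.

```python
# Hypothetical sketch of rolling off a mic below ~80 Hz to keep PA low end
# out of the mix; parameter choices are assumptions, not the show's settings.
import numpy as np
from scipy.signal import butter, sosfilt

FS = 48000         # assumed broadcast sample rate
CUTOFF_HZ = 80.0   # roll-off point mentioned in the article

# 4th-order Butterworth high-pass, as second-order sections for stability
sos = butter(4, CUTOFF_HZ, btype="highpass", fs=FS, output="sos")

def roll_off_pa_rumble(mic_samples: np.ndarray) -> np.ndarray:
    """Attenuate content below roughly 80 Hz on one mic channel."""
    return sosfilt(sos, mic_samples)

# Example: one second of placeholder mic audio
mic = np.random.randn(FS).astype(np.float32)
filtered = roll_off_pa_rumble(mic)
```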

John Hunter mixing Dolby Atmos.

Hunter has become good friends with Chris Phillips, the Raptors’ broadcast director, over the years. For later Atmos shows, he recalls, “Chris said, ‘I’ll let you know whenever I shoot a free throw with a handheld camera,’ which is underneath the basket. I had that net microphone on a fader, so whenever we went to that camera for a free throw, I could turn up that fader, so it only went to the overhead speaker. It sounded like the ball was going over your head, as though you were underneath the basket.”

Roundy offered another cool Atmos trick, Hunter recalls, to give some movement to the “swoosh” sound effect accompanying a replay. “He said you can double up that signal and delay it and roll off some of the high end, then route it to the rear speakers and the overhead rear speakers. You hear it through your front speakers first, then it goes around your head on a delay.”
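
A rough sketch of that idea, assuming a simple delay plus a gentle low-pass filter feeding the rear channels; the delay time and roll-off frequency here are illustrative values, not the settings used on air.

```python
# Hypothetical version of the replay-"swoosh" trick: duplicate the effect,
# delay it and darken it, then route that copy to the rear and overhead rear
# speakers so the sound appears to travel from front to back.
import numpy as np
from scipy.signal import butter, sosfilt

FS = 48000
DELAY_MS = 60       # assumed delay, long enough to hear front-to-back movement
LOWPASS_HZ = 6000   # assumed point for "roll off some of the high end"

sos_lp = butter(2, LOWPASS_HZ, btype="lowpass", fs=FS, output="sos")

def rear_copy(swoosh: np.ndarray) -> np.ndarray:
    """Delayed, darker copy of the front-speaker effect for the rear channels."""
    delay_samples = int(FS * DELAY_MS / 1000)
    delayed = np.concatenate([np.zeros(delay_samples, dtype=swoosh.dtype), swoosh])
    return sosfilt(sos_lp, delayed)

front = np.random.randn(FS // 2).astype(np.float32)  # placeholder swoosh effect
rears = rear_copy(front)  # feed to Ls/Rs and the overhead rear pair
```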

Hunter had understood that the Dolby Atmos mix for the Raptors-Pelicans game was initially going to be a test for the TV distribution partner (the stereo show was distributed by another partner). “Someone was going to sit in one of their studios and QC it. But there was a miscommunication and Mike [Babbitt] came running into the truck during the first quarter and said, ‘They’re putting this out on air!’ He jokes that I went from stereo to Atmos in 60 seconds. They were really happy with that mix and it was a really great learning experience for me.”

Hunter went on to mix about half a dozen more games in Atmos, he says, the last of them in 2019, just prior to the global pandemic lockdown. The trucks Dome provided for the shows housed different console models, which meant that he might have more faders under his hands at one game than at others. A couple of the trucks had additional software DSP integration, too. “So some trucks had a bit more flexibility,” he says. The DSP was really helpful: “I could take my mono camera microphone and make it stereo, or upmix my stereo group to 5.1. Or we’d do a downmix from 5.1 to stereo, because our show was still going out in stereo in Canada.”
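
As an illustration of the fold-down step, here is a minimal 5.1-to-stereo downmix using the common -3 dB coefficients for the centre and surround channels; the truck’s DSP and its exact coefficients aren’t specified in the article.

```python
# Sketch of a conventional 5.1-to-stereo fold-down (assumed coefficients):
# centre and surrounds are mixed in at -3 dB, and the LFE is commonly dropped.
import numpy as np

def downmix_51_to_stereo(L, R, C, LFE, Ls, Rs):
    """Fold a 5.1 bed down to stereo; inputs are equal-length sample arrays."""
    g = 10 ** (-3.0 / 20.0)  # -3 dB, roughly 0.707
    # LFE is intentionally discarded in this fold-down
    left = L + g * C + g * Ls
    right = R + g * C + g * Rs
    # Crude safety guard for the sketch: keep the fold-down from clipping
    peak = max(np.max(np.abs(left)), np.max(np.abs(right)), 1.0)
    return left / peak, right / peak
```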

One thing to watch when mixing in Dolby Atmos, he says, is the loudness of the simultaneous 5.1 and stereo feeds going to the vast majority of viewers. “Because you’re folding the mix back down to stereo, you really have to watch that you’re not overloading it. Your mix could sound great in the truck, but then if you’re pushing past -23 or -24 LKFS, the limiters downstream could be crushing your mix. I’ve heard that, where there’s a big play and my mix was just squashed.”
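
A small sketch of that kind of loudness check, using the third-party pyloudnorm library for an ITU-R BS.1770 integrated measurement; the -24 LKFS target and the placeholder signal are assumptions for illustration, not values from the production.

```python
# Hypothetical loudness sanity check on a stereo fold-down before it hits
# downstream limiters; requires numpy and the pyloudnorm package.
import numpy as np
import pyloudnorm as pyln

TARGET_LKFS = -24.0  # common North American broadcast target (assumption)
FS = 48000

# Placeholder stereo programme: two seconds of low-level noise standing in for the mix
mix = 0.05 * np.random.randn(FS * 2, 2)

meter = pyln.Meter(FS)                     # BS.1770 K-weighted meter
loudness = meter.integrated_loudness(mix)  # integrated loudness in LKFS/LUFS

if loudness > TARGET_LKFS:
    print(f"{loudness - TARGET_LKFS:.1f} LU over target; downstream limiters may squash the mix")
else:
    print(f"Integrated loudness of {loudness:.1f} LKFS is at or under target")
```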

In 2019, Scotiabank Arena, which is home to the Raptors, installed a couple of array microphones (a type more typically found in corporate boardrooms) below the venue’s video display, about 40 feet above the court, to deliver a superior audio experience to premium suite and club ticketholders. “The last Atmos game I mixed, I had access to those and placed them in my overheads. They’re tuned to the action on the court, so they made the Atmos mix that much better,” he says.

When working in Atmos, Hunter says, he typically monitors the immersive mix 80% of the time.

“We’re trying to push the boundaries a bit, so that’s my focus. I know that as long as the announcers are balanced, and the effects and the crowd are sounding good, then my stereo mix is going to be pretty good, especially if I’ve dialed in my EQ and dynamics. I know where the levels need to be.”

That confidence stems from his early years behind a live sound console. “Before I started in sports, I would mix live music. I used to mix at an outdoor venue that did a lot of jazz and Latin jazz, carving out each signal from each instrument. You could hear every instrument and the vocal would sit just on top. I think that really helped me in my broadcast career, because I don’t want to cram as many sounds into the mix as possible; I want to make it a pleasurable, aesthetically pleasing experience for the listener.”

Ultimately, technology is just a tool, Hunter says. “I’m not excited by technology in and of itself. I’m a musician originally and I approach it like I’m playing or performing a song. It’s more about how this technology can be used to excite the fans at home. How can we deliver a more immersive experience?”

Video isn’t the entire experience of a show, he also points out. “While the images on the screen can be truly captivating, they don’t tell the whole story. It’s like watching a game in a bar with the sound off. I really think audio is so important to the emotion of the game.”
