NEP Productions is one of the companies that has remained flexible with its workflows to accommodate changing safety conditions.
Using lessons learned from its bubble experience during the 2020 pandemic season, the National Basketball Association (NBA) continues to reimagine how it broadcasts games to its viewers while also carefully returning to traditional production workflows that have served its TV coverage well. However, there’s no doubt that various REMI methods are allowing teams to produce NBA content with fewer production personnel and fewer technical assets located onsite.
COVID protocols established two years ago also remain in place, with frequent testing and vaccination mandates. Production teams have implemented several extra precautionary layers on-site, including assigned seating in mobile units and workspaces/zones that ensure proper distancing and keep people in specific working areas; improved air filtration; deep cleaning of all work surfaces and equipment; packaged and/or plated catering; proper transportation to/from hotels and the site; and much more.
The NBA is aired on the ABC broadcast network as well as cable outlets ESPN and TNT. Mobile production companies helping to bring the action to air include F&F Productions and NEP Productions US, using a 1080p HDR workflow.
Developing A Lasting Road Map
In some respects, ESPN still uses the two primary production workflows that helped it get through the worst of production-related times: an Enhanced World Feed (EWF) and Graphics REMote Integration (GREMI).
Graphical elements like the virtual Shot Clock on the floor were developed during the lockdown years and even before.
The EWF model relies heavily on collaboration and communication with all 30 NBA Regional Sports Networks (RSNs). ESPN takes the host RSN’s “clean” feed and adds various layers of production on top of the feed, including commentary, graphics and camera angles.
“We worked closely with every RSN heading into the season to learn what they’re able to provide, to communicate our expectations, and to understand each other’s goals and technical capabilities,” said Chris Calcinari, ESPN Senior Vice President, Remote Operations.
For EWF broadcasts, ESPN's full production team (including the game producer, director, audio personnel, and the graphics and replay teams) is located at ESPN's Bristol, Conn., or Charlotte, N.C., studios, all fully masked and socially distanced.
The GREMI Model Is Now REMCO
The GREMI model is one ESPN has used for several years and really jump-started the concept of remotely producing a live sports game. It’s an evolution of ESPN’s REMI model and features a combination of on-site and remote (or studio) production elements. For a GREMI broadcast, the ESPN graphics team, along with its Enhanced Video Support (EVS) crew, contributes to the telecasts from its Bristol or Charlotte studios by controlling hardware that sits in the on-site mobile units, while the bulk of the production team and the commentators are generally working at the arena. This includes the game producer and director.
“These two primary production methods have served us well thus far, and we are constantly evolving our approach,” said Calcinari, adding that they are regularly implementing lessons learned and strengthening best practices as required.
Regular REMI-based workflows that include EVS replay systems, as well as REMCO workflows that use Vizrt graphics, are now centered in Bristol and Charlotte. REMCO, short for "remote collaboration," is the term now used in place of GREMI for the remote-graphics solution.
Still Working From Home
Other innovations learned from experience include home studios that have enabled talent to contribute to various shows and events from the safety of their homes. This includes home commentary systems that provide talent with the proper audio, on-camera, and monitoring tools (to see the program feed, stats, and each other) to call games or host shows from home, as well as a full suite of production tools that allow directors, producers, and technicians to produce full shows from home. Many of the staff have returned to on-site production trucks, but there are fewer people and mobile units on site than in years past.
During the worst of the pandemic, in 2020, when crowds were not allowed in arenas and synthetic fan noises were piped into live broadcasts, the NBA projected live fans onto 17-foot video boards surrounding the basketball court via Microsoft's Azure cloud service. Fans who participated were brought into a virtual interactive experience developed by Microsoft called "Together Mode," which gave these virtual fans the feeling of sitting next to each other at the game.
The league also enhanced its audio with microphones embedded in the floor and deployed around the benches, court, and backboards of the bubble arena (at the ESPN Wide World of Sports complex in Florida). Camera enhancements included rail cameras that glided along the court at floor level, plus dozens of Panasonic PTZ HD robotic cameras deployed on top of, behind, and below the backboards; in locker rooms and hallways; and at practice facilities.
Fans (And New Technology) Return
With fans now back in attendance, the rail camera, most of the PTZs, and the fans on monitors have been removed (to avoid blocking valuable seats), but many of the mic placements and innovative floor graphics have remained. Along with offsite resources, ESPN is using its normal camera complement (about 20 cameras per game), which includes shallow-depth-of-field cameras for tighter close-ups.
In many arenas across the country, the NBA has also upgraded existing four-cable SkyCam systems to a "SpyderCam" with improved maneuverability, thanks to a higher-powered trolley system.
On the streaming front, users of NBA League Pass and NBA TV (on the NBA App and NBA.com) continue to receive customized viewing options including alternate camera angle feeds, mobile-friendly graphics, interactive gaming, and the ability to view the game with alternate commentators specializing in the worlds of analytics, fashion, or music.
What's clear is that the goal is to create an enjoyable and immersive experience where fans can engage with the game more closely than ever before. The lessons learned have gone a long way toward producing a live game more safely and comprehensively, with fewer resources than before. Using distributed architectures and the ability to bring in virtual resources from anywhere in the world, basketball coverage has never been so efficient or visually pleasing.