Sports Graphics Production: The Rise Of The VP-AR Sports Studio

Live sports production has widely embraced virtual and augmented reality techniques. Here we discuss the challenges of creating studio environments, with a focus on camera motion tracking.

Virtual production is distinguished by its ability to do almost anything, but the real world of broadcast sports benefits most in two ways. First, hosts and pundits can seem to inhabit a stadium environment without leaving a conveniently located studio, and second, creative data visualization can make the production both more informative and more immersive.

For a long time, the ability to place computer-generated objects into real scenes, or to surround a real scene with a computer-generated environment, was reserved for prerecorded material in feature film and drama production. Live broadcasting has since inherited many of those techniques, creating demand for 3D designers, technical and engineering staff, and, crucially, for systems that track the real camera's position so that virtual objects and graphics can be rendered in correct visual proportion to the physical objects (and people) in the studio.

For sports broadcasts, requirements for live data integration and real-time presentation create pressures that differ from otherwise similar work in post-production and visual effects. The symbiosis between technical and creative teams has never been more essential, and, as key industry players suggest, the best creative designs will always arise from that cooperation.

The Arrival Of Virtual Production

Duncan Foot, CEO of UK service provider MOOV, confirms that easier technology has made for an easier process, particularly on complex, virtual-environment jobs. “The tech has become more standard, and the energy goes more into the creative. In the early years there were huge numbers of discussions about how to do a virtual studio and it was like reinventing the wheel. As it’s a more established workflow you can get that out of the way. The tech has become… not day to day, but more so.”

Thom Stevens is Senior Director, Solutions Engineering for Graphics, Sports Data & Officiating at Deltatre, with eighteen years’ experience in live sports production both outside and inside the studio. “In the UK, we really saw an acceleration in the virtual and augmented space in 2011 and 2012,” Stevens begins. “BBC Sport, when they moved up to Salford, committed to taking on Augmented Reality as part of their studio expansion, and Sky Sports were doing the same thing.”

It Is Still A Physical Space

Live sports broadcasting has relied on highly creative and skilled set design throughout its history, and that remains an intrinsic part of production - even in a virtual production environment. Hosts and pundits exist in the real world, and successfully placing them into a virtual environment still means creating a physical studio space with lighting, cameras and, almost always, furniture. The creative process of digital space design is discussed in the next article, but there is an integral relationship between people or physical objects and the virtual environment. As the camera moves, the virtual environment has to move too, and that requires camera tracking and synchronization of the real and virtual images.

Camera Motion Tracking

Camera tracking may involve any of several basic techniques. Sometimes markers are placed around the venue to be observed by a witness camera - referred to as an inside-out tracking system because the witness camera sits inside the array of markers. An alternative reverses that arrangement - an outside-in system - with cameras around the venue observing markers on the broadcast camera, much as a performance capture system records a moving actor. A full tracking solution will also involve encoding lenses, so that a computer can follow the camera operator's framing and focus choices.
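Whatever the tracking method, the result is a per-frame stream of camera pose and lens data delivered to the render engine. One widely used transport is the free-d protocol's fixed-length "D1" message. The Python sketch below decodes one packet, assuming the commonly published field offsets and scale factors (angles in degrees × 32768, position in millimetres × 64); treat those details as assumptions to be verified against your tracking vendor's documentation.

```python
# Minimal sketch: decoding a free-d style "D1" camera tracking packet.
# Field offsets and scalings follow the commonly published free-d
# description and should be checked against vendor documentation.

def _s24(b: bytes) -> int:
    """Big-endian signed 24-bit integer."""
    v = (b[0] << 16) | (b[1] << 8) | b[2]
    return v - (1 << 24) if v & 0x800000 else v

def decode_d1(packet: bytes) -> dict:
    if len(packet) != 29 or packet[0] != 0xD1:
        raise ValueError("not a 29-byte D1 message")
    if sum(packet) % 256 != 0x40:  # all bytes should sum to 0x40 mod 256
        raise ValueError("checksum mismatch")
    return {
        "camera_id":  packet[1],
        "pan_deg":    _s24(packet[2:5])   / 32768.0,
        "tilt_deg":   _s24(packet[5:8])   / 32768.0,
        "roll_deg":   _s24(packet[8:11])  / 32768.0,
        "x_mm":       _s24(packet[11:14]) / 64.0,
        "y_mm":       _s24(packet[14:17]) / 64.0,
        "z_mm":       _s24(packet[17:20]) / 64.0,
        # Zoom and focus arrive as raw lens encoder counts; a calibration
        # table is needed to map them to focal length and focus distance.
        "zoom_counts":  int.from_bytes(packet[20:23], "big"),
        "focus_counts": int.from_bytes(packet[23:26], "big"),
    }
```

Note that zoom and focus arrive as raw encoder counts: turning them into a usable focal length and focus distance is precisely the lens calibration work service providers describe doing in-house.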

“When virtual systems first gained traction around 2011–12, several tracking providers emerged – many building on technologies from film production and engineering,” Stevens explains. “Beyond the famous reflective marker motion capture, mechanical systems used encoders to capture pan, tilt and zoom, even on cranes and jibs. Image-based tracking then evolved, calculating camera position from known visual features or pitch markings in sport. Very quickly, the ambition became minimizing physical hardware on site and shifting tracking further downstream, so rich virtual graphics could be created with minimal disruption to live production.”

“Today, that ambition is a reality in many workflows,” Stevens adds. “Modern robotic PTZ cameras often provide real-time pan, tilt and zoom data out of the box, and where they don’t, adding tracking encoders is straightforward. But the real advances have been in vision-based tracking software, where camera data is extracted directly from the broadcast feed itself. This approach is now widely adopted for virtual advertising graphics to be dynamically placed on the pitch – opening up new commercial opportunities without impacting the core broadcast.”
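The principle behind vision-based tracking can be illustrated with a short sketch: given the known real-world positions of a few pitch features and where they appear in the broadcast frame, the camera pose can be recovered with a standard perspective-n-point solve. Here OpenCV's solvePnP does the work; all coordinates and intrinsics are illustrative placeholders, and detecting the features in the image - the genuinely hard part - is assumed to have been done already.

```python
import numpy as np
import cv2

# World coordinates (metres) of four known pitch features, e.g. penalty
# box corners - illustrative values, not a real pitch survey.
world_pts = np.array([
    [0.0,   0.0,  0.0],
    [40.3,  0.0,  0.0],
    [40.3, 16.5,  0.0],
    [0.0,  16.5,  0.0],
], dtype=np.float64)

# Where those features were detected in a 1920x1080 broadcast frame (pixels).
image_pts = np.array([
    [412.0,  703.0],
    [1490.0, 688.0],
    [1633.0, 910.0],
    [301.0,  931.0],
], dtype=np.float64)

# Camera intrinsics from lens calibration - placeholder values.
K = np.array([[1800.0,    0.0, 960.0],
              [   0.0, 1800.0, 540.0],
              [   0.0,    0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(world_pts, image_pts, K, None)
R, _ = cv2.Rodrigues(rvec)            # rotation matrix from rotation vector
camera_pos = (-R.T @ tvec).ravel()    # camera position in pitch coordinates
print("camera position (m):", camera_pos)
```

Real systems refine this frame by frame with feature tracking and filtering, but the core geometry is no more than this solve.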

Studio setup, though, is just one task of many, as Stevens emphasizes. “If you’re talking about virtual and augmented realities, you’ve got to do everything from model design, rigging, integration into the real-time rendering engine… and that’s just look and feel. You still haven’t got to how it’s going to be controlled, how you’re going to put data into it and how it will work in a live environment.”

Duncan Foot of MOOV has a quarter of a century in the business. “From a tracking point of view we generally use a system from a vendor we have become very familiar with so we can do all the lens calibration in-house. When we started doing this, we would need vendor support engineers. We have become very familiar with one vendor but we do also do other projects with systems from other vendors. Fundamentally a lot of them give the same results and they all have their quirks.”

A variation on inside-out systems brings some valuable additional workflow benefits - different vendors have their own names for it. It requires an LED video wall rather than a green screen. The LED screen is genlocked to alternate between two images: the intended scene while the broadcast camera's electronic shutter is open, and tracking marks while that shutter is closed, with the tracking camera configured to capture frames only during the closed phase. As a result, the broadcast camera sees the scene while the tracking camera sees the tracking marks (to the naked eye, the screen appears to display a mix of both). The system requires a combination of high refresh rate LED displays with appropriate receiver cards, an LED processor with appropriate software, and specific tracking camera software. Different systems are available from various vendors, each with their own specific capabilities.

“We use a solution with a higher refresh rate that brings about more capability,” Foot continues, pointing out that the technique can be used for things other than tracking. “We have a big LED screen at the Ealing Broadcast Centre, and that allows you to have multiple things within the same frame. You could, in one fraction of that frame, have a greenscreen in there which they could record on a separate channel.” If the LED display's refresh rate is at least 4x that of the camera, there is the potential to capture four simultaneous elements: broadcast camera 1 seeing scene A, broadcast camera 2 seeing scene B, tracking data and green screen. In sports, the combination of virtual green screen, camera 1 and tracking is most useful - the addition of a second virtual scene and a second camera is perhaps more drama-oriented. A sketch of how those sub-frame slots line up follows below.
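As a rough illustration of the timing, the Python sketch below schedules four sub-frame slots on a wall running at 4x the camera rate. The rates and slot assignments are hypothetical examples, not any vendor's actual scheduling; in practice this lives in the LED processor and receiver cards, with each camera's shutter phase-locked to its slot.

```python
# Illustrative sub-frame multiplexing on a genlocked LED wall refreshing
# at 4x the broadcast frame rate (e.g. a 200Hz wall with 50Hz cameras).

CAMERA_FPS = 50
LED_HZ = 200
SLOTS_PER_FRAME = LED_HZ // CAMERA_FPS  # 4 wall refreshes per camera frame

# Each camera's shutter is phased to be open during exactly one slot.
SLOT_CONTENT = {
    0: "scene A",         # broadcast camera 1 exposes here
    1: "scene B",         # broadcast camera 2 exposes here
    2: "tracking marks",  # tracking camera exposes here
    3: "green screen",    # keyed channel recorded here
}

def content_for_refresh(refresh_index: int) -> str:
    """Which image the wall shows on a given LED refresh cycle."""
    return SLOT_CONTENT[refresh_index % SLOTS_PER_FRAME]

for i in range(8):  # two camera frames' worth of wall refreshes
    print(i, content_for_refresh(i))
```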

With each setup so dependent on a collection of technologies, the need for a steady technical hand is key. “We have everything in house, particularly virtual studio engineers,” Foot points out. “At Ealing Broadcast Centre, we have a permanent graphics room with all the renderers, the keyers and camera tracking systems. We have a keying engineer who will support it from an engineering point of view. We also have flyaway systems - when we do a virtual studio at Wimbledon for the BBC, we’ll integrate into the broadcast chain. We’ll take the camera, composite it, and send it back to the truck.”

Ultimately, Foot says, tracking is influenced by a lot of things. “There’s the physical space. With the virtual studio you’ve got two or three options - greenscreen, a giant LED immersive screen, or something where you’re adding virtual capacity to extend the feel of a studio. It can be driven by the space that’s available, it can be cost, creative aspirations. You might get some studios which have low ceilings, and you might use set extensions to change how the studio looks, but you may not be able to use certain kinds of tracking.”

In simple cases, Foot concludes, one-stop solutions exist, but discussion remains key. “At the low end you could get to the point where all your tracking is within a PTZ camera - you’ve calibrated everything, you’re in a greenscreen box, the lighting’s consistent and if the customer wants to come along tomorrow with their Unreal scene, that’s as close to plug and play as you’ll get. You’re then limited in the types of cameras you can use, but… you’re getting close to plug and play.”

Regardless, Foot firmly encourages “a dialog between all the partners, from both a creative point of view and a technical point of view. Quite often we’re brought in down the line after decisions have been made. Perhaps someone has chosen a greenscreen solution, where an LED volume would have worked better, or you realize greenscreen is very much alive for the right project. It’s not always that there’s wrong or daft decisions. Tech moves on all the time, things that might not have been valid a year ago are valid now.”

Control Technologies

Data-driven visualizations to support expert analysis are a key element of live sports production, and their incorporation into the AR studio environment is a perfect fit. Since the days of drawing rudimentary rings around players, providing a means for pundits to interact with analysis graphics has been part of the technical requirement of many studio environments. We will delve into control in a later article, but if the talent is to be given the means to control graphics that are part of the AR environment, whether via tablet or touch screen, this needs careful planning and must be incorporated early into the physical design of the studio space and the virtual environment.
