Virtual Production Set To Be A Major Theme At 2023 NAB Show

In the area of virtual production, the times have certainly changed. From the early days of shooting against a green screen and compositing the image in real time, the biggest productions have moved to large “volume” stages, where actors are filmed in front of giant wrap-around LED screens and the end result is captured in camera.

Sometimes referred to as in-camera visual effects (ICVFX), these new production techniques allow filmmakers to replace green screens, and the hours of post-production they require, with digitally produced realistic environments, visual effects, and background scenery. Even local TV stations, in ever-increasing numbers, are swapping the green screen wall for LED screens. Some say it offers better control and resolution for the on-screen graphics.
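As a rough illustration of the green-screen compositing step that LED volumes replace, here is a minimal chroma keyer in NumPy. The threshold rule and RGB-space matte are loose simplifications for the sake of example; production keyers work in other color spaces and generate soft-edged mattes.

```python
import numpy as np

def chroma_key_composite(foreground, background, threshold=40):
    """Replace green-screen pixels in `foreground` with `background`.

    A pixel is keyed out when its green channel exceeds both red and
    blue by more than `threshold` -- an assumed simplification of how
    real keyers isolate the screen color.
    """
    fg = foreground.astype(np.int16)  # avoid uint8 wraparound
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    mask = (g - np.maximum(r, b)) > threshold  # True where screen shows
    out = foreground.copy()
    out[mask] = background[mask]
    return out

# Tiny 1x2 frame: left pixel is pure green screen, right is a red subject.
fg = np.array([[[0, 255, 0], [200, 30, 30]]], dtype=np.uint8)
bg = np.array([[[10, 10, 10], [10, 10, 10]]], dtype=np.uint8)
result = chroma_key_composite(fg, bg)
```

On an LED volume, this entire step disappears: the background is simply photographed behind the actors, which is why spill, matte edges, and hair detail stop being problems.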

Indeed, the M&E industry is quickly evolving to widespread adoption of virtual production technologies to enable remote workflows, enhance visual graphics, manage digital assets, and streamline production processes.

“Virtual production is maturing very quickly, and, although many clients are identifying virtual production with LED videowalls, this technology is much more than just using large videowalls to display content,” said Miguel Churruca, Marketing and Communications Director, Brainstorm Multimedia. “To create virtual production content, we need to use hyper-realistic 3D scenes and integrate actors or presenters in the scene. This has been a common task for years, using chroma sets, but the incorporation of large LED walls to the equation has resulted in an explosion in demand for such content.

“LED screens provide several advantages compared to green screen production – and some disadvantages as well,” he said. “In practical terms, many clients are now in the process of understanding not only the technology but also the possibilities that it brings to content creation.”

He added that while some months ago many thought that virtual production was just using a large videowall with background pictures, now they understand that proper virtual production must resemble virtual worlds, “therefore using tracked 3D backgrounds is a must, and taking good care to ensure the backgrounds are realistic enough to fool the audience is also essential.”
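The parallax that makes a tracked 3D background convincing can be sketched with a pinhole projection: as the tracked camera moves, every background point must re-project to a new screen position each frame. This is a bare-bones assumed model (no rotation, no lens distortion, hypothetical focal length), not any vendor's actual pipeline.

```python
import numpy as np

def project_point(point_world, cam_pos, focal_px, principal=(960, 540)):
    """Project a 3D world point into a tracked camera's image plane.

    Pinhole model with the camera looking down +Z and no rotation --
    an assumed simplification of a real camera-tracking pipeline.
    """
    p = np.asarray(point_world, float) - np.asarray(cam_pos, float)
    u = focal_px * p[0] / p[2] + principal[0]
    v = focal_px * p[1] / p[2] + principal[1]
    return u, v

# A background element 10 m ahead shifts on screen when the camera
# dollies 0.5 m sideways, so the rendered backdrop must update per frame.
u0, _ = project_point([0, 0, 10], [0, 0, 0], focal_px=1000)
u1, _ = project_point([0, 0, 10], [0.5, 0, 0], focal_px=1000)
```

A static backdrop image would keep `u0` for every camera position, which is exactly the flat, painted-on look untracked videowall backgrounds suffer from.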

At the 2023 NAB Show, attendees will see a significant amount of virtual production technology, both in LED volume form, with augmented reality graphics and real-time 3D effects, and in traditional-style (and highly portable) green screen-based systems. In fact, virtual production will be a major theme among many vendors that produce graphics systems and large LED screens.

Key to any real-time virtual production volume stage is a rendering and compositing engine with plenty of horsepower, like Epic Games’ Unreal Engine.

In Booth N2639, Brainstorm will showcase its latest developments in virtual production and motion graphics. In the virtual production field the company will demonstrate, in the main theatre, several complex virtual production workflows using Unreal Engine 5, with in-context and on-top AR data-driven graphics, set extension with color matching, multicamera XR and many other features.

Production company Hi-End, in Spain, created this virtual set with a Brainstorm InfinitySet for EA Sports that was streamed live.

This will include a combination of Brainstorm’s InfinitySet and Aston Broadcast’s video graphics systems (which is owned by Brainstorm). The latest version of InfinitySet blends advanced virtual set and augmented reality tools that integrate into any broadcast workflow and environment. InfinitySet acts like a hub for a number of technologies, from tracking systems to interaction with other devices, controllers, mixers, and chroma keyers.

“What our clients are asking for is a flexible solution that can not only create excellently rendered backgrounds, but also team up with other solutions and include AR graphics and other visual aids that are essential in the broadcast environment,” said Churruca. “Other applications such as live events, drama or film have different requirements, but being able to fulfil them with a solution like InfinitySet is a great advantage.”

Chyron (Booth N2647) will show that green screen production is not a lost art. The company will highlight its PRIME VSAR system, which merges Unreal Engine with Chyron’s broadcast design, editorial, and live production capabilities. PRIME VSAR supports virtual environments in a green screen studio, augmented reality elements on a physical set, and tracked virtual set extensions feeding directly into video walls.

Chyron’s PRIME VSAR virtual production system combines greenscreen capture and real-time effects processing.

PRIME VSAR includes the Cesium tracking processor that operates on a host machine without additional hardware and supports all industry-standard tracking protocols. Meanwhile, the internal Mercury Panel provides a visual, simple-to-operate virtual camera control interface for trackless applications (accessible via a web browser).

The virtual production system supports two cameras on a single render engine, offering SDI connectivity, HD & 4K-UHD format support, and HDR video. For confidence-monitoring of every take in a production, users can dedicate a PRIME VSAR engine to generating real-time SDI previews of every virtual scene. A native primitive library includes 3D text and objects with data integration for dynamic graphics, while weather forecasting and virtual screen templates simplify building a virtual studio environment. In addition, traditionally complex AR scene operations now function as easy-to-manage scene assets.

Ross Video has dedicated significant resources to its graphics systems, as they relate to virtual production. In Booth N2201, the company will highlight the latest software features for its Voyager, an Unreal Engine-based render platform and Lucid Studio, a “control center” for virtual—or eXtended Reality (XR)—productions.

Ross Video’s Voyager uses the Lucid Studio control platform as its front end and doesn’t require the operator to be an experienced Unreal4 user or designer.

Voyager version 4.27 includes several new templates that make it easier to set up a virtual LED environment, as well as a simplified workflow for hybrid environments where standard augmented reality elements need to run alongside the virtual LED studio environment.

In addition, the latest release (6.3) of Lucid Studio includes a variety of new features that make virtual productions easier to manage. At NAB it will be shown with an improved user experience, thanks to a new web API and features that enhance the mobile experience for event control from a phone or a tablet.

With the acquisition of D3 LED (in 2021), an indoor and outdoor LED maker, Ross now offers customers large LED screens (and managed control) for volume stages.

The Trend To Follow

Virtual production, regardless of whether it is based on chroma sets or LED videowalls, may represent significant cost savings while greatly enhancing the creativity of producers and designers. The possibilities of having hyper realistic virtual worlds as the background scene of any production, in real-time, are endless, and if we add graphics into the equation, the final result will be highly engaging.

“The possibility of combining LED walls and chroma sets for virtual production allows creative teams to tele-transport talents from anywhere in the world to the home studio, or vice versa, which adds further possibilities for remote interviews and connections,” said Brainstorm’s Churruca. “Using the Cloud should be the next step, as with many other areas of broadcast production; however, there are still issues to be solved, especially those regarding real-time and latency.”

What’s clear is that virtual production, in all of its forms (including volumetric video capture), is here to stay.

“Virtual production in some form has been around for 100 years,” said Brian Nowac, Founder and CEO of San Francisco-based motion picture technology company Magicbox. “It’s not a new idea, it’s just a new way to do it. I think what is going to make the use of the LED wall stand out as a prominent method of doing virtual production for decades is not the acceleration of production time, or the reduction of cost (although it can be significant), but I believe it’s the ability to expand creativity almost infinitely. Anything you can imagine you can digitally create and put up inside that LED volume, and instantly have a live action production take place in that location.

“That kind of power is tremendous for everyone involved,” Nowac added. “This unrestricted creativity is what’s going to keep the LED volume in the production toolkit for a long time.”
