Virtual Production Technology At NAB 2024

The relentless rise of virtual production looks set to continue at the 2024 NAB Show. Expect expansion and evolution of the ecosystem of cutting-edge technologies that make so many new creative possibilities a reality.

It is clear that virtual production is saving time and money for a wide range of movie and TV productions. Traditional greenscreen production will undoubtedly remain a strong presence on the 2024 NAB Show floor, but for many attendees it will be LED-wall-based production techniques that take center stage.

Virtual production’s capacity to use footage shot anywhere on earth, or a 3D environment, as a backdrop in the studio has been fueling creative studio production for many years. Virtual production is not only about large LED walls; it can also be achieved with smaller LED panels, chroma sets, or a combination of both. However, LED wall technology and techniques bring a few clear creative advantages: the actors and production crew can see and interact with the final scene, real and virtual lighting are more tightly integrated, and the workflow is often faster, reducing the need for post-production.

For many sports broadcasters and production houses, the ability to combine hyper-realistic backgrounds with data-driven graphics is of paramount importance and a key driver in the adoption of virtual production. The results can be highly compelling, and a single application can be used to create a variety of content.

It Comes Down To Choice

What customers at NAB will be looking for are flexible tools that allow them to create virtual content using the best technology for the job.

A range of key technologies has helped virtual production gain significant momentum. Real-time graphics rendering engines (e.g., Unity, Unreal Engine and several vendor-proprietary solutions) are at the heart of virtual environment creation and sophisticated graphics production.

Motion tracking, using on-set cameras fitted with dedicated capture hardware and software and often combined with robotic camera systems, makes it possible to synchronize camera movement across real and virtual spaces. Expect innovations from Vinten MMRC, Mo-Sys, OptiTrack and Stype.
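To make the synchronization idea concrete, here is a minimal Python sketch of one core step: matching a tracked camera pose to a video frame by timestamp, with linear interpolation between samples to compensate for tracking latency. All names are illustrative; real systems typically receive pose data via a vendor SDK or a protocol such as Free-D, and carry far more state (position, roll, lens zoom and focus).

```python
from dataclasses import dataclass
from bisect import bisect_left

@dataclass
class PoseSample:
    """One tracking sample: time in seconds plus pan/tilt angles in degrees."""
    t: float
    pan: float
    tilt: float

def pose_at(samples, frame_time, latency=0.0):
    """Return (pan, tilt) for a video frame, linearly interpolated between
    the two tracking samples that bracket (frame_time - latency).
    `samples` must be sorted by time; out-of-range times are clamped."""
    t = frame_time - latency
    times = [s.t for s in samples]
    i = bisect_left(times, t)
    if i == 0:
        return samples[0].pan, samples[0].tilt
    if i == len(samples):
        return samples[-1].pan, samples[-1].tilt
    lo, hi = samples[i - 1], samples[i]
    f = (t - lo.t) / (hi.t - lo.t)
    return (lo.pan + f * (hi.pan - lo.pan),
            lo.tilt + f * (hi.tilt - lo.tilt))

# Example: two samples 10 ms apart; a frame timestamped halfway between them
# yields the halfway pose.
track = [PoseSample(0.000, 10.0, 0.0), PoseSample(0.010, 12.0, 1.0)]
pan, tilt = pose_at(track, 0.005)   # pan 11.0, tilt 0.5
```

The same lookup, fed with genlocked timecode, is what keeps the rendered virtual camera locked to the physical one as it moves.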

Signal processing vendors supporting color management and virtual tracking in this space include Arri, disguise, Ghost Frame, Grass Valley, Maxon, Rosco RDX Lab, Ross Video, Sony and Vizrt. They will all be on the NAB show floor with their latest technology.

As we walk the NAB Show floor we will be looking for new technologies that, for LED-wall-based systems, deliver tighter and more streamlined integration between lighting and color within created 3D environments, on-set lighting from the LED wall and lighting fixtures, and camera technologies.

In choosing the right LED wall, it’s important to understand that brighter is not always better; low nit values can be necessary to balance the correct exposure between the talent and the scene. This often comes at the cost of spatially accurate ambient lighting from the wall. When shooting elaborate dynamic environments in volumes, the LED light is therefore augmented with pixel-mapped physical fixtures. Depending on the needs of the project, 2D render targets or 3D color intensity levels are closely mapped to the real-world space, ensuring that the talent, physical props, and set design integrate with the colors of the virtual environment in the frame. It is a continually evolving technology stack that is driving a creative unification of the real and virtual worlds.
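As a concrete illustration of pixel mapping, here is a minimal Python sketch, assuming the render target is a floating-point RGB NumPy array: it averages the region of the frame a fixture is mapped to and converts the result to 8-bit channel values. Function and parameter names are hypothetical; real systems drive fixtures over DMX/sACN via a media server or lighting console, usually with proper color calibration rather than a raw average.

```python
import numpy as np

def fixture_rgb(render_target, region, dimmer=1.0):
    """Average the pixels of `region` ((x0, y0, x1, y1)) in a float RGB
    render target (H x W x 3, values 0..1) and scale the result to 8-bit
    DMX-style channel values for a pixel-mapped fixture."""
    x0, y0, x1, y1 = region
    patch = render_target[y0:y1, x0:x1]           # rows are y, columns are x
    avg = patch.reshape(-1, 3).mean(axis=0)       # mean R, G, B of the patch
    return tuple(int(round(255 * dimmer * c)) for c in np.clip(avg, 0.0, 1.0))

# Example: a frame that is a uniform warm orange everywhere.
frame = np.zeros((1080, 1920, 3), dtype=np.float32)
frame[:] = (1.0, 0.4, 0.2)
channels = fixture_rgb(frame, (0, 0, 64, 64))   # → (255, 102, 51)
```

In practice each physical fixture is assigned its own region of the virtual scene, so the ambient light on the talent tracks the colors of the environment behind them.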

AI in Virtual Production

Artificial Intelligence will be showcased in many booths at the NAB Show, and virtual production is certain to benefit as well. AI can control cameras and adjust lighting during a shoot, reducing the need for human operators and enabling the production team to capture more footage in less time.

One of the most exciting developments in AI for production is the use of generative adversarial networks (GANs) to create entirely new content. GANs are a type of machine learning model that can generate new images, videos, and sounds by learning from existing data. This brings the potential for new and innovative virtual reality experiences and interactive videos.
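To make the adversarial idea concrete, here is a deliberately tiny sketch in plain NumPy: a linear "generator" and a logistic-regression "discriminator" trained against each other with hand-derived gradients to imitate a one-dimensional Gaussian. All names and numbers are illustrative; real GANs use deep networks and frameworks such as PyTorch or TensorFlow.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data the GAN learns to imitate: samples from N(4, 1.5).
def sample_real(n):
    return rng.normal(4.0, 1.5, n)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a, b = 1.0, 0.0   # generator G(z) = a*z + b (toy stand-in for a deep net)
w, c = 0.0, 0.0   # discriminator D(x) = sigmoid(w*x + c)
lr, batch = 0.05, 64

for _ in range(2000):
    # Discriminator step: ascend log D(x) + log(1 - D(G(z))).
    x = sample_real(batch)
    g = a * rng.normal(size=batch) + b
    d_real, d_fake = sigmoid(w * x + c), sigmoid(w * g + c)
    w += lr * (np.mean((1 - d_real) * x) - np.mean(d_fake * g))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: ascend the non-saturating objective log D(G(z)).
    z = rng.normal(size=batch)
    g = a * z + b
    upstream = (1 - sigmoid(w * g + c)) * w   # d log D(G(z)) / dG
    a += lr * np.mean(upstream * z)
    b += lr * np.mean(upstream)

# After training, the generator's output distribution should sit near the
# real data's mean of 4.0.
gen_mean = float(np.mean(a * rng.normal(size=10000) + b))
```

The two models improve by competing: the discriminator gets better at spotting fakes, which forces the generator's output ever closer to the real data.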

The Latest in LED Technology

In the area of LED displays, ROE Visual (Booth C4535) will show its Ruby LED series, featuring the Ruby RB1.2, RB1.5, and RB1.9BV2-C panels alongside the Black Marble LED floor, the BM2.

The Ruby RB1.2 is a fine-pitch, broadcast-grade HD-LED panel delivering high-performance visuals. The Ruby RB1.9BV2-C, a curved LED panel fully compatible with the regular RB1.9BV2, promises seamless integration for broadcast and virtual production applications. Designed with cutting-edge LED technology and high-speed components, the RB1.2 and RB1.9BV2-C panels offer true-to-content color representation and unrivaled in-camera performance with high frame and refresh rates and minimal scan lines.

ROE Visual will also show its new Coral, a fine-pitch COB LED panel. The Coral's Chip-on-Board technology delivers high contrast, wide color gamut, and color accuracy for a high-definition viewing experience. Its energy-efficient common cathode design saves costs and extends the panel's service life.

Virtually Lots To See

Given all of the market pressures and budget requirements to do more with the same resources, virtual production is certainly here to stay; it just makes financial and operational sense. At this year’s NAB Show there will be lots to see to help professionals produce content in the most efficient way. Attendees are encouraged to research the various choices on the NAB Show floor, taking into account ease of installation, calibration and their preferred workflow.
