Virtual Production At IBC 2023

After several years of experimentation, movies and TV shows shot on LED wall-based virtual production sets are quickly becoming the best and most economical way to create stunning virtual environments. Large stages and even mobile trailers have been outfitted with large LED displays on the floor, the ceiling and, of course, the background. Actors can react to the live imagery more naturally, and sets can be changed in a matter of hours without physically building anything.

This does not mean that traditional virtual sets, shot on a greenscreen, have been eliminated. There are still plenty of places (TV newsrooms, small production stages, etc.) and applications where a greenscreen-based virtual set does the job well.

LED virtual sets are not just for moviemaking. ESPN, in Bristol, Conn., has redesigned its flagship “SportsCenter” studio (in celebration of 44 years on the air) and renamed it Studio X. It includes 38 million pixels of LED, including a massive 48-foot display (the largest on ESPN’s campus), along with a flexible anchor desk. To elevate the show’s production, ESPN turned to a depth monitor to bring dynamic virtual content into the space.

ESPN has unveiled a massive LED production studio for its flagship “SportsCenter” program that uses real-time processing and a 48-foot display.

Internally known as the “SportsCenter skydeck” and the “fifth wall,” the depth monitor helps create a virtual window for viewers. The effect uses virtual production technology to create a realistic parallax effect that is tied to the movement of studio cameras.
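The parallax illusion is, at its core, an off-axis projection problem: as the tracked camera moves, the content on the depth monitor is re-rendered so that objects "behind" the screen shift by the correct amount. The sketch below (plain Python with hypothetical names, not ESPN's or anyone's actual pipeline) shows the similar-triangles geometry that makes a flat display read as a window:

```python
# Minimal sketch of camera-tracked parallax for a "virtual window".
# A point sitting at some depth behind the screen plane appears on the
# screen where the camera-to-point ray intersects that plane.

def parallax_offset(cam_x, cam_z, point_x, point_depth):
    """Screen-space x position of a virtual point seen from a tracked camera.

    cam_x       -- camera's lateral position (metres, 0 = screen centre)
    cam_z       -- camera's distance in front of the screen plane (metres)
    point_x     -- virtual point's lateral position (metres)
    point_depth -- how far the point sits behind the screen (metres)
    """
    # Intersect the camera-to-point ray with the screen plane (similar triangles).
    t = cam_z / (cam_z + point_depth)       # fraction of the way to the point
    return cam_x + t * (point_x - cam_x)    # intersection x on the screen

# As the jib moves, a deep point's on-screen position shifts with the camera
# but by less than the camera's own motion -- the viewer reads that lag as depth.
for cam_x in (-1.0, 0.0, 1.0):
    print(round(parallax_offset(cam_x, cam_z=4.0, point_x=0.0, point_depth=8.0), 3))
```

A real system does this in full 3D with an off-axis projection matrix per tracked camera frame, but the one-axis case above captures why the effect only works when display content is tied to camera tracking data.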

Interestingly, Studio X uses a single camera, but because it is mounted on a jib the crew has a great deal of depth and space to work with, providing dramatic perspective shots.

The technology relies on ESPN’s GRACE platform, which leverages Unreal Engine in a web interface that can also control DMX and live video inputs. This tech stack is also used in the sports channel’s “Catalyst Stage” in Studio C, which is ESPN’s dedicated virtual studio. ESPN has also updated Studio Y, which houses a large chromakey cyc.

The depth monitor is focused on the “skydeck” view, which gives the illusion of being under a vast stadium. The scene includes the capabilities to adjust time of day, weather, lighting conditions, branding, and other factors through the GRACE platform.

What all of these uses have in common is a desire for high resolution images that make the sets more immersive for the viewer. And the technology to make it happen will be on display at the 2023 IBC Show in Amsterdam.

“What customers ask for is hyper realism and, of course, flexibility for creating different content for a variety of applications,” said Miguel Churruca, Marketing and Communications Director at Brainstorm Multimedia (Stand 7.C46). “This means that the tools must ensure not only a perfectly realistic background scene, but also the ability to include data-driven graphics, control of other hardware such as studio lights, and compatibility with broadcast workflows.”

These graphics are triggered by incoming data feeds for things like live election results, sports scores and even severe weather events.
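The mechanics behind such data-driven graphics are straightforward: a live payload arrives, and named fields in the on-air scene are updated from it. The sketch below is illustrative only (the field names and JSON feed format are assumptions, not Brainstorm's actual API):

```python
# Sketch of data-driven broadcast graphics: an incoming feed (scores,
# election results, weather alerts) updates named text fields in a scene.

import json

def apply_feed(scene_fields, feed_json):
    """Merge a live data payload into the graphic fields the scene defines."""
    payload = json.loads(feed_json)
    for name, value in payload.items():
        if name in scene_fields:        # ignore keys the scene doesn't use
            scene_fields[name] = value
    return scene_fields

# Hypothetical scene with two score fields and a ticker.
scene = {"home_score": "0", "away_score": "0", "ticker": ""}
apply_feed(scene, '{"home_score": "21", "away_score": "17"}')
print(scene["home_score"], scene["away_score"])
```

Production systems add validation, rate limiting and templated layouts on top, but the core loop of "feed in, fields out" is the same.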

“At Brainstorm we don’t see this as a battle between virtual sets and LED screens,” Churruca said. “On the contrary, it just provides more options to content providers to choose the right tool for their productions. Being in Virtual Production for decades, Brainstorm’s InfinitySet performs at its best both with virtual sets based on chroma or immersive LED screens, so it is up to the client to decide which method is best for a given solution.

“While many believe virtual production and XR is only possible with LED screens, the reality is that there is no technical reason why virtual content can’t be created using virtual sets, LED screens, or both at the same time,” he said. “It is a question of being aware of the pros and cons of each technology and choosing the best fit for the job.”

At the IBC Show Brainstorm will show the latest version of its InfinitySet platform (version 6) that is now compatible with Unreal Engine 5.3. A number of new virtual production features of InfinitySet will be demonstrated live, including the ability to combine chroma sets and shaped LED video walls, real-time extra-render set extension with color-matching 3D LUTs with color calibration tools, in-context AR motion graphics, fully immersive talent tele-transport, and multi-background content.

The latest version of the InfinitySet platform (version 6) is now compatible with Unreal Engine 5.3.

The combination of InfinitySet and Unreal Engine allows for full integration of objects created in InfinitySet or Aston within the Unreal Engine environment and vice versa, including shadows, reflections and AR.

“Not only can InfinitySet seamlessly control UE5 from its own interface, vastly improving the user’s ability to manage, edit and control UE blueprints, objects and properties, but it also makes the combination of the Unreal Engine render with Brainstorm’s own eStudio render engine fully transparent,” Churruca said. “This allows users to decide which objects or parts of the scene should be rendered in which engine, maximizing the possibilities of the software.”

In Stand 9.A05, Ross Video will show technology capable of producing hyper-realistic virtual environments that enhance production quality and improve the viewing experience. To obtain the best results, the company says, a compelling 3D design with precision camera tracking and calibration is essential.

Ross also promotes that virtual production on a greenscreen set allows for a higher level of ROI. Users can deploy virtual solutions in small spaces that require fewer physical set pieces and less storage. They can also utilize “blended” environments that combine the best of traditional physical design and virtual design to eliminate the need for video walls and on-set monitors.

Ross Video’s Lucid Virtual Production Platform is powered by the company’s Voyager real-time graphics platform, which is built on top of Unreal Engine.

And of course, these virtual sets allow you to quickly change a set, deploy a new look, or share studio space without downtime for construction, moving set pieces, or competition for physical set space. To do this effectively, TV stations or production companies must create and train dedicated production teams that can execute a variety of tasks while producing a high-quality, lower-cost production on a daily basis.

“One of the biggest changes has been the use of Unreal Engine,” said Mike Paquin, senior product manager, Ross Virtual Solutions. “There have been huge improvements in the ability to create more realistic environments. This allows creatives to build amazing-looking sets that blur the lines between real and virtual.

“One of the biggest requests we’ve had was to support the latest Unreal Engine releases, and we delivered that and continue to stay up to date,” he said. “There has also been a huge push to do XR in many different sizes: some that are small add-ons to help tell a story, others that are fully immersive sets without the greenscreen, which help presenters feel more aware of the content they are interacting with.”

Ross will show its immersive virtual set technology with an XR booth during IBC 2023. The company will show its Lucid Virtual Production Platform, which allows users to control all camera chains from one interface; quickly build and save tracking calibrations; trigger assets, camera moves, and more with Rosstalk triggers; and integrate newsroom and other data feeds in real time. The system is powered by Ross Video’s Voyager graphics engine, which is built on top of Unreal Engine.

“We are going to be showing some of the latest additions to Lucid and Voyager that expand our ability to control and build sets without adding complexity,” said Paquin. “Along with the LED walls from D3, we’ll be showing an XR+ demo tied in with data integration and Piero, our sports analysis solution. One cool thing we didn’t show at NAB that will be on display here is our presenter interacting with and controlling the content while on the stage.”

In Stand 7.C26, ROE Visual will look to make its mark at IBC by unveiling new LED panels for broadcast applications. A focus will be its Ruby and Ruby VP series of LED panels. The company has also developed unique 4in1 LED rental packages for live events, corporate presentations and production studios. Large video walls can be created with different configurations of ROE’s individual panels.

The ROE Ruby VP series are sturdy LED panels equipped with high-contrast black LEDs and are designed for broadcast or film studio environments.

These displays, available in a number of pixel pitch sizes, provide ruggedness, less reflection and more contrast than previous versions, due to an optimized black body (and, the company says, the resulting colors are striking). Each panel includes advanced driver ICs and LEDs with large color space, making the Ruby displays fully HDR adaptive to add color depth and grayscales to the resulting visuals.

The company’s stand will also feature the latest version of the Ruby panel series, RB2.6, a robust all-around LED panel with high-contrast black LEDs. The 2.6mm pixel pitch and wide color gamut provide brilliant visuals, and it offers a wide viewing angle and reduced reflection paired with a high bit depth. Its high refresh rate of 7,680Hz and scan ratio of 1/12 make it suited for rental as well as in-camera applications.

ROE Visual is an enabling technology partner for the GhostFrame system, which uses very high refresh rate technology to enable the simultaneous delivery of multiple production streams within a single set. The multi-source system displays multiple background images, each paired with its own camera, to generate simultaneous live production feeds, along with a hidden greenscreen feed that can be used for post-production. The system will be demonstrated on the ROE Visual stand at IBC.
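The reason a very high refresh rate matters here is budget arithmetic: the wall's refreshes are divided into sub-frame slots within each camera frame interval, and each genlocked camera shutters only during its own slot, so it never sees the other feeds. A rough sketch of that slot budget (illustrative numbers and names, not GhostFrame's actual scheduler):

```python
# Sketch of GhostFrame-style time multiplexing: the LED wall's refresh
# budget is sliced into per-camera slots inside one production frame.

def slot_schedule(wall_hz, camera_fps, feeds):
    """Assign each feed an equal share of wall refreshes per camera frame.

    wall_hz    -- LED wall refresh rate, e.g. 7680
    camera_fps -- production frame rate, e.g. 60
    feeds      -- feed names, e.g. two camera backgrounds plus a greenscreen
    """
    slots_per_frame = wall_hz // camera_fps   # refreshes available per frame
    per_feed = slots_per_frame // len(feeds)  # refreshes each feed receives
    return {feed: per_feed for feed in feeds}

# A 7,680Hz wall at 60fps yields 128 refreshes per frame; splitting among
# two camera backgrounds and a hidden greenscreen feed gives 42 each,
# with the remainder left as headroom.
print(slot_schedule(7680, 60, ["cam_A_bg", "cam_B_bg", "greenscreen"]))
```

This is why panel refresh rate, and not just pixel pitch, is a headline spec for in-camera virtual production: every additional feed carves another slice out of the same per-frame refresh budget.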

The Best Of Both Worlds   

Vendors at IBC will show that LED wall stages, more traditional greenscreen sets and even new hybrid technologies all have their place, depending upon the application and budget.

“There are benefits to both and thankfully we offer both,” said Ross’ Paquin. “And there is a path to upgrade easily using the same control tool and re-using the creative elements. Greenscreens aren’t dead! There are many use cases where the greenscreen makes more sense than LEDs and vice versa. Why not do both!?

“We have the option to do Trackless Virtual where the camera moves are all done inside the render-engine to do simple productions. Then have a bigger greenscreen studio for a dynamic weather studio and a full LED studio for the primetime shows,” he said. “All of these studios can feed into the same production, and the set and creative are identical. This allows for a smaller LED volume or greenscreen cyc area to optimize real estate while still producing more high-profile shows.”

Brainstorm’s Churruca agrees that virtual production, in all of its forms, is having a significant impact on the new types of workflows and technologies being deployed.

“Virtual production helps storytelling in many ways, facilitating the work of directors and DPs,” he said. “Customers can have any scene they wish, for as much time as required. Now that real-time hyper realism is possible, we can shoot a car scene without closing an entire city for hours, shoot a sunset scene for a whole day with the light, clouds and ambience we may dream of, or bring a remote location to the stage instead of sending a complete crew and actors to the other side of the world.

“And, when using chroma sets for virtual production, in combination with tracked cameras, the availability of real-time chroma keying and tracking information in layered recordings reduces the post-production time when additional tweaks are required. So, virtual production also reduces the time and budget required for post-production, while improving the results and enhancing storytelling.”
