Real-Time On-Air Virtual Set Production Takes Off

The use of photorealistic rendering technology is changing the way broadcasters look at virtual sets. It is now possible to create scenes that are indistinguishable from reality, opening powerful new possibilities for storytelling.

For live, on-air broadcast operation, photorealism is one of the toughest challenges. What really makes the difference for viewers is being unable to tell whether the images they are watching are real video or digital renders. Achieving that level of realism requires background scenes of extremely high render quality, which demands both hyper-realistic rendering technology and carefully designed sets.

Game engines such as Unity or Unreal Engine (the technology behind the popular online video game Fortnite) can render photorealistic scenes in real time, opening up some fantastic possibilities.
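
“Real time” in broadcast terms is a hard deadline: every frame of the virtual set must be rendered inside the output format’s frame interval. A quick back-of-the-envelope calculation shows the budget:

```python
# Per-frame render budget at common broadcast frame rates.
# A virtual set renderer that misses this deadline drops frames on air.
for fps in (25, 30, 50, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```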

Dock10, based at MediaCityUK in Salford, for example, has invested in a virtual set system driven by Unreal.

“It gave us the opportunity to get to grips with the technology and to build the team,” explains Andy Waters, Head of Studios. “It’s not just about buying the kit, sticking it in and turning it on. There’s a huge amount of knowledge needed to deploy the technology successfully.”

Waters points to the availability of inexpensive, ready-made graphics assets which can be bought online and downloaded, ready to turn a new production around quickly.

“We did a trial the other day where we could either have built a set of No. 10 Downing Street ourselves from first principles or bought it online for $200.”

Where, until recently, broadcasters would milk a virtual set system for all it was worth but end up tying up one facility for use on one show, the flexibility of games-engine-driven systems now means any space can be turned into a virtual set in a day.

Dock10 plans to turn all of its studios into virtual-set-ready spaces in a bid to attract more entertainment productions alongside its existing news, sports and children’s programming users.

3D graphics developer Brainstorm has created a unique approach called the Combined Render Engine, which pairs Unreal Engine with its own eStudio render engine. This allows its graphics toolset InfinitySet not only to display realistically rendered background scenes, but also to integrate broadcast graphics elements into the final scene: data-driven 3D motion graphics, charts, lower-thirds, tickers, CG and many other elements that are not part of a standard games engine.

InfinitySet.

“With the inclusion of these graphic elements, the scene can result in a highly complex composition, seamlessly integrating different render engines, virtual 3D backgrounds, real characters and synthetic graphics elements in real time, no matter if we are working in HD, 4K or even higher resolutions,” explains Miguel Churruca, Marketing and Communications Director of Brainstorm. “By using Unreal Engine, InfinitySet users also have access to the extensive library of assets Unreal provides to its community.”
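
Brainstorm does not publish the internals of the Combined Render Engine, but the layered composition Churruca describes can be sketched as a back-to-front blend of render layers. A minimal illustration, assuming each layer is a straight (non-premultiplied) RGB image with its own alpha matte:

```python
import numpy as np

def composite_layers(layers):
    """Blend an ordered stack of (rgb, alpha) layers, back to front.

    In a pipeline like the one described above, the bottom layer would
    be the game-engine background, then the keyed talent, then broadcast
    graphics (tickers, lower-thirds) on top. Shapes assumed: rgb is
    (H, W, 3) float, alpha is (H, W, 1) float in 0..1.
    """
    out = np.zeros_like(layers[0][0])
    for rgb, alpha in layers:
        # Standard "over" operator: each new layer covers what is beneath it.
        out = alpha * rgb + (1.0 - alpha) * out
    return out
```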

Augmented Reality allows real footage or live video to be mixed with virtual backgrounds or scenes, chroma-keyed talent and data-driven 3D graphics, with the talent able to interact with the virtual environment in front of the audience.
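
The keying step itself is conceptually simple. Below is a minimal, illustrative chroma-key in Python with NumPy; a production keyer works in a perceptual colour space with soft edges and spill suppression, so treat this as a sketch of the idea rather than any vendor’s implementation:

```python
import numpy as np

def chroma_key(foreground, background, key_rgb=(0.0, 255.0, 0.0), tolerance=80.0):
    """Composite a green-screen foreground over a rendered background.

    Both frames are float arrays of shape (H, W, 3) in the 0..255 range.
    Pixels close to the key colour become transparent; distant ones
    (the talent) stay opaque, with a soft ramp in between.
    """
    key = np.asarray(key_rgb, dtype=float)
    distance = np.linalg.norm(foreground - key, axis=-1, keepdims=True)
    alpha = np.clip(distance / tolerance, 0.0, 1.0)
    return alpha * foreground + (1.0 - alpha) * background
```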

Technology developer and producer The Future Group (TFG) recently claimed the first live broadcast containing real-time ray tracing and real-time facial animation. Ray tracing is an image rendering technique that can produce highly realistic CG lighting effects, but it consumes huge quantities of rendering power, which is why it has until now been confined to film and television postproduction. Using souped-up graphics cards and blending CG with standard video frame rates, TFG animated and live-streamed an augmented reality character being interviewed by a human presenter during Riot Games’ League of Legends regional finals in Shanghai.
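
To see why ray tracing is so expensive, consider its innermost operation: testing a ray against scene geometry. A toy ray/sphere intersection in Python (generic maths, not TFG’s code) runs once per pixel, per bounce, per light sample:

```python
import numpy as np

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t, a quadratic
    with a = 1 because the ray direction is normalised.
    """
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0.0 else None
```

At 1080p that is roughly two million primary rays per frame before a single bounce, which is the workload that RTX-class GPUs accelerate in hardware.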

Brainstorm’s InfinitySet includes “industry-first” technologies such as 3D Presenter, which generates a real 3D object from the talent’s live video feed, casting real shadows across the virtual set. Another feature, HandsTracking, allows presenters to trigger animations and graphics with simple hand movements.
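
Brainstorm does not document how HandsTracking works internally, but the general pattern of gesture-triggered graphics is easy to sketch: map tracked hand positions into named trigger zones in set coordinates and fire a graphics event on contact. Everything below (names, coordinates) is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TriggerZone:
    """An axis-aligned region of the set that fires a named animation."""
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def on_hand_update(x, y, zones, fire):
    """Fire the callback for every zone the tracked hand currently occupies."""
    for zone in zones:
        if zone.contains(x, y):
            fire(zone.name)

# Example: entering the top-right quadrant starts a hypothetical lower-third.
zones = [TriggerZone("lower_third_in", 0.5, 1.0, 0.5, 1.0)]
on_hand_update(0.7, 0.8, zones, fire=print)  # prints "lower_third_in"
```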

Interface for AR and Mixed Reality creative applications.

Its new TrackFree technology, a patented and “revolutionary approach” to virtual set production according to Churruca, is an independent camera-tracking technology that gives operators total freedom to use any tracking system, trackless or fixed cameras, or a combination of these at the same time.
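
TrackFree’s internals are proprietary, but the idea of tracking independence can be sketched as normalising every camera source, tracked, fixed or trackless, into one pose record that the renderer consumes. Field names below are illustrative, though real tracking protocols such as FreeD carry a similar pan/tilt/roll, position and lens payload per frame:

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """One normalised camera state per video frame, whatever its source."""
    pan: float    # degrees
    tilt: float   # degrees
    roll: float   # degrees
    x: float      # metres, studio coordinate system
    y: float
    z: float
    zoom: float   # normalised 0..1
    focus: float  # normalised 0..1

def fixed_camera_pose() -> CameraPose:
    """A fixed (untracked) camera is just a pose that never changes."""
    return CameraPose(pan=0.0, tilt=-2.0, roll=0.0,
                      x=0.0, y=1.6, z=4.0, zoom=0.2, focus=0.5)
```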

The latest version of InfinitySet takes advantage of recent developments in NVIDIA RTX GPU technology to deliver what Brainstorm claims is the best rendering quality available on the market.

“Using these technologies along with other advanced rendering capabilities like the Combined Render Engine with Unreal Engine, PBR materials, HDR I/O and the new 360° output, InfinitySet can create the most realistic content for virtual and mixed reality, virtual sets, real-time postproduction and film pre-visualization that can’t be distinguished from reality.”

InfinitySet also features a new module that enhances presentations by creating AR content from material made in presentation tools such as Microsoft PowerPoint or Adobe PDF, combined with keyed talent.

Most broadcasters now understand that virtual set technology represents a huge cost saving in both broadcast and feature film production, allowing for real-time virtual production with enough quality to be sent on-air right out of the box.

Churruca says, “Our guess is that new technological advances like virtualization, on-demand cloud rendering, enhanced integration with AR and, of course, an even more developed photorealistic quality with Unreal Engine or other game engines will be the key drivers in this area for years to come.”
