Real-Time On-Air Virtual Set Production Takes Off

The use of photorealistic technology is changing the way broadcasters look at virtual sets. It is now possible to create scenes that are indistinguishable from reality, opening up powerful new possibilities for storytelling.

For live, on-air broadcast operation, photorealism is one of the toughest challenges. What makes the difference for viewers is being unable to tell whether the images they are watching are real video or digital renders. Achieving that realism requires background scenes of extremely high render quality, which demands both hyper-realistic rendering technology and carefully designed sets.

Game engines such as Unity or Unreal Engine (the technology behind the popular online video game Fortnite) can render in real time, opening up some fantastic possibilities.
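To make "real time" concrete: a broadcast chain running at 50 frames per second gives the engine roughly 20 ms to render, composite and output every frame, with no option to pause. A minimal sketch of that frame-budget discipline (plain Python, purely illustrative; not any engine's actual loop):

```python
import time

FPS = 50                      # typical European broadcast frame rate
FRAME_BUDGET = 1.0 / FPS      # ~20 ms to render, composite and output a frame

def render_frame(frame_no):
    """Placeholder for the engine's per-frame render and composite work."""
    pass

frame_no = 0
next_deadline = time.perf_counter()
while frame_no < 250:         # run for 5 seconds in this demo
    render_frame(frame_no)
    frame_no += 1
    next_deadline += FRAME_BUDGET
    # Sleep off whatever budget is left. A missed deadline means a dropped
    # or repeated frame on air, which is why render cost matters so much.
    remaining = next_deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)
```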

Dock10, based at MediaCityUK in Salford, for example, has invested in a virtual set system driven by Unreal.

“It gave us the opportunity to get to grips with the technology and to build the team,” explains Andy Waters, Head of Studios. “It’s not just about buying the kit, sticking it in and turning it on. There’s a huge amount of knowledge needed to deploy the technology successfully.”

Waters points to the availability of inexpensive, ready-made graphics assets which can be bought online and downloaded, ready to turn a new production around quickly.

“We did a trial the other day where we could either have built a set of No. 10 Downing Street ourselves from first principles or bought it online for $200.”

Where, until recently, broadcasters would milk a virtual set system for all it was worth but end up tying up one facility for use on one show, the flexibility of games-engine-driven systems now means any space can be turned into a virtual set in a day.

Dock10 plans to turn all of its studios into virtual-set-ready spaces in a bid to attract more entertainment productions alongside existing news, sports and children's programming users.

3D graphics developer Brainstorm has created a unique approach called Combined Render Engine, which pairs Unreal Engine with its own eStudio render engine. This allows its graphics toolset InfinitySet not only to display realistically rendered background scenes but also to integrate broadcast graphics elements into the final scene: data-driven 3D motion graphics, charts, lower-thirds, tickers, CG and many other elements that are not part of a standard games engine.

InfinitySet.

“With the inclusion of these graphic elements, the scene can result in a highly complex composition, seamlessly integrating in real-time different render engines, virtual 3D backgrounds, real characters and synthetic graphics elements, no matter whether we are working in HD, 4K or even higher resolutions,” explains Miguel Churruca, Marketing and Communications Director of Brainstorm. “By using Unreal Engine, InfinitySet users also have access to the extensive library of assets Unreal provides to its community.”
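Conceptually, the final on-air picture is a stack of layers: the game-engine background at the bottom, the keyed talent above it and the broadcast graphics on top. A minimal numpy sketch of that "over" compositing order (illustrative only; this is not Brainstorm's API):

```python
import numpy as np

def over(top_rgb, top_alpha, bottom_rgb):
    """Standard 'over' operator: blend an RGB layer with its alpha matte
    onto an opaque background."""
    a = top_alpha[..., None]          # broadcast alpha across RGB channels
    return top_rgb * a + bottom_rgb * (1.0 - a)

h, w = 1080, 1920                                  # HD raster
background = np.random.rand(h, w, 3)               # stand-in for the engine-rendered set
talent_rgb = np.random.rand(h, w, 3)               # stand-in for the camera feed
talent_key = np.zeros((h, w)); talent_key[300:900, 700:1200] = 1.0   # chroma-key matte
gfx_rgb    = np.zeros((h, w, 3)); gfx_rgb[950:1050, 100:800] = 1.0   # lower-third plate
gfx_alpha  = np.zeros((h, w));    gfx_alpha[950:1050, 100:800] = 0.85

frame = over(talent_rgb, talent_key, background)   # talent over virtual set
frame = over(gfx_rgb, gfx_alpha, frame)            # broadcast graphics on top
```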

Augmented Reality allows real footage or live video to be mixed with virtual backgrounds or scenes, chroma-keyed talent and data-driven 3D graphics. The talent can interact with the virtual environment in full view of the audience.
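A minimal sketch of how a chroma key pulls the talent matte illustrates the principle: measure each pixel's distance from the key colour and map it to alpha (numpy; real broadcast keyers are far more sophisticated, handling spill, soft edges and transparency):

```python
import numpy as np

def chroma_key(frame_rgb, key_color=(0.0, 1.0, 0.0), tolerance=0.4, softness=0.2):
    """Return an alpha matte: 0 where the pixel matches the key colour
    (background), 1 where it clearly differs (talent). Pixels between the
    two thresholds get a soft edge."""
    dist = np.linalg.norm(frame_rgb - np.asarray(key_color), axis=-1)
    alpha = (dist - tolerance) / softness
    return np.clip(alpha, 0.0, 1.0)

frame = np.random.rand(720, 1280, 3)   # stand-in for a camera frame, RGB in 0..1
matte = chroma_key(frame)              # per-pixel talent alpha, same raster size
```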

Technology developer and producer The Future Group (TFG) recently claimed the first live broadcast containing real-time ray tracing and real-time facial animation. Ray tracing is an image rendering technique that can produce highly realistic CG lighting effects, but it consumes huge quantities of rendering power, which is why it has until now been confined to postproduction for film and television. Using souped-up graphics cards and blending CG with standard video frame rates, TFG animated and live-streamed an augmented reality character interviewed by a human presenter during Riot Games' League of Legends regional finals from Shanghai.
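A toy example shows why ray tracing is so hungry for rendering power: every pixel fires at least one ray that must be intersected against the scene, and realistic lighting multiplies that into many rays per pixel. A single-sphere, Lambert-shaded tracer in numpy (purely didactic; nothing like TFG's production pipeline):

```python
import numpy as np

W, H = 320, 180
sphere_c, sphere_r = np.array([0.0, 0.0, -3.0]), 1.0
light_dir = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)

# One primary ray per pixel through a pinhole camera at the origin.
xs = np.linspace(-1.0, 1.0, W)
ys = np.linspace(-H / W, H / W, H)
px, py = np.meshgrid(xs, ys)
dirs = np.stack([px, py, -np.ones_like(px)], axis=-1)
dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

# Ray-sphere intersection: solve t^2 + 2t(d.oc) + |oc|^2 - r^2 = 0 per pixel.
oc = -sphere_c                       # vector from sphere centre to camera
b = 2.0 * (dirs @ oc)
c = oc @ oc - sphere_r ** 2
disc = b ** 2 - 4.0 * c
hit = disc > 0.0
t = np.where(hit, (-b - np.sqrt(np.maximum(disc, 0.0))) / 2.0, 1.0)

# Lambert (diffuse) shading at each hit point; misses get a dim background.
p = t[..., None] * dirs
n = (p - sphere_c) / sphere_r
image = np.where(hit, np.clip(n @ light_dir, 0.0, 1.0), 0.05)
```

Even this trivial scene does an intersection test and a shading calculation for all 57,600 pixels; production-quality images add shadows, reflections and many bounces per ray, which is where the demand for RTX-class hardware comes from.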

Brainstorm’s InfinitySet includes “industry-first” technologies such as 3D Presenter, which generates a real 3D object out of the talent’s live video feed, casting real shadows over the virtual set. Another feature, HandsTracking, allows presenters to trigger animations and graphics with the simple movement of their hands.
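The trigger logic behind a hand-driven graphic can be pictured simply: track a hand position each frame and fire an animation when it enters a hotspot. A toy sketch of that idea (hypothetical; Brainstorm has not published how HandsTracking works internally):

```python
from dataclasses import dataclass

@dataclass
class Hotspot:
    x: float
    y: float
    radius: float
    animation: str

def check_triggers(hand_xy, hotspots, fired):
    """Fire each hotspot's animation once when the tracked hand enters it."""
    hx, hy = hand_xy
    for spot in hotspots:
        inside = (hx - spot.x) ** 2 + (hy - spot.y) ** 2 <= spot.radius ** 2
        if inside and spot.animation not in fired:
            fired.add(spot.animation)
            print(f"trigger: {spot.animation}")    # stand-in for the graphics call
        elif not inside:
            fired.discard(spot.animation)          # re-arm once the hand leaves

hotspots = [Hotspot(0.8, 0.5, 0.1, "show_chart"), Hotspot(0.2, 0.5, 0.1, "next_slide")]
fired = set()
for hand in [(0.5, 0.5), (0.79, 0.52), (0.5, 0.5)]:   # simulated tracking samples
    check_triggers(hand, hotspots, fired)
```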

Interface for AR and Mixed Reality creative applications.

Its new TrackFree technology, a patented and, according to Churruca, “revolutionary” approach to virtual set production, is an independent camera-tracking technology that gives operators total freedom to use any tracking system, trackless or fixed cameras, or a combination of these at the same time.
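One way to picture "any tracking system, trackless or fixed cameras, or a combination" is that every source is normalised to a common camera-pose record before rendering, so the renderer never cares where the pose came from. A hypothetical sketch of such an abstraction (not Brainstorm's actual design):

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class CameraPose:
    position: tuple[float, float, float]   # studio-space coordinates, metres
    pan: float                             # degrees
    tilt: float
    roll: float
    focal_length: float                    # mm, drives the virtual camera's FOV

class TrackingSource(Protocol):
    def pose(self) -> CameraPose: ...

class FixedCamera:
    """A locked-off camera: the pose never changes, no tracking hardware needed."""
    def __init__(self, pose: CameraPose):
        self._pose = pose
    def pose(self) -> CameraPose:
        return self._pose

class MechanicalTracker:
    """Wraps a pedestal or crane encoder feed; here it just replays samples."""
    def __init__(self, samples):
        self._samples = iter(samples)
    def pose(self) -> CameraPose:
        return next(self._samples)

def render_tick(sources: dict[str, TrackingSource], live: str) -> CameraPose:
    """Each frame the renderer only ever sees a CameraPose, so tracked,
    trackless and fixed cameras can be cut between freely."""
    return sources[live].pose()

sources = {
    "cam1": FixedCamera(CameraPose((0, 1.6, 4), 0, -2, 0, 35)),
    "cam2": MechanicalTracker([CameraPose((2, 1.8, 3), -15, -4, 0, 50)]),
}
print(render_tick(sources, "cam2"))
```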

The latest version of InfinitySet takes advantage of recent developments in NVIDIA RTX GPU technology to deliver what Brainstorm claims is the best rendering quality available in the market.

“Using these technologies along with other advanced rendering capabilities, like the Combined Render Engine with Unreal Engine, PBR materials, HDR I/O and the new 360º output, InfinitySet can create the most realistic content for virtual and mixed reality, virtual sets, real-time postproduction and film pre-visualization: content that can’t be distinguished from reality,” says Churruca.

InfinitySet also features a new module that enhances presentations by creating AR content from material produced in presentation tools such as Microsoft PowerPoint or Adobe PDF, combined with keyed talent.

Most broadcasters now understand that virtual set technology offers huge cost savings, in both broadcast and feature film production, allowing real-time virtual production with enough quality to be sent on-air right out of the box.

Churruca says, “Our guess is that new technological advances like virtualization, on-demand cloud rendering, enhanced integration with AR and, of course, an even more developed photorealistic quality with Unreal Engine or other game engines will be the key drivers in this area for years to come.”
