Real-Time On-Air Virtual Set Production Takes Off

The use of photorealistic technology is changing the way broadcasters look at virtual sets. It is now possible to create scenes that are indistinguishable from reality, opening powerful new possibilities for storytelling.

For live, on-air broadcast operation, photorealism is one of the toughest challenges. What really makes the difference for viewers is being unable to tell whether the images they are watching are real video or digital renders. Achieving that requires background scenes of extremely high render quality, which is only possible by combining hyper-realistic rendering technology with carefully designed sets.

Game engines such as Unity or Unreal Engine (the technology behind the popular online videogame Fortnite) can render in real time, opening up some fantastic possibilities.

Dock10, based at MediaCityUK in Salford, for example, has invested in a virtual set system driven by Unreal.

“It gave us the opportunity to get to grips with the technology and to build the team,” explains Andy Waters, Head of Studios. “It’s not just about buying the kit, sticking it in and turning it on. There’s a huge amount of knowledge needed to deploy the technology successfully.”

Waters points to the availability of inexpensive, ready-made graphics assets which can be bought online and downloaded, ready to turn a new production around.

“We did a trial the other day where we could either have built a set of No. 10 Downing Street ourselves from first principles or bought it online for $200.”

Where, until recently, broadcasters would milk a virtual set system for all it was worth but end up tying up one facility for use on a single show, the flexibility of games-engine-driven systems now means any space can be turned into a virtual set in a day.

Dock10 plans to turn all of its studios into virtual-set-ready spaces in a bid to attract more entertainment productions alongside existing news, sports and children's programming users.

3D graphics developer Brainstorm has created a unique approach called the Combined Render Engine, which pairs Unreal Engine with its own eStudio render engine, allowing its graphics toolset InfinitySet not only to show superbly rendered, realistic background scenes but also to integrate broadcast graphics elements into the final scene. Data-driven 3D motion graphics, charts, lower-thirds, tickers, CG and many other such elements are not part of a standard games engine.
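The layered composition described here, a rendered background, keyed talent and graphics overlays stacked back to front, is at heart the classic Porter-Duff "over" operation. A minimal sketch in Python/NumPy (an illustration of the general technique, not Brainstorm's implementation):

```python
import numpy as np

def over(base, layer):
    """Porter-Duff 'over': blend an RGBA layer onto an opaque RGB base (floats in 0..1)."""
    alpha = layer[..., 3:4]                      # per-pixel opacity of the layer
    return alpha * layer[..., :3] + (1 - alpha) * base

def composite(background, *layers):
    """Stack layers back to front: e.g. background render, keyed talent, tickers."""
    out = background
    for layer in layers:
        out = over(out, layer)
    return out
```

Each render engine only needs to deliver its output with an alpha channel; the compositor then merges them in a fixed order, which is why lower-thirds and tickers can sit on top of a scene produced by a completely different engine.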

InfinitySet.

“With the inclusion of these graphic elements, the scene can result in a highly complex composition, seamlessly integrating in real-time different render engines, virtual 3D backgrounds, real characters and synthetic graphics elements - no matter if we are working in HD, 4K or even higher resolutions,” explains Miguel Churruca, Marketing and Communications Director of Brainstorm. “By using Unreal Engine, InfinitySet users also have access to the extensive library of assets Unreal provides to its community.”

Augmented Reality allows real footage or live video to be mixed with virtual backgrounds or scenes, chroma-keyed talent and data-driven 3D graphics. The talent can interact with the environment for the audience.
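The chroma-keying step that lifts the talent off the green screen reduces, in its simplest form, to a per-pixel colour-distance test against the key colour. A minimal sketch (the key colour and tolerance values are illustrative assumptions, not any vendor's algorithm):

```python
import numpy as np

def chroma_key(frame, background, key=(0, 255, 0), tol=80):
    """Composite `frame` over `background`, making pixels near `key` transparent."""
    diff = frame.astype(np.int32) - np.array(key, dtype=np.int32)
    dist = np.sqrt((diff ** 2).sum(axis=-1))           # per-pixel distance from key colour
    alpha = np.clip(dist / tol, 0.0, 1.0)[..., None]   # 0 = pure key colour, 1 = keep pixel
    return (alpha * frame + (1 - alpha) * background).astype(np.uint8)
```

Production keyers are far more sophisticated (spill suppression, edge blending, working in other colour spaces), but the principle of deriving a soft alpha matte from colour similarity is the same.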

Technology developer and producer The Future Group (TFG) recently claimed the first live broadcast containing real-time ray tracing and real-time facial animation. Ray tracing is an image rendering technique that can produce highly realistic CG lighting effects, but it consumes huge quantities of rendering power, which is why until now it has only been used in postproduction for film and television. Using souped-up graphics cards and by blending CG with standard video frame rates, TFG animated and live streamed an augmented reality character interviewed by a human presenter during Riot Games' League of Legends regional finals from Shanghai.
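The cost TFG had to tame comes from the core of the technique: firing a ray through every pixel and testing it against scene geometry, then repeating for shadows and reflections. The basic primitive, ray-sphere intersection, can be sketched as follows (an illustration of the textbook method, not TFG's pipeline):

```python
import numpy as np

def intersect_sphere(origin, direction, center, radius):
    """Return distance t to the nearest hit of a unit-length ray with a sphere, or None."""
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c            # quadratic discriminant; a == 1 for a unit direction
    if disc < 0:
        return None                   # ray misses the sphere entirely
    t = (-b - np.sqrt(disc)) / 2.0    # nearest of the two intersection points
    return t if t > 0 else None       # ignore hits behind the camera
```

At 1080p and 60 fps, primary rays alone mean over 124 million of these tests per second before any bounces, which is why real-time ray tracing had to wait for RTX-class hardware.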

Brainstorm’s InfinitySet includes “industry-first” technologies such as 3D Presenter, which generates a real 3D object out of the talent’s live video feed, casting real shadows over the virtual set. Another feature, HandsTracking, allows presenters to trigger animations and graphics with the simple movement of their hands.

Interface for AR and Mixed Reality creative applications.

Its new TrackFree technology, a patented and “revolutionary approach” to virtual set production according to Churruca, is an independent camera-tracking technology that gives operators total freedom to use any tracking system, trackless or fixed cameras, or a combination of these at the same time.

The latest version of InfinitySet takes advantage of the latest NVIDIA RTX GPU technology to deliver what Brainstorm claims is the best rendering quality available in the market.

“Using these technologies along with other advanced rendering capabilities like the Combined Render Engine with Unreal Engine, PBR materials, HDR I/O and the new 360º output, InfinitySet can create the most realistic content for virtual and mixed reality, virtual sets, real-time postproduction and film pre-visualization that can’t be distinguished from reality.”

InfinitySet also features a new module that enhances presentations by creating AR content from material produced in presentation tools, such as Microsoft PowerPoint or Adobe PDF, and including keyed talent.

Most broadcasters now understand that virtual set technology offers huge cost savings in both broadcast and feature film production, allowing real-time virtual production of sufficient quality to be sent on-air straight out of the box.

Churruca says, “Our guess is that new technological advances like virtualization, on-demand cloud rendering, enhanced integration with AR and, of course, an even more developed photorealistic quality with Unreal Engine or other game engines will be the key drivers in this area for years to come.”
