Virtual Production For Broadcast: Designing The Virtual World

Some of the key tools of virtual production are well established in the world of computer entertainment, but the design constraints can be very different, demanding photorealism over smaller areas, as well as staging and layout that suits the proposed scene.

The best-known virtual productions put people in enormous, spectacular environments full of imaginative production design. The least famous ones might recreate more everyday situations, albeit for equally important reasons. The work of creating those environments might employ artists in computer suites as opposed to construction workers in hard hats, but the virtual art department must still undertake the same concept, design and construction effort it always has.

Building The World

The specific usage details of various pieces of software are too large a subject to cover here, although much of what we discuss will apply to Epic Games’ Unreal Engine, which is commonly used for virtual production, as well as to other real-time rendering engines such as Unity. Most of it also applies to visual effects software, which is more usually used for offline rendering.

Creating an environment designed to be viewed through the virtual window of an LED wall, with both smooth performance and photographic realism, is a fairly new discipline. Design for virtual production will use some skills common to visual effects and game development, though the technical and creative considerations for a virtual art department are not the same as either. Several of those skills need more coverage than we can fit in this brief overview, and we’ll cover them in detail later.

Scoping The Project

To the relief of producers everywhere, not every virtual production needs an elaborate three-dimensional environment built from scratch. The right approach will depend on exactly what the production needs to shoot, and that’s a decision best made in consultation with senior creatives, experienced virtual production people, and the facilities that might be involved.

Using a live-action background plate shot on location can look convincing from a limited range of angles. Unlike a full 3D environment, a plate limits where the taking camera can go, but it can be considerably less work to create. Since some LED walls can be tens of thousands of pixels across, shooting and compositing adequate plates may not be trivial; the process is discussed in more detail separately.
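To get a feel for the scale involved, the horizontal pixel count of a wall follows directly from its physical width and its LED pixel pitch. The wall widths and pitches below are illustrative assumptions only, not the specifications of any real facility:

```python
def wall_pixels(width_m: float, pitch_mm: float) -> int:
    """Horizontal pixel count of an LED wall of a given width and pixel pitch."""
    return int(width_m * 1000 / pitch_mm)

# A modest 10 m wall at a 2.5 mm pitch:
print(wall_pixels(10, 2.5))   # 4000

# A large curved volume - say 50 m of wall at a fine 1.5 mm pitch -
# is already more than 30,000 pixels across.
print(wall_pixels(50, 1.5))   # 33333
```

Any plate intended to fill such a wall at full resolution has to be captured and finished at a comparable pixel count, which is why plate photography for large volumes is rarely trivial.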

What’s perhaps more common than either a fully three-dimensional world or a two-dimensional background plate is a hybrid approach, where some live action material is used as part of a possibly simplified three-dimensional world. Sometimes that’s referred to as “two-and-a-half D,” combining some of the time and work savings of a background plate with at least some of the freedom of movement of a fully three-dimensional world.

The Construction Process

Many virtual productions will need at least some new 3D assets, whether that’s a complete world and props to go in it, or simpler shapes to give a rough form to a live action plate. While there are crucial creative differences between the demands of virtual production environment creation and either VFX or games, many of the software tools are the same, ultimately building objects from meshes of triangular polygons and wrapping those meshes in images to define colour and texture.
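To make the structure concrete, here is a minimal sketch of how such an asset is commonly organised: a mesh of triangles defined by vertex positions and indices, plus image-based material inputs mapped onto it via texture coordinates. The field names are illustrative, not the real API of Unreal Engine or any other tool:

```python
from dataclasses import dataclass

@dataclass
class Mesh:
    vertices: list    # (x, y, z) positions in 3D space
    uvs: list         # (u, v) texture coordinates, one per vertex
    triangles: list   # index triples into the vertex list

@dataclass
class Material:
    base_colour_map: str   # path to a colour texture image
    normal_map: str = ""   # optional fine-surface-detail texture

# A single flat quad built from two triangles, roughly as an
# engine might store it before a material is wrapped around it.
quad = Mesh(
    vertices=[(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
    uvs=[(0, 0), (1, 0), (1, 1), (0, 1)],
    triangles=[(0, 1, 2), (0, 2, 3)],
)

print(len(quad.triangles))  # 2
```

Real assets differ mainly in scale: a hero object might carry hundreds of thousands of triangles and several texture maps, but the underlying shape of the data is the same.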

There are many ways to model the shape - the geometry - of 3D assets and to assign materials to them. Many of them will be familiar to anyone with a little experience of modern visual effects. Geometry might be based on manual assembly of simple shapes, on laser scanning of real environments, or on photogrammetry, where a series of photographs is interpreted by a computer. Materials are often based on one or more images which might be painted manually, photographed from reality, or a combination of the two. Scanning real objects can be a quick and easy approach, though there are some caveats.

However an asset is created, the virtual production stage must reliably render its images at the frame rate of the taking camera, and the performance of the rendering servers has limits. So, assets must be designed for high performance, taking care that exactly the right amount of detail is included. Even though most virtual production facilities use the very best available hardware, there are still limits on the number of polygons and the resolution of textures in any scene.
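The frame-rate requirement translates into a hard time budget per frame. The arithmetic below is a rough illustration of that constraint; the frame rates are common camera settings, but nothing here describes the actual budget of any particular rendering server:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render each frame at a given frame rate."""
    return 1000.0 / fps

# A 24 fps taking camera leaves roughly 41.7 ms to render,
# synchronise and display every single frame.
print(round(frame_budget_ms(24), 1))  # 41.7

# A 25 fps broadcast camera leaves exactly 40 ms.
print(round(frame_budget_ms(25), 1))  # 40.0
```

Every polygon drawn and every texel sampled must fit inside that window, every frame, with no opportunity to catch up later - which is why asset detail has to be budgeted rather than maximised.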

Material Versus Geometry

Realistic objects are a combination of two things: realistic geometry with realistic materials. Both must have enough detail to look convincing to the taking camera, given the way they are used in the virtual environment - where they are, how large they are in frame, how sharply focussed they are, how they’re lit, and other factors.

Very often, the geometry of an object can be fairly rough if its materials are good enough. Texturing techniques such as bump mapping and normal mapping can simulate fine surface detail on an object. Bump mapping, for instance, uses a black-and-white image to indicate small variations in the surface of an object, with white areas appearing raised. This can work well enough to reduce the need for fine geometry.
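To make the bump-mapping idea concrete, here is a minimal sketch of how a renderer might derive a perturbed surface normal from a grayscale height image using finite differences. The tiny hand-written height map and the strength factor are invented for illustration; real engines do this per-pixel on the GPU:

```python
import math

# A tiny grayscale bump map: 0.0 = black (low), 1.0 = white (raised).
# This one is a simple left-to-right ramp.
height = [
    [0.0, 0.5, 1.0],
    [0.0, 0.5, 1.0],
    [0.0, 0.5, 1.0],
]

def bump_normal(x, y, strength=1.0):
    """Surface normal at (x, y), tilted by the local height gradient.
    A flat, unperturbed surface has normal (0, 0, 1)."""
    dx = (height[y][x + 1] - height[y][x - 1]) * 0.5 * strength
    dy = (height[y + 1][x] - height[y - 1][x]) * 0.5 * strength
    # The normal leans away from the direction of increasing height.
    nx, ny, nz = -dx, -dy, 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

# On the ramp the normal tilts towards the dark (low) side,
# so the flat polygon shades as if its surface were sloped.
n = bump_normal(1, 1)
print(round(n[2], 3))  # 0.894
```

Because only the shading changes and the geometry stays flat, the effect breaks down at silhouettes and grazing angles - one reason bump and normal maps reduce, rather than eliminate, the need for fine geometry.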

Recent developments in real time ray tracing, which was previously available mainly to non-real time rendering systems, have made more accurate materials and lighting possible, particularly where a material might need to reflect its environment.

Off-The-Shelf Assets

There are already libraries of environments and objects available very affordably for game development and visual effects. Geometry and materials may be packaged in a variety of different formats, often intended for use with different software. Some of those formats can be converted to others, although sometimes some manual intervention will be needed from experienced people.

Until now, game development has not routinely been able to target photorealism. Certain recent titles have been able to achieve it under some circumstances, and things change quickly, but many assets built for games won’t have enough fidelity or even the right art style to work in virtual production. Equally, assets built for visual effects work may look good, but have too much detail to be rendered in real time. Again, qualified people may be able to work on an asset to make it more usable, although the practicality of that depends on the specifics of the asset and the intended use.

Lighting

Experience in visual effects makes it clear that lighting is key to making things look both realistic and appropriate to the production. Directors of photography are likely to become involved in planning and constructing virtual environments, both because layout will naturally affect blocking and framing, and because the lighting of the virtual world must be compatible with the live action scene.

For much of the history of video game technology, the lighting tools available to designers were limited, with restrictions on the type and character of light available and the way that light interacts with objects. For years, objects in virtual worlds didn’t even cast shadows, or when they did, the shadow might not look entirely as it should. Soft light sources - even something as simple as an overcast sky - could often only be approximated as directionless ambient light. Sometimes that sort of lighting might be pre-calculated, or baked, a process which takes time and restricts the way that particular light source interacts with objects in the world.

Current systems are much more sophisticated, and many of the tools of real-world cinematography can be simulated. There may still be some restrictions on specific techniques, particularly around large, well-simulated soft lights and the total number of them, but in general there are equivalents for most of the types of lighting cinematographers might need.

Virtual Scouting & The Virtual Backlot

Even before a virtual environment has been finished, rough or approximate versions of the layout might be available for viewing. The pedigree of virtual production in the world of video games often makes it possible to explore that newly-created environment just like a game, perhaps with a virtual reality headset or on the LED wall itself, in a process that’s sometimes called virtual location scouting. Often, that can take place using a personal laptop or workstation, or using video streamed from a remote location.

Once a virtual production environment has been finalised, its assets can potentially be stored for later use - whether the same production might need to return to a virtual location, or if the same assets might be useful for something else. That might even extend to scanning physical sets for later use in virtual production. Just as there are already commercial libraries of virtual assets, and just as there are already physical backlots and prop stores, anyone owning a particular set of virtual production assets might choose to create a virtual backlot, potentially making more sophisticated environments available to a wider range of productions.

Best Use Of The Virtual Art Department

Productions which might have enthusiastically adopted virtual production are sometimes concerned about the workload of creating a virtual environment. It should be clear that not every show will need a complete, from-scratch, three-dimensional environment, and that there are many valid approaches to virtual production that can keep the technology neatly out of the way of the creative process. Several of the subjects touched on here affect both the art and the science of virtual production, and we’ll explore them in depth in upcoming chapters.
