Virtual Production For Broadcast: Finishing

The goal of in-camera, LED wall-based virtual production is to capture the final image on set, eliminating the need for compositing in post. How much is left to finishing and grading?

Whether virtual production counts as a special effect or a visual effect is an interesting question, since it involves techniques traditional to both fields. Effects which happen in camera, on set, are generally special effects, and in that sense virtual production is a special effect which should, in an ideal world, be ready for cutting directly into a finished production. There should be nothing to do in post production beyond the same sort of grading any camera-original material might need.

In anything other than an ideal world, the post production team might need to take at least some care to ensure virtual production scenes look as real as they should. The most involved post production work will usually be reserved for fixing problems which should, perhaps, have been detected and worked around on set. As with so much of film and TV production, there’s a fine line between problems which can reasonably be left to post and more complicated issues which are better corrected on the day.

Matching Real-World Brightness

A perfect LED video wall would be bright enough that its whitest highlights drove the camera to its own maximum white, so that highlights in the wall image appeared clean and properly rendered. Modern cameras, however, often have such high dynamic range that this may not be possible. Many background images displayed on the LED wall will still contain at least some small highlights, and these must match the highlights in the real-world scene; otherwise the background risks looking dull, with clipped, greyish highlights, and identifiably not a real scene.

Often, this problem will disappear when a fairly normal grade is applied to the image. Grading for a standard dynamic range finish involves fitting the dynamic range of the camera, often at least fourteen stops, into the dynamic range of a standard dynamic range display, often less than ten. This inevitably means extremes of brightness and shadow are reduced, often bringing the peak whites of the real and virtual scenes close enough together that the problem vanishes without much special attention.
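To illustrate the idea, the sketch below applies a simple soft-knee roll-off to scene-linear values, of the broad kind a grade or display transform might use. The knee position, the curve itself and the example values are purely illustrative, not the behaviour of any particular grading tool.

```python
# Illustrative soft-knee highlight compression. Values below the knee
# pass through unchanged; values above it are rolled off towards the
# display maximum, so very different scene-linear highlights end up at
# similar display levels.

def soft_knee(x, knee=0.8, display_max=1.0):
    if x <= knee:
        return x
    headroom = display_max - knee
    # Asymptotically approach display_max as x increases.
    return knee + headroom * (1.0 - 1.0 / (1.0 + (x - knee) / headroom))

# A modest LED wall highlight and a much brighter real-world highlight
# (illustrative scene-linear values) both land close to display maximum.
print(round(soft_knee(4.0), 3), round(soft_knee(12.0), 3))   # 0.988 0.996
```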

Sometimes, especially when grading for a high dynamic range delivery, it might be necessary to adjust the brightness of the virtual scene, and particularly its highlights, to match the real-world scene. The best approach will depend heavily on what’s in frame, although it may still be handled in grading, perhaps using shape isolation to control just that part of the frame which represents the relevant area of the video wall. In particularly difficult cases, where LED wall highlights interact with the foreground scene in complex ways, more advanced visual effects techniques might be required. With proper technique, problems that severe, and that complex, ought to be rare in practice.

In marginal situations, virtual background images, whether rendered or photographed, can be prepared without clipped highlights, avoiding or minimizing the problem, though the LED wall must still be bright enough for reasonable exposure. In general, an appropriate monitoring LUT which compresses highlights in a manner closely approximating the final grade can give some confidence that the results will be as intended.
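As a very rough sketch of what such a monitoring LUT might contain, the snippet below writes a 1D LUT in the plain-text .cube format with a highlight roll-off of the kind described above. A real monitoring LUT would be built from the camera’s log encoding and the intended final look; the curve, table size and filename here are illustrative only.

```python
# Write an illustrative 1D monitoring LUT (.cube text format) that
# passes shadows and midtones through and rolls off highlights, making
# clipped or mismatched wall highlights easier to judge on a monitor.

def write_preview_lut(path="preview.cube", size=1024, knee=0.8):
    with open(path, "w") as f:
        f.write('TITLE "highlight roll-off preview"\n')
        f.write(f"LUT_1D_SIZE {size}\n")
        for i in range(size):
            x = i / (size - 1)
            if x <= knee:
                y = x
            else:
                # Soft roll-off towards 1.0 above the knee.
                y = knee + (1.0 - knee) * (x - knee) / ((1.0 - knee) + (x - knee))
            f.write(f"{y:.6f} {y:.6f} {y:.6f}\n")

write_preview_lut()
```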

...And Real-World Shadow

Maintaining realistic black levels is a problem which has existed since the heyday of back projection. White projection screens inevitably picked up ambient light from the scene, raising the black level of the projected image to a point which could affect the success of the shot. Virtual production is much less prone to the same problem because the video wall is mostly black. Even so, at least part of the wall surface is made up of reflective plastic LED emitters, so it’s still possible to suffer black level problems if light is allowed to spill onto the screen.

The best mitigation is therefore to avoid light spilling onto the screen, which is why gaffers working with virtual production will ensure there are plenty of black flags on hand. Some setups make this more practical than others. A heavily backlit sunset might make it easy to keep light off the screen; a day exterior under directionless overcast, perhaps created by an array of overhead soft lighting, might be much harder to control.

Beyond just flagging light off the screen as much as possible, anticipating and avoiding black level concerns might involve similar techniques to those used to preview white level issues. It shouldn’t be necessary to deliberately crush shadow detail in order to fix a black level problem, but an appropriate monitoring LUT which applies a look reasonably close to the intended final grade will make issues easier to spot, and then to avoid. Normal grading or (in rare, complex cases) visual effects techniques might be used to deepen shadows in the same way they can be used to brighten highlights.

Extending The Backdrop

Many kinds of virtual production will involve tracking the camera, and recording that data may make it easier to add conventional effects in post production, a hybrid of in-camera and post-production effects work. This works well where, for instance, a wide-angle lens, adventurous camera move or extreme angle might reveal the edge of the LED wall. Where a fully three-dimensional virtual environment is in use, it might even be possible for the virtual production facility to render and record a full-frame image, accurately aligned to the camera perspective, at the same time as the live action material is recorded. That image is then available for straightforward compositing later.
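In principle the later composite can be very simple: wherever the frame extends past the physical wall, the synchronized render shows through, keyed by a matte. The sketch below, assuming NumPy arrays and an illustrative matte, shows the basic operation; in practice the matte would come from the tracked wall geometry and from rotoscoped foreground elements.

```python
# Minimal backdrop-extension composite: keep the live-action plate where
# the matte is 1.0 and reveal the recorded render where it is 0.0.
import numpy as np

def extend_backdrop(plate, render, wall_matte):
    # plate, render: float32 HxWx3 images; wall_matte: float32 HxW in [0, 1].
    m = wall_matte[..., None]              # broadcast matte across channels
    return plate * m + render * (1.0 - m)

# Illustrative use with placeholder frames.
h, w = 1080, 1920
plate = np.zeros((h, w, 3), dtype=np.float32)      # live-action stand-in
render = np.ones((h, w, 3), dtype=np.float32)      # synchronized render stand-in
matte = np.zeros((h, w), dtype=np.float32)
matte[:, :1200] = 1.0                              # wall covers left of frame
extended = extend_backdrop(plate, render, matte)
```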

There are downsides. The composited parts of the frame were not present in reality, as the LED wall is, so they don’t contribute interactive lighting to the scene. Similarly, they haven’t been re-photographed by the taking camera, so any real-world lens effects will be missing. Where foreground elements obscure the background, a solution (often manual rotoscoping) must be found to composite them against the newly added background. The effortless handling of difficult transparent or reflective subjects which virtual production normally offers is therefore lost in those areas.

Still, this approach can make it much easier to show larger scenes than the LED wall itself can accommodate, potentially making significant savings on scenes which would otherwise demand a much larger facility for just a few shots. The visual effects work needed to create a really convincing result is often well worthwhile.

Avoiding Problems

As with any special effect, the best place to avoid the need for post production fixes is on set, in camera. Virtual production can, if not properly handled, create some classes of problem which might require a more complicated visual effects intervention in post. One trivial example is a less-than-ideal setup which places a subject too close to the LED wall, so that the wall ends up too sharply focussed and reveals the pixelated nature of the backdrop.

Fixing that might demand the addition of more blur in post. More serious problems might involve poor camera or lens tracking, improper characterization of lens distortion, or geometry errors in the screen setup. Some of these are fundamental to the physical arrangement of the virtual production facility itself. Others will require collaboration between the facility and the camera team, but all of them are detectable given careful observation on a large, high-quality monitor.
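For the simpler pixel-structure case, the fix is often no more than a blur confined to the wall area. The sketch below, assuming NumPy, SciPy and an illustrative matte and blur radius, shows the general idea.

```python
# Matte-limited blur: soften only the region of frame covered by the LED
# wall to hide visible pixel structure, leaving the foreground untouched.
import numpy as np
from scipy.ndimage import gaussian_filter

def soften_wall(frame, wall_matte, sigma=3.0):
    # frame: float32 HxWx3; wall_matte: float32 HxW in [0, 1].
    blurred = gaussian_filter(frame, sigma=(sigma, sigma, 0))  # no blur across channels
    m = wall_matte[..., None]
    return frame * (1.0 - m) + blurred * m
```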

That’s particularly true when a cinematographer is evaluating the match between the LED wall and any other lighting involved. The limited color quality of most LED walls can create a situation where the match as viewed on a monitor looks very different to the match which exists in reality. Some cinematographers have preferred to isolate themselves from viewing the stage by eye, using black flags or retreating to the DIT’s tent, to concentrate solely on how the scene looks to the camera. Mismatched lighting, tracking or geometry issues may be difficult to fix later.

The Big Idea

Well-shot virtual productions should demand little or no more post work than any conventionally shot scene. That is, in the end, the whole purpose of using an in-camera effect. It requires moving much of the creative and technical effort from post production to pre production, which inevitably affects workloads and scheduling. Even so, the potential to walk away from a shooting day with finished shots featuring advanced effects is enormously valuable to a production team concerned that conventional visual effects budgets and timescales might slip.

At the time of writing, virtual production has been in widespread use for two to three years, meaning it should have moved out of the earliest, least reliable phases of deployment. Even now, though, virtual production setups represent a stack of technologies which will always demand collaboration between different disciplines. Get that right, and post production demands drop to near zero, while the director gains unprecedented freedom to shoot complex shots which might challenge a conventional visual effects pipeline. In the end, that’s the promise of virtual production, and as techniques and technologies mature, it should become easier to fulfil.
