Virtual Production For Broadcast: Image Based Lighting
Ensuring consistency of lighting between the virtual world and physical objects on set requires controlling production lighting based on image content.
Video walls are not generally sources of high color quality light. So, it’s sometimes necessary to reinforce that light with conventional production lighting, though naturally, that lighting must react to changing conditions in the virtual world. Using a rendered image to control production lighting adopts the video game and CGI technique called Image Based Lighting.
One of the key advantages of virtual production is that the LED wall can be sufficiently powerful to cast light on real-world, foreground objects. Like a lot of virtual production techniques, though, interactive lighting is related to approaches which have been used since the dawn of cinema. Fire might be simulated with a flickering orange light, or the passing street lights of a night drive with conventional production lighting, moving past a car that’s really stationary.
Image Based Lighting
Normally, that kind of lighting trick might be achieved with no more technology than someone hand-holding a light. Even before the availability of virtual production in the current sense, video displays had been used to cast light on a scene, particularly in 2013’s Oblivion, where front-projected backgrounds were allowed to light entire scenes. At around the same time, Gravity used LED video wall panels, but not to create a convincing background image. The panels of the time lacked the necessary resolution, at least at practical screen sizes. Instead, the cast were surrounded by LED panels intended to cast convincing interactive light for spectacular, dizzying space scenes.
LED walls have now improved enough to provide both background images and interactive light. The result recalls image-based lighting which has long been used in video game and visual effects development, where a 360-degree photograph of a location is used to control the fall of light on objects. The LED wall does this by its very nature, but additional production lights can be controlled by the image rendered for a virtual production stage too. When there’s a spectacular purple and amber sunset in the sky, overhead lighting might automatically turn purple and amber to match it, without needing an overhead video wall.
The Compromises Of Virtual Light
One reason we might want to use additional production lighting, as opposed to relying entirely on the LED wall, is color quality. Many people are familiar with color quality metrics such as CRI and TLCI. CRI was designed a long time ago to evaluate home and workplace lighting, and is often not a sensitive enough test for cinematography. Regardless of how we measure, though, projected and LED video displays invariably score very poorly on all metrics of color quality, meaning that real-world subjects - particularly people - may not look as we would expect.
This happens because effectively all LED video walls are built using red, green and blue LED emitters. While they might seem to combine into white light, measuring the result with a color meter reveals that the light only looks white; it still has only three peaks of red, green and blue, as opposed to the continuous spectrum of (say) sunlight. No correction or calibration can solve this problem; it’s the same issue which afflicts very low cost lighting intended for entertainment venues and live events, which may also have solely red, green and blue emitters. As a result, any object which reflects saturated colors other than bright shades of red, green and blue may appear muddy and poorly-illuminated.
Production Lighting
Production lighting may use red, green and blue LEDs, but it will also include other types. Commonly, white emitters use an underlying blue emitter to make a yellow-emitting phosphor glow, creating a more continuous spectrum. Many other configurations are found in quality production lighting. Many production lights have remotely-controllable color behavior, which may allow the color and brightness to be controlled automatically by video images in the same way as image-based lighting in a visual effects environment.
That can work for a single point-source LED fixture which might simulate sunlight, something LED walls often struggle to do: while bright, they can't generally produce a small spot of light intense enough to illuminate the scene as the sun would. Overhead lighting such as spacelights might change color to match a sky. Possibly the most advanced expression of image-based lighting in virtual production, though, involves a technique which has sometimes been called pixel mapping. With pixel mapping, production lighting, sometimes in tube format, is made up of individually controlled pixels an inch or two square.
Assembling an array of tubes creates, in effect, a low resolution video display which can be controlled in much the same way as the LED wall itself. That can create more realistic reflections and more accurately simulate the complex fall of light on the scene, with excellent color quality, and in a way that has never been possible with conventional post production visual effects.
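As a rough illustration of the mapping involved, the sketch below (Python, using OpenCV and NumPy) area-downsamples a rectangular region of the rendered frame into a grid matching a hypothetical array of eight tubes with sixteen cells each; the region coordinates and grid dimensions are placeholders rather than any particular product's patch.

```python
import cv2
import numpy as np

def frame_to_tube_grid(frame, region, tubes=8, cells_per_tube=16):
    """frame: HxWx3 uint8 RGB image; region: (x, y, w, h) of the area to map.
    Returns a (tubes, cells_per_tube, 3) grid of 8-bit RGB values, one
    triplet per individually controlled cell. Grid size is an assumption."""
    x, y, w, h = region
    crop = frame[y:y + h, x:x + w]
    # INTER_AREA averages each block of source pixels into one output cell,
    # which is the behaviour wanted when shrinking the image this far.
    return cv2.resize(crop, (cells_per_tube, tubes), interpolation=cv2.INTER_AREA)

# e.g. drive the tube array from the top quarter of a 1080p surround render
frame = np.zeros((1080, 1920, 3), np.uint8)
grid = frame_to_tube_grid(frame, region=(0, 0, 1920, 270))
```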
Connectivity & Control
Film and television production lighting invariably offers DMX as at least one of its supported control protocols. A simple DMX connection may be enough to control a single light intended to simulate, say, sunlight. Where a large array of pixel-mapped lights is used, the number of control channels needed quickly exceeds what's practical using conventional lighting control systems.
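To see how quickly the numbers grow, recall that a single DMX universe carries 512 channels and each RGB cell needs three of them; the short calculation below assumes a hypothetical rig of twenty 16-cell tubes.

```python
import math

tubes, cells_per_tube = 20, 16                # hypothetical rig
channels = tubes * cells_per_tube * 3         # 960 channels just for the RGB cells
universes = math.ceil(channels / 512)         # a DMX universe carries 512 channels
print(f"{channels} channels -> {universes} universes")   # 960 channels -> 2 universes
```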
Some lighting control consoles are capable of working with video material in order to derive the control signals for the light from the image itself. In other circumstances, particularly where lots of pixel-mapped lighting is in use, a computer-based media server will be responsible for creating DMX control signals based on image content, often sending that DMX data over Ethernet connections rather than over conventional DMX cabling.
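As one concrete illustration of DMX over Ethernet, the sketch below builds and sends an ArtDmx packet, the DMX-carrying packet of the widely used Art-Net protocol, over UDP on port 6454. The channel order (R, G, B per cell) and the target address are assumptions for the example; a real rig's patch will differ, and protocols such as sACN are a common alternative.

```python
import socket
import struct

ARTNET_PORT = 6454  # standard Art-Net UDP port

def artdmx_packet(universe, channels, sequence=0):
    """Build an ArtDmx packet carrying up to 512 DMX channel values (0-255)."""
    data = bytes(channels[:512])
    if len(data) % 2:                       # Art-Net requires an even data length
        data += b'\x00'
    packet = b'Art-Net\x00'                 # protocol ID
    packet += struct.pack('<H', 0x5000)     # OpCode: OpOutput / ArtDmx (little-endian)
    packet += struct.pack('>H', 14)         # protocol version 14 (high byte first)
    packet += bytes([sequence & 0xFF, 0])   # sequence, physical input port
    packet += struct.pack('<H', universe)   # 15-bit port-address (SubUni, Net)
    packet += struct.pack('>H', len(data))  # data length (high byte first)
    return packet + data

def send_rgb_cells(cells, universe=0, target='192.168.1.50'):
    """cells: iterable of (r, g, b) tuples flattened into consecutive channels.
    The channel order and target address are assumptions for this example."""
    channels = [int(v) for rgb in cells for v in rgb]
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(artdmx_packet(universe, channels), (target, ARTNET_PORT))
```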
Details of the process for controlling any light using a video image vary depending on the specific hardware involved. Usually, there will be facilities to control a single light based on a single point in the video image, or to average an area of the image to control one or more lights. Pixel mapping means defining an area of the video to be displayed on the pixel-mapped lighting. That might mean scaling down the video, since the array of pixel-mapped lights will invariably be of much lower effective resolution than the LED wall itself.
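Those two facilities can be sketched very simply, as below; the coordinates are placeholders, and a real controller exposes this as patching rather than code.

```python
import numpy as np

def sample_point(frame, x, y):
    """RGB at one pixel, e.g. keying a single fixture to one point in the image."""
    return tuple(int(v) for v in frame[y, x])

def average_area(frame, x, y, w, h):
    """Mean RGB over a rectangle, e.g. a patch of sky driving a spacelight."""
    block = frame[y:y + h, x:x + w].reshape(-1, 3)
    return tuple(int(round(v)) for v in block.mean(axis=0))

# e.g. average a 200x100 pixel patch of sky from a dummy 1080p frame
sky_rgb = average_area(np.zeros((1080, 1920, 3), np.uint8), 860, 0, 200, 100)
```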
Color & Brightness
For all these techniques to be meaningful, the color of the light must match the color displayed on the LED wall. This is not always easy, since, as we’ve seen, the LED wall and the lighting devices, while both LED-based, generally don’t use the same emitter types. The underlying technology is often very different.
Solving this problem means addressing two issues. The first is colorspace. Most people understand that a color image relies on red, green and blue components. A colorspace defines exactly what shades of red, green and blue we're using, which controls which shades are possible; for instance, no displayed blue can be deeper than the light produced with only the blue emitter on and the red and green off. The colorspaces used by video signals are generally specified in documents such as ITU-R Recommendation BT.709, among others.
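As a hedged illustration of what a colorspace pins down: with BT.709 primaries and a D65 white point, linear RGB relates to the device-independent CIE XYZ space through a fixed matrix, and a fixture built around different emitters needs its own matrix back from XYZ to its drive values. The rounded coefficients below are the standard BT.709 values.

```python
import numpy as np

# Standard (rounded) matrix from linear BT.709 RGB to CIE XYZ with a D65 white.
BT709_RGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def linear_bt709_to_xyz(rgb_linear):
    """rgb_linear: length-3 sequence of linear-light BT.709 RGB values in 0..1."""
    return BT709_RGB_TO_XYZ @ np.asarray(rgb_linear, dtype=float)
```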
The second issue is brightness encoding. Most production lighting is photometrically linear, so that movement of a control slider is (roughly) proportional to the number of photons that come out. To reduce exposure by one stop, we reduce the control signal by 50%. That's not how video gamma encoding works, for reasons connected to noise and the behavior of legacy display technologies.
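A minimal sketch of the conversion follows, using a simple 2.4 power law purely as a stand-in for the signal's actual transfer function; a real pipeline should decode with whatever transfer characteristic the signal was encoded with.

```python
def video_code_to_linear(code, gamma=2.4):
    """Decode an 8-bit video code value (0-255) to linear light (0.0-1.0).
    The pure power law here is only a stand-in for the real transfer function."""
    return (code / 255.0) ** gamma

def linear_to_dmx(linear):
    """Scale linear light (0.0-1.0) to an 8-bit level for a linear dimmer."""
    return round(max(0.0, min(1.0, linear)) * 255)

one_stop_down = linear_to_dmx(video_code_to_linear(255) / 2)   # 128: half the photons
```

Under that decode, a video code value of roughly 191, not 128, produces half of full-scale linear output, which is exactly the mismatch described above.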
Production lights intended to be part of image-based lighting setups may support various brightness encoding standards and colorspaces. Ideally, an LED video wall and the nearby lighting technology will all be capable of using the same standards, meaning that data can be extracted directly from the video and sent over DMX to the lights. In reality, that ideal may not always occur, and there may be some manual adjustment involved.
Image Sources
Image-based lighting must have images from which to derive its control signals. In most situations, two images appear at once on a virtual production display. First, the system establishes exactly where the camera is, where it’s aiming, and how its lens is set up, and it renders a high-resolution image of the virtual environment onto the wall in the appropriate area. Second, the rest of the wall, which will not be seen directly on camera, is filled with a slightly simpler, static approximation of the surrounding area, intended to create appropriate reflections and fall of light on real-world foreground objects.
That static surround will generally be the source of data for image-based lighting, changing as the scene in general changes (and not necessarily reacting to the motion of the taking camera). This approach works well when a light is controlled by part of the image which appears on the LED wall. Some lights, however, might need to be controlled by information that isn't part of the LED wall image, such as a sky where there is no overhead LED wall. In this situation it is sometimes possible to ask the virtual production facility to render a specific video feed, from an upward-facing virtual camera, to create lighting control data.
In Practice
Image-based lighting has the potential to create interactive lighting effects which further enhance the already-impressive realism of virtual production. Particularly, the option to use conventional production lighting devices which are intended to produce light of high color quality can avoid the compromises of LED wall light, ideally creating a best of both worlds that’s both convincing as an effects tool, and ensures foreground subjects always receive flattering, accurate light.