Virtual Production For Broadcast: Shooting Locations For Virtual Production

Sending out a crew to capture a real-world environment can be a more straightforward option than creating a virtual world, but there are some quite specific considerations affecting how the material is shot and prepared for use.

The best-known applications of virtual production have generally been those using real-time renders of a huge, detailed three-dimensional scene. Allowing the camera to explore that world with the convenience of an in-camera effect is a large part of the attraction of virtual production.

At the same time, some productions don’t need a computer-generated world to show what’s required. A driving montage might need to depict the same street used for the exteriors cut into the same sequence, and sometimes virtual production is used to avoid travelling to locations which are entirely real – just far away or otherwise inconvenient.

Fundamentals Of Plate Photography

The idea of using live-action footage as the background for a shot dates back to the earliest days of cinema, with in-car shots of the 1940s notorious for their wobbly backgrounds. Modern stabilisation techniques make it easier to escape the wobble, but some of the same considerations still apply. Lens effects including flare, distortion, softness and geometry errors should ideally be absent from a background plate. No image is ever entirely free of those things, but in a perfect world the plate should be as clean and sharp as possible, with creative lens and filter choices left to the taking camera in the final setup.

That’s complicated by the fact that virtual production stages are often configured to cover a very wide field of view. As such, shooting live-action environments for them will often demand more than recording a simple, single-camera background plate. One of the existing 360-degree camera systems could be used, although the high resolution and contrast requirements mean that only the very best are likely to be suitable. More often, plates for virtual production are shot in much the same way as plates for high-end visual effects, using an array of high-end cinema cameras configured so their images can later be combined – stitched – to create a single image.

Cameras And Lenses

The requirements for angular coverage and resolution are a geometry problem, depending on the resolution of the LED wall and the choice of camera position and lens setup. Some of the world’s most impressive virtual production stages have very high resolution – often in the tens of thousands of pixels in the horizontal direction. That’s only likely to increase as LED video wall panels themselves improve, mainly because tighter LED spacing reduces the likelihood that individual emitters become visible.
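As a rough illustration of that geometry problem, the sketch below compares the pixel density an LED wall can display with the pixel density a plate camera supplies. The wall width, its angular coverage and the plate camera’s resolution and field of view are entirely hypothetical numbers chosen for the example.

```python
# Hypothetical numbers: a 20,000-pixel-wide LED volume wrapping 180 degrees
# around the taking camera, and a plate camera recording 8192 pixels across
# a 60-degree horizontal field of view.
wall_pixels = 20_000
wall_coverage_deg = 180.0
plate_pixels = 8192
plate_fov_deg = 60.0

wall_density = wall_pixels / wall_coverage_deg    # pixels per degree the wall can display
plate_density = plate_pixels / plate_fov_deg      # pixels per degree the plate supplies

print(f"Wall needs  {wall_density:.0f} px/deg")   # ~111 px/deg
print(f"Plate gives {plate_density:.0f} px/deg")  # ~137 px/deg - enough, but not by much
```

With these particular numbers a single camera just covers its share of the wall; a denser wall, a wider wrap or a closer camera position quickly pushes the requirement towards multiple cameras.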

So background plate shoots are likely to choose high-resolution cameras, potentially highlighting issues which we might not notice in the course of normal shooting. Among those considerations is diffraction, which can limit resolution in counter-intuitive ways. A 12K camera with a Super-35mm sensor is diffraction limited beyond about f/4, meaning that any narrower aperture may start to introduce softness. With wider apertures, meanwhile, depth of field limits might require the cinematographer to choose carefully where to set focus so that the plate integrates properly with the real-world foreground when the material is later used as a backdrop.
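That diffraction point can be checked with a back-of-envelope Airy disk estimate. The sketch below assumes a hypothetical 12K sensor roughly 27mm wide and green light at 550nm; real sensors and their optical low-pass filters will shift the numbers somewhat.

```python
# Rough diffraction estimate for a hypothetical 12K Super-35 sensor:
# 27 mm wide, 12288 photosites across, 550 nm (green) light assumed.
sensor_width_mm = 27.0
h_pixels = 12288
wavelength_um = 0.55

pixel_pitch_um = sensor_width_mm * 1000 / h_pixels  # ~2.2 um per photosite

for f_number in (2.8, 4, 5.6, 8):
    airy_um = 2.44 * wavelength_um * f_number       # Airy disk diameter
    print(f"f/{f_number}: Airy disk {airy_um:.1f} um "
          f"({airy_um / pixel_pitch_um:.1f} pixels)")

# Around f/4 the Airy disk already spans more than two photosites, so
# stopping down further trades depth of field for softness.
```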

It’s also common to choose wide-angle lenses to photograph background plates, maximising the coverage per camera, or at least providing as much as the proposed setup requires. Generally, rectilinear lenses will be chosen, avoiding the distortion of a fisheye, although all lenses have at least some distortion which may need to be corrected in software later (see Stitching, below). Wider-angle lenses may also be more likely to suffer corner softness and other geometric errors. High-performance modern lenses minimise those problems, and are often used for plate photography even when the final shot will use a beloved and characterful vintage lens.
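For planning purposes, the horizontal coverage of a rectilinear lens follows directly from the capture width and focal length. The sketch below uses a hypothetical Super-35 capture width of 24.9mm and a handful of illustrative focal lengths.

```python
import math

# Horizontal field of view of a rectilinear lens: fov = 2 * atan(w / (2 * f)).
# Hypothetical Super-35 capture width and a few candidate focal lengths.
sensor_width_mm = 24.9

for focal_mm in (14, 18, 25, 35):
    fov_deg = math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))
    print(f"{focal_mm} mm lens: {fov_deg:.0f} degrees horizontal")

# 14 mm: ~83 deg, 18 mm: ~70 deg, 25 mm: ~53 deg, 35 mm: ~39 deg
```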

For similar reasons, it’s normal to record plate photography in a high-quality format. How critical this is depends on the proposed final setup – a shot which will be seen out of focus will demand less than one in which we want to be able to recognise at least some detail. Some LED walls are set up to render colour according to the ITU’s Recommendation ITU-R BT.709, commonly “Rec. 709.” Recordings from high-end cinema cameras invariably contain more colour and brightness information than typical Rec. 709 displays can handle, so conventional camera log formats will usually capture enough information for good background plate results. Grading, as necessary, is likely to be part of the stitching process which combines multiple recordings into a single image. Background plates will usually be processed for a straightforward, naturalistic look, since the finished scene will be graded again once it’s been shot on the virtual production stage.
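As a rough illustration of why log capture comfortably exceeds what a Rec. 709 display needs, the sketch below compares the Rec. 709 transfer function, which clips everything at or above nominal white, with a purely illustrative toy log curve – not any manufacturer’s actual format – that keeps several stops of highlight detail within the recorded range.

```python
import math

def rec709_oetf(L):
    """ITU-R BT.709 camera transfer: scene values above 1.0 simply clip."""
    L = min(max(L, 0.0), 1.0)
    return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

def toy_log(L, stops=6.0, mid_grey=0.18):
    """Purely illustrative log curve (not a real manufacturer format):
    maps mid grey to ~0.4 and keeps several stops of highlight in 0..1."""
    if L <= 0:
        return 0.0
    return 0.4 + math.log2(L / mid_grey) / (2 * stops)

for scene in (0.18, 1.0, 2.0, 4.0):  # mid grey, nominal white, 1 and 2 stops over
    print(scene, round(rec709_oetf(scene), 3), round(toy_log(scene), 3))

# The Rec. 709 output is identical (clipped) for everything at or above 1.0,
# while the log encoding still distinguishes the brighter scene values.
```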

Rigging Multiple Cameras

Where more than one camera is used to shoot a plate, proper rigging is essential to fix the cameras in their relative positions. Good rigging makes it easier to combine the multiple resulting files into a single image. That will always require a certain amount of post-production adjustment, and some degree of overlap between cameras so that the seam can be made invisible. The amount of overlap needs to accommodate changes to the image shape caused by lens distortion correction; too small an overlap can leave gaps or visible seams which are difficult to repair later.
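A back-of-envelope calculation can suggest how many cameras a given rig might need once overlap is budgeted for. The figures below – total coverage, per-camera field of view and overlap per seam – are hypothetical and would be chosen to suit the stage and the setup.

```python
import math

# Hypothetical numbers: 180 degrees of total coverage, 83 degrees per camera
# (e.g. a 14 mm lens on Super 35) and 15 degrees of overlap at each seam.
total_deg = 180.0
per_camera_deg = 83.0
overlap_deg = 15.0

step = per_camera_deg - overlap_deg  # useful new coverage added by each extra camera
cameras = 1 + math.ceil((total_deg - per_camera_deg) / step)
print(cameras)  # 3 cameras for these numbers
```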

A number of mechanical configurations are possible. Generally, the aim is to place the cameras as close together as possible to minimise errors due to parallax, which are especially visible when shooting nearby objects. The effects of parallax reduce as the distance between the camera and the subject increases. For applications such as aerial photography, the subject may often be so distant compared to the spacing between cameras that parallax errors are negligible.
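The scale of the problem is easy to estimate. The sketch below uses a hypothetical spacing between lens entrance pupils and a hypothetical plate pixel density to show how quickly the misregistration at a seam falls away with subject distance.

```python
import math

# Rough parallax error at the seam between two cameras, with hypothetical
# numbers: entrance pupils 0.3 m apart, plate resolution ~137 px/degree.
baseline_m = 0.3
px_per_degree = 137.0

for distance_m in (10, 50, 200, 1000):
    shift_deg = math.degrees(math.atan(baseline_m / distance_m))
    print(f"{distance_m:>5} m: ~{shift_deg * px_per_degree:.1f} px of misregistration")

# Nearby objects land many pixels apart in the two views; at long range
# (aerial plates, distant landscapes) the error shrinks towards nothing.
```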

Depending on the physical configuration of the cameras and the available space, toe-in or toe-out configurations might work best. A toe-out setup has the cameras pointing outward, whereas toe-in places them aiming inward at a notional point in front of the lens such that their fields of view cross over. Toe-in setups can put the cameras as close together as is allowed by the front diameter of the lens, which is often a comparatively narrow part of the camera.

It may sometimes be easier to mount some cameras upside down with respect to others, and in principle the orientation can be corrected when the images are combined. With rolling shutter cameras (which include several popular high-end cinema cameras) there may be subtle changes to image geometry when the camera or subject is in motion, and that change depends on the scan direction of the rolling shutter. Similar problems exist when shooting stereoscopic 3D, where one camera may be upside down with respect to the other. As such, wherever possible, the safest choice is to keep all the cameras in the same physical orientation.

One of the key benefits modern technology brings to plate photography is stabilisation, and rigging cameras for plates will usually mean creating an arrangement which can be mounted on an active stabilisation device. That might mean a straightforward handheld gimbal for the simplest jobs, all the way up to a vehicle-mounted stabilised remote head for more advanced work. However the camera or cameras are rigged, the limits of the stabilisation system will control how much weight can be carried and how large the overall camera setup can be.

Stitching

The mechanical rigging of a multi-camera system will never be pixel-perfect, and some post-production effort will be required to assemble the multiple files into a single image. Stitching large plates can be a big job for post-production workstations, often involving long sequences of high-resolution images recorded in demanding, high-quality formats. The process isn’t enormously complicated compared to advanced visual effects, but it is not trivial, and last-minute changes may be difficult. Background plates for visual effects are often only finessed in regions which will be seen in the final shot, whereas virtual production backgrounds must be perfect throughout, since it is not known in advance which parts will eventually be seen. Images are easier and faster to stitch if they’re well shot, and some post-production processes may be able to make use of lens grids, which are charts shot to allow analysis (even automatic analysis) and correction of a lens’s distortion.
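As a minimal sketch of what grid-based correction can look like, the example below uses OpenCV with a chessboard-style chart standing in for a lens grid; the file names and pattern size are hypothetical, and a real plate pipeline may well use dedicated or proprietary tools instead.

```python
import cv2
import numpy as np

# Hypothetical 9x6 chart standing in for a lens grid.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

# Detect the chart in a frame shot through the plate lens.
grid = cv2.imread("lens_grid_frame.png")
gray = cv2.cvtColor(grid, cv2.COLOR_BGR2GRAY)
found, corners = cv2.findChessboardCorners(gray, pattern)
if not found:
    raise RuntimeError("grid not detected")

# Solve for the camera matrix and distortion coefficients from the chart...
_, mtx, dist, _, _ = cv2.calibrateCamera(
    [objp], [corners], gray.shape[::-1], None, None)

# ...then apply the same correction to each plate frame before stitching.
plate = cv2.imread("plate_frame_0001.png")
corrected = cv2.undistort(plate, mtx, dist)
cv2.imwrite("plate_frame_0001_undistorted.png", corrected)
```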

Playback

Playing back such large and demanding sequences can be a challenge. Many virtual production facilities will have done so before and will be able to advise on how to format and supply the background plate for easy playback. That might involve splitting the image up to be played back by several servers in parallel, or meeting other prerequisites. Because of varying playback and display requirements, even where a production can source its background plates from a library, there might be some work involved in reformatting and preparing that plate for the job in question.

Shooting with a single camera removes the need for stitching, and high resolution cameras with wide angle lenses have been developed with exactly this in mind. Considerations such as distortion, flare and reformatting remain, though, and the single sensor has limited performance in comparison to the much larger total sensor area of a multi-camera array.

Live-action background plates are not a panacea and shooting them well is not a trivial task. Still, they’ll often reduce the workload massively compared to creating a fully three-dimensional virtual scene, even to the extent of making virtual production practical where it might otherwise not have been.
