HDR: Part 30 - Creative Technology - A Thousand Ways To Shoot

For most of its history, film and TV work has, by any sane measure, been incredibly complicated. Photochemical film was a nightmare of precision engineering and process control. Digital alternatives, intended to make things cheaper and simpler, involve some of our highest-performance electronics.

While going digital has certainly made things cheaper, it’s an open question whether on-set workflows are really any simpler – or whether perhaps they could and should be.

There may not be many people reading this with a lot of time on film, back when we didn’t need an extra technician to deal with the complexities of camera, recording and monitoring tech (some would say we needed first and second assistants, but going digital obviated neither of those roles). Still, there are plenty of people working behind the camera in 2021 for whom 35mm negative is a foreign concept. Those people might consider it crazy to describe the world of film as simpler, given its potential for undetectable problems to ruin a day’s work in ways that wouldn’t show up until dailies.

Still, at some level it was more straightforward. Selecting a film stock, choosing lenses and filters, and maybe picking one processing option was about as complicated as it got. Beyond that, it was about what we pointed the camera at. Monitoring was approximate because it could only be approximate. Camera setup more or less involved loading stock and selecting a frame rate. To be scrupulously fair, the work of a digital imaging technician, or a close equivalent, still had to be done, but it was done at the lab, by the film manufacturer, or in telecine. Those are much less expensive, much more relaxed places to make complicated technical decisions than a film set.

Keeping it Simple
Simple as film could be, though, many people, your narrator included, argued during the move to digital that digital cinematography ought to be simpler still. Certainly, that was the widespread expectation: accurate monitoring, no awkward fumbling in a changing bag, immediate confidence replay, and vastly reduced costs. At the time, when only a few companies were involved, those hopes seemed reasonable. There weren’t yet a thousand ways to encode color and brightness, a thousand ways to compress images, and a thousand ways to monitor them.

In the mid-2000s, nobody was actively pushing the idea that digital cinematography would move a large chunk of the complexity of color control and processing onto the set. Nobody expected that it would require a permanent new addition to the camera department, and certainly nobody expected that much of the technical work would then be done again in post. As evidence of that, consider that early DITs were counted as part of the editorial department – not that this stopped camera assistants telling them what to do.

Early in the process, it seemed that recording uncompressed DPX still image sequences would become standard. The excellence of uncompressed bitmaps seemed appropriate for a technology that was attempting to replace film, and the simplicity of a list of pixel values beckoned toward a future in which compatibility would be straightforward – everything had supported such files since the days of Kodak’s Cineon system in the early 90s.

Vendor Specific Image Processing
Quickly, though, cameras like Thomson’s Viper FilmStream began to introduce company-specific log-style encodings. At the time, that was a completely reasonable thing to do – there were no standards to follow – but in hindsight, it was the beginning of our collective journey to a point where every camera, monitor and piece of post-production software might demand several separate points of configuration.

Often, the result of selecting the wrong lookup table looks obviously and horribly wrong, but there are also many opportunities for subtle mistakes which manifest in ways that aren’t always obvious. The wrong brightness encoding. The wrong color encoding. Very slightly the wrong resolution. Too much compression. Too little compression, so we run out of places to store footage by lunchtime. These are insidious, tricky problems, not because the concepts beneath them are complex or because any one of them is difficult to control, but because there are so many of them and there is often no easy way to know something is wrong until later, when fixes are expensive.
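To make that failure mode concrete, here’s a minimal sketch using two invented log curves – neither is any manufacturer’s real formula – encoding with one and decoding with the other, the way a mismatched LUT would:

```python
import math

# Two illustrative log-style encodings. Neither is a real camera curve;
# both are invented for demonstration, but they have the broad shape of
# the genuine article: linear scene light in, roughly logarithmic code
# values out.
def encode_log_a(linear):
    return (math.log2(linear + 0.01) + 7.0) / 8.0

def decode_log_a(code):
    return 2.0 ** (code * 8.0 - 7.0) - 0.01

def decode_log_b(code):
    # A slightly different curve, standing in for "the wrong LUT".
    return 2.0 ** (code * 8.2 - 7.2) - 0.008

# Encode with curve A, then decode with curve B by mistake.
for stops in (-4, -2, 0, 2, 4):
    linear = 0.18 * (2.0 ** stops)   # mid-grey offset by N stops
    code = encode_log_a(linear)
    wrong = decode_log_b(code)
    error_stops = math.log2(wrong / linear)
    print(f"{stops:+d} stops: decoded {error_stops:+.2f} stops off")
```

Every value comes back within about a tenth of a stop of the truth – wrong, but not obviously wrong, which is precisely the problem.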

It’s manageable, of course. Essentially every modern production copes with all this – a circumstance enjoyed by digital imaging technicians, whose mortgages are currently being paid by the fact that, necessary or not, the complexity has to be dealt with. As to whether it’s actually technically essential… well. Clearly some of it is not. While modern digital cinematography probably still doesn’t quite have the tolerance for overexposure that film had, which makes certain creative decisions slightly more critical, most of the best-regarded films of the twentieth century were shot using either optical viewfinders or video tap images which were, compared to the monitoring output of a modern camera, of shockingly poor quality.

Questions Over Monitoring
The idea that it’s only possible to do good work with precise monitoring is therefore dubious in the extreme, and that’s if we accept the idea that precision monitoring is even possible in the variable conditions of a film shoot. Wandering into the DIT’s blacked-out tent from the searing brightness of a Los Angeles summer noon will not leave anyone’s visual system in a state suitable for critical picture evaluation. Regardless of the equipment in play or the skill and experience of the people, we almost never have access to the sort of precision that many people would like to believe we have.

Monitoring is a fairly trivial concern, though, in comparison to the setup of cameras and post-production software, which represents by far the lion’s share of the complexity. It’s an intricate and difficult sort of complexity, concerned with the precise way in which a camera records light and color. The die was cast, as we’ve seen, with the Viper, where the manufacturer made decisions about how best to record the images based on in-depth engineering knowledge of the camera. There was nothing invalid about that, although it might be said to have inspired a new sort of secret sauce.

The idea that every camera design, every sensor technology, every color processing algorithm and every recording system has unique and special capabilities that require a proprietary approach is attractive to a manufacturer keen to emphasize the specialness of its product. In a market which even now looks back to film as a benchmark of color and exposure behavior, cameras are widely promoted on their ability to capture scenes in a way that’s vaguely described – very vaguely – as “cinematic.” Exactly what relationship the word “cinematic” has with, say, the precise hue of the color primaries used to record an image is rarely discussed, but a per-camera approach still sounds enticingly likely to offer us something extra.

Subjective Measurement and Evaluation
But is all this complexity actually worth it, in terms of visibly better results? Strictly speaking, it’s impossible to tell. The precise relationship between the photons that go in the front of a camera and the signals that come out of the back is known only to the manufacturer, subject to exactly the sort of engineering decisions that companies prefer to keep under wraps. Factors such as dynamic range, noise and sensitivity are all related by a subjective judgment call about how much noise is acceptable. Colorimetry is a compromise between deep, saturated primary filters on the sensor, which give us an accurate view of the world, and paler, less saturated ones, which increase sensitivity. It’s complex. It’s just not clear how different camera color processing and recording systems really need to be in order to produce excellent results.

Even given all that complexity, some cameras behave sufficiently similarly that mistakenly using one camera’s LUT for a different camera works reasonably well – some might say dangerously well, in that we might not notice the error. That suggests that at least some cameras could share a single, unified approach to color and brightness encoding without sacrificing anything anyone would notice. From a technical standpoint, we know that many camera manufacturers buy sensors built on very different technologies from a variety of third parties, yet have the entire camera range output to the same color spaces; if different underlying technologies really required different color and brightness encodings, that ought not to be possible.

The situation with compression codecs seems even less limited by technology. Often, approaches to in-camera compression are dictated as much by licensing and patent concerns as they are by engineering. Many column inches have recently been devoted to the dubiousness of the modern patent system and its tendency to protect ideas that are widely viewed as trivial or lacking originality, and reform here could make things a lot easier.

Do We Need DITs?
So, is all of the complexity that keeps DITs in a job (and your narrator speaks as an ex-DIT) actually, formally, technically necessary? Subject to all the caveats we’ve discussed, no, it probably isn’t.

What we’re talking about here is standardization, to the point where many – most – cameras could output files in standard formats, using standard codecs, applying standard color and brightness encodings to both those files and their monitoring outputs. Let’s not forget that this actually was the case for decades, even before the ITU’s Recommendations 601 and 709 were formally specified. Television was television, even if it was all based on rules of thumb derived from the performance of the earliest experimental setups. All cameras were designed to look good on the same displays. It’s another hint that something more universal is at least possible.
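That shared standard was, at its heart, a single transfer function. As a sketch, here is the Rec. 709 opto-electronic transfer function – the published curve that HD inherited essentially unchanged from the standard-definition era:

```python
def rec709_oetf(l: float) -> float:
    """Rec. 709 opto-electronic transfer function.

    Maps scene-linear light (0.0-1.0) to a non-linear signal value,
    per ITU-R BT.709.
    """
    if l < 0.018:
        return 4.500 * l              # linear segment near black
    return 1.099 * l ** 0.45 - 0.099  # power-law segment

# 18% grey encodes to roughly 0.41 of full scale.
print(round(rec709_oetf(0.18), 3))
```

The linear segment near black exists because a pure power law has infinite slope at zero, which would amplify noise in the darkest parts of the picture.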

ACES Standardization
Is there any realistic likelihood that this will happen? Not really, no, for all the reasons of corporate interest that we’ve discussed. About the closest the world ever came to it was – and is – ACES, the Academy Color Encoding System, which is designed to bring all cameras into a common working space for much greater ease of use in post-production.

The hypersensitivity of camera manufacturers to the idea that ACES might make every camera look the same speaks volumes about the scale of corporate intransigence that standardization would have to overcome. ACES doesn’t do that – it’s not intended to do that – and manufacturers remain free to implement clever proprietary approaches to processing the sensor data. ACES defines a common way to encode a picture; how a camera derives that picture from a real-world scene remains the manufacturer’s business.

While it’s a step in the right direction, not all cameras offer ACES modes. Some which do may also have proprietary modes, often described (with varying degrees of accuracy) as raw recording, which provide extra features in supported post-production software but which are somewhat orthogonal to ACES. Also, ACES concerns itself with color and brightness, not with the file formats and codecs cameras record. Still, in the end, it’s about the closest we’ve come to the nirvana of effortless, universal compatibility in this field.
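In practice, getting a camera’s footage into ACES means applying an input transform (an IDT): undo the camera’s log curve to recover linear light, then apply a 3×3 matrix from the camera’s primaries to the ACES AP0 primaries. The sketch below shows the shape of that operation, with an invented curve and matrix standing in for a manufacturer’s real values:

```python
# Sketch of an ACES input transform (IDT): camera log -> linear -> AP0.
# camera_decode and CAMERA_TO_AP0 are hypothetical placeholders; a real
# IDT uses the manufacturer's published curve and matrix.

def camera_decode(code: float) -> float:
    # Placeholder log-to-linear curve, not any real camera's formula.
    return 2.0 ** (code * 10.0 - 6.0)

# Placeholder 3x3 matrix from this hypothetical camera's primaries to
# ACES2065-1 (AP0) primaries. Each row sums to 1 so neutrals stay neutral.
CAMERA_TO_AP0 = [
    [0.75, 0.15, 0.10],
    [0.05, 0.85, 0.10],
    [0.02, 0.08, 0.90],
]

def to_aces(rgb_code):
    linear = [camera_decode(c) for c in rgb_code]
    return [sum(m * v for m, v in zip(row, linear)) for row in CAMERA_TO_AP0]

print(to_aces([0.42, 0.42, 0.42]))  # a neutral input stays neutral
```

Everything camera-specific lives in those two placeholders, which is exactly the division of labor ACES intends: the manufacturer keeps its secret sauce, and post-production gets one common working space.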

Learning from Television
So that’s where we are. In the last decade or so, we’ve seen a proliferation of encoding options not only in acquisition but also in distribution. Rec. 709, specifying high-definition TV, was really just an incremental update of Rec. 601, which governed standard-definition work. Adding at least three major varieties of HDR has further complicated things on the output side. About the best we can reasonably hope is that this period represents the same sort of chaotic genesis that existed in the early twentieth century, when a huge variety of film formats, frame rates and aspect ratios vied for acceptance. Perhaps modern practice will similarly settle down to just a handful of options, in the same way that we ended up with 35mm in flat or scope, two varieties of 16mm, and a few esoteric large formats that were very rarely seen.
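Those varieties of HDR differ, at root, in their transfer functions. As an example, here’s a sketch of the PQ curve (SMPTE ST 2084, used by HDR10 and Dolby Vision), which maps absolute luminance up to 10,000 nits into signal values:

```python
# PQ (SMPTE ST 2084) inverse EOTF: absolute luminance in nits -> signal.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    y = min(max(nits / 10000.0, 0.0), 1.0)  # normalize to 10,000 nit peak
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# 100 nits lands near 0.51 of signal range; 1,000 nits near 0.75.
print(round(pq_encode(100), 3), round(pq_encode(1000), 3))
```

HLG, by contrast, takes a relative, backwards-compatible approach – which is precisely why the two coexist rather than converging.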

That sort of consensus will take some time to emerge. In the meantime, we’ll just have to deal with the chaos. Take heart, though; if this is what it takes to have 4K HDR on demand in everyone’s lounge, it’s hard to object in the long term.
