HDR: Part 13 - Cameras And The Zero-Sum Game

For a long time, selecting camera gear has been an exercise in compromise. For twenty years, digital cinema cameras have never quite had everything we wanted, and the choice often boiled down to comparing those compromises. That’ll always be true to a degree, but for the last year or two it’s felt like we’re arriving somewhere. We can’t have everything, but we can have more than enough, and the remaining compromises are boiling down to a zero-sum game.

What does that mean? Well, make a bigger-chip camera for lower noise and we end up using longer lenses to achieve the same field of view. Longer lenses magnify things more, so out of focus areas look more out of focus, which is where we get the idea that longer lenses reduce depth of field. So, we might have to stop down to get to the same depth of field, to make accurate focus achievable. Stop down, though, and we’ve darkened the picture, so we might have to select a higher sensitivity. Higher sensitivity on a digital camera just means gain, which increases noise, compromising exactly the things we were trying to achieve with a bigger sensor to begin with.
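
To put rough numbers on that chain of compromises, here’s a minimal sketch of the equivalence arithmetic. The sensor widths are nominal figures, and the starting focal length, stop and ISO are assumptions for illustration rather than any particular camera’s spec:

```python
# Rough sketch of the equivalence arithmetic between Super 35 and full
# frame. Sensor widths are nominal; real cameras vary slightly.
import math

S35_WIDTH = 24.9                     # mm, nominal Super 35 active width
FF_WIDTH = 36.0                      # mm, nominal full-frame width
crop = FF_WIDTH / S35_WIDTH          # ~1.45

# An assumed starting point: a 35mm lens at f/2.8, ISO 800 on Super 35.
s35_focal, s35_stop, s35_iso = 35.0, 2.8, 800

# The same field of view on full frame needs a longer lens...
ff_focal = s35_focal * crop          # ~50.6mm

# ...and matching depth of field means stopping down by the crop factor:
ff_stop = s35_stop * crop            # ~f/4.1

# Stopping down darkens the image; compensating with gain scales ISO
# by the square of the crop factor (one stop per doubling of light):
ff_iso = s35_iso * crop ** 2         # ~1670
stops_of_gain = 2 * math.log2(crop)  # ~1.07 stops

print(f"Full frame equivalent: {ff_focal:.1f}mm at f/{ff_stop:.1f}, "
      f"ISO {ff_iso:.0f} ({stops_of_gain:.2f} stops of extra gain)")
```

That final figure is the vicious circle in one number: roughly a stop of extra gain just to get back to where we started.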

There are other solutions. We can simply add more light, though that’s an expensive approach that might be a big ask of a production that’s already paying for a big-chip camera. And it’s not even particularly effective; double the amount of light and you’ve bought the focus puller a stop, which is welcome, but not enough to offset the difference in depth of field between a super-35 and full-frame sensor in otherwise equivalent circumstances. Quadruple the amount of light and – well – that’s two stops. That’s great, but it’s well beyond the ability of most productions to do.
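
To see why doubling the light falls just short, reuse the nominal crop factor from the sketch above: matching Super 35 depth of field at the same exposure costs the square of the crop factor in illumination.

```python
import math

crop = 36.0 / 24.9        # nominal full frame vs Super 35, as above

# Stopping down by the crop factor costs crop^2 in light, so buying
# the focus puller the same depth of field with lamps alone needs:
light_needed = crop ** 2  # ~2.09x the illumination
stops = math.log2(light_needed)

print(f"{light_needed:.2f}x the light (~{stops:.2f} stops) to match "
      f"Super 35 depth of field at the same exposure")
```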

Optical Quality

The vicious circle continues in other ways. Instead of improving noise, we might want a bigger sensor to improve resolution without having to accept a noise penalty. The problem is, to achieve, say, f/2, a lens must (at the very least) have a diameter that’s equal to half its focal length. But the focal length is longer, for an equivalent field of view, on our larger sensor. The lens must therefore be larger – which is intuitive enough – to achieve the same f stop at the same field of view and with the same quality. That bigger lens will be considerably more expensive. If it isn’t sufficiently expensive, the optical quality might begin to suffer, compromising the improved resolution we wanted to begin with.
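
The geometry behind that is simply the definition of the f-number: aperture diameter is focal length divided by stop. A quick sketch, again using nominal sensor widths:

```python
# The f-number is focal length divided by aperture diameter, so holding
# f/2 at an equivalent field of view forces the glass to grow.
def aperture_diameter(focal_mm: float, f_number: float) -> float:
    return focal_mm / f_number

crop = 36.0 / 24.9                        # nominal full frame vs Super 35

s35 = aperture_diameter(50.0, 2.0)        # 25.0mm pupil on Super 35
ff = aperture_diameter(50.0 * crop, 2.0)  # ~36.1mm pupil on full frame

print(f"Super 35: {s35:.1f}mm, full frame: {ff:.1f}mm "
      f"({ff / s35:.2f}x the diameter, {(ff / s35) ** 2:.1f}x the area)")
```

Glass area, and loosely cost, grows with the square of that diameter, which is why the bigger lens gets expensive so quickly.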

That’s a zero-sum game: a combination of practicalities that leaves us with a certain maximum level of image quality. No matter which compromises we choose, we’re trading one thing off against another.

Larger Sensors

Going any further down the road of bigger sensors probably isn’t an idea with much of a future. More light and better glass help, but costs can only escalate so far. There certainly doesn’t seem to have been too much of a push for even larger sensors in mainstream cinematography. There have been digital medium-format camera backs – generally not covering the full medium-format frame, but still very large – which will shoot video, and perhaps it’s only a matter of time before Imax steps in with a sensor the size of a 15-perf 65mm negative. The purpose of Imax is not really restraint or moderation, after all. Still, the yet-bigger-chips approach seems likely to stay in its specialist niche.

Is there a better solution? Sure, though it applies to every imaging sensor ever made. All you have to do is find a way to make each square micron of sensor area more sensitive to light, without compromising anything else.

Improving Sensitivity

That is what sensor manufacturers’ R&D departments spend their days trying to do. One key figure of merit is “quantum efficiency.” Ideally, a photosite on a sensor would capture every photon that struck it and convert that photon into an electron. Real-world designs are not quite that perfect. Equally, modern sensors have built-in hardware which converts the number of electrons captured, which is fundamentally an analog signal, into a digital one. Doing that creates noise we’d prefer wasn’t there. The reality is that a competitive sensor in 2020 can record light levels up to a few tens of thousands of photons per frame, per photosite, with a handful of electrons in read noise.
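
Those figures (quantum efficiency, full-well capacity and read noise) combine into a simple noise model. The numbers below are illustrative assumptions in the ballpark described above, not any real sensor’s spec sheet:

```python
import math

# Illustrative figures, roughly in the ballpark of a competitive 2020
# sensor; no particular manufacturer's numbers are implied.
QE = 0.6            # fraction of photons converted to electrons
FULL_WELL = 30000   # electrons a photosite can hold before clipping
READ_NOISE = 3.0    # electrons of noise added by readout and conversion

def snr_db(photons: float) -> float:
    signal = min(photons * QE, FULL_WELL)       # electrons captured
    shot_noise = math.sqrt(signal)              # photon shot noise
    total_noise = math.sqrt(shot_noise ** 2 + READ_NOISE ** 2)
    return 20 * math.log10(signal / total_noise)

# Dynamic range: full well against the read-noise floor.
dr_stops = math.log2(FULL_WELL / READ_NOISE)    # ~13.3 stops

print(f"Highlight SNR: {snr_db(40000):.1f} dB, "
      f"deep-shadow SNR: {snr_db(50):.1f} dB, "
      f"dynamic range: {dr_stops:.1f} stops")
```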

Naturally, we’d like more, which is where scale helps. The simplest way to improve things is to make the photosite bigger so more photons are likely to hit it and more will fit in it, which demands a bigger sensor for the same resolution. We’ve done that, though. Now we have to find a way to achieve higher sensitivity without just scaling things up; to break that zero-sum game.
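
The scaling is quadratic: photons captured grow with photosite area, so with the pitch squared, and in shot-noise-limited conditions SNR grows with the square root of the photon count. A short sketch of that relationship:

```python
import math

def snr_gain_db(pitch_ratio: float) -> float:
    # Photons captured scale with area (pitch squared); in the
    # shot-noise limit, SNR scales with the square root of that,
    # i.e. linearly with pitch.
    return 20 * math.log10(pitch_ratio)

# Doubling the pitch quadruples the light per photosite: ~6 dB.
print(f"2x pitch: {snr_gain_db(2.0):.1f} dB better SNR")
```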

Various approaches to doing that have been tried. One is to maximize the amount of the sensor that’s covered in photosites, minimizing the extra circuitry that’s around each one. This is why many sensors have a rolling shutter, as the extra electronics for global shuttering take up more space. We can also make more room for photosite area by separating the photosites from the electronics and stacking them in layers. People have even put tiny arrays of lenses (microlenses) on the front of sensors to focus light on the active areas, though that can cause problems with lenses that fire light at the sensor at anything other than a right angle. By far the most popular approach, for all sorts of reasons, is to reduce the saturation of the filters which allow the sensor to see in color, compromising color performance for sensitivity.
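
That last trade-off can be put into numbers. Weaker color filters let the channels overlap more, so recovering fully saturated color needs a more aggressive correction matrix, and its larger coefficients amplify noise. A minimal sketch with made-up matrices, assuming uncorrelated per-channel noise:

```python
import math

# Hypothetical 3x3 color-correction matrices (rows: output R, G, B).
# Strongly saturated filters need only mild correction...
mild = [[ 1.1, -0.05, -0.05],
        [-0.05,  1.1, -0.05],
        [-0.05, -0.05,  1.1]]
# ...weak, sensitivity-friendly filters need a much stronger matrix.
aggressive = [[ 1.9, -0.6, -0.3],
              [-0.5,  2.0, -0.5],
              [-0.3, -0.6,  1.9]]

def noise_gain(matrix) -> float:
    # For uncorrelated channel noise, each output row amplifies noise
    # by the root-sum-square of its coefficients; report the worst row.
    return max(math.sqrt(sum(c * c for c in row)) for row in matrix)

print(f"mild: {noise_gain(mild):.2f}x, "
      f"aggressive: {noise_gain(aggressive):.2f}x noise amplification")
```

The sensitivity gained at the filters is partly paid back as chroma noise once the matrix is applied: the zero-sum game again.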

Zero-Sum Choice

The best of those ideas are the fundamental advances, the products of deep R&D, and they can move the whole game forward. They come very slowly, though. Yes, we can pay more money for more performance, tolerating a lower yield and more rejects in sensor manufacturing for a design that really pushes the envelope, but as with anything, the last 5% of the performance costs the last 50% of the money. Real progress in sensor design comes much less frequently than the camera market needs it to, so we trade off size, resolution and color performance – and when we’re selecting a camera for a job, we dance around the zero-sum game.
