Recent camera technology and the resulting image quality improvements are nothing short of amazing.
Relaxing in the evening, if there is nothing interesting on the internet I turn to television. When there is no new programming that appeals, I may watch reruns of scripted shows, both drama and episodic. Over the last few years the new productions have had such a different look from earlier material that I got to wondering: what are the key technologies that have enabled this change to a more realistic look? I came to the conclusion that they are the CMOS imager and compression algorithms. The former sets the standard of the captured image; the latter allows it to be delivered to the viewer.
The reruns fall into a number of categories. The oldest are shot on film, or with analog cameras and videotape. Moving forward in time, there are standard-definition digital cameras recorded to Digital Betacam. Finally, the cameras reach the HD age.
Although there has been a gradual increase in picture quality, it has only been in the last decade that advances in sensor technology have given DPs cameras that can shoot at pretty much any light level.
What led to a step change in expectations of digital camera image quality was the development of the Red One, which shipped in 2007. ARRI followed with the Alexa in 2010, plus there are many fine cameras from Sony, Panasonic, and Canon.
What this new generation of cameras had was wide dynamic range, good sensitivity, and excellent color science, including good rendition of skin tones. The performance was such that the majority of directors could finally feel the time had come to migrate from film to digital.
Using the full capabilities of these new high performing cameras meant that DPs could develop a more natural, realistic looking lighting style, which can be seen especially in historical drama. Kubrick jumped through hoops to film candle-lit scenes in Barry Lyndon back in 1973, but today it would be easy.
One of the things I notice most about the reruns from the 1980s and ‘90s is the excessive fill lighting, used to lower the contrast range of the scene to match the capabilities of the camera. Powerful key lights often fail to match the natural skylight, a consequence of struggling to get enough light through the lens on dull days. The higher sensitivity and wide dynamic range of modern cameras mean that lighting can be subtler. Light is there to illuminate, not dominate.
Back then, arriving at a realistic gamut of colors was a skill in itself. Contrast that with today, where the director has a choice of ‘looks’, from the realistic to the highly stylized.
These advances in sensors have not just benefitted digital cinematography. The 3-chip studio and field cameras have also seen performance improvements in sensitivity and dynamic range alongside the increase in resolution. The modern system camera with high frame rate and high dynamic range is key to premiere sports programming on the new UHD and HDR channels.
A UHD/4K camera outputs around 12 Gb/s. Clearly that can’t travel far beyond the switcher or edit bay; compression codecs such as H.264 and HEVC allow a reasonable rendition of the images to be delivered to the consumer at practical bit rates. Compression quality is of course still dependent on the bit rate allocated to a channel, but the OTT 4K services have raised the bar for picture quality in the home (setting Blu-ray discs aside).
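That 12 Gb/s figure is easy to sanity-check. A back-of-envelope sketch, assuming a 3840×2160 picture at 60 frames per second with 10-bit, 4:2:2 sampling (two samples per pixel on average: one luma sample plus half-rate chroma):

```python
# Back-of-envelope check of the ~12 Gb/s figure for a UHD camera feed.
# Assumed parameters: 3840x2160, 60 fps, 10-bit, 4:2:2 sampling.
width, height = 3840, 2160
fps = 60
bits_per_sample = 10
samples_per_pixel = 2  # 4:2:2 averages two samples per pixel

active_rate = width * height * samples_per_pixel * bits_per_sample * fps
print(f"Active picture: {active_rate / 1e9:.2f} Gb/s")  # ~9.95 Gb/s
```

The active picture alone comes to roughly 10 Gb/s; the 12G-SDI link rate of 11.88 Gb/s adds blanking intervals and ancillary data on top of that.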
Interlace, a method of reducing data rates that has been in use since the dawn of television nearly a century ago, is proving tenacious. UHD sees its replacement with progressive scan and the end of the attendant artifacts and reduced vertical resolution.
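The saving interlace buys is simple to quantify: each field carries only half the lines of a full frame, so at the same refresh rate the pixel rate is halved relative to progressive scan. A minimal sketch, using 1080-line HD as the example:

```python
# Interlace as data-rate reduction: 1080i/60 sends 60 fields of 540 lines,
# half the pixel rate of 1080p/60, which sends 60 full 1080-line frames.
width, lines, rate = 1920, 1080, 60

progressive_px_per_s = width * lines * rate          # 1080p60
interlaced_px_per_s = width * (lines // 2) * rate    # 1080i60 (fields)

print(interlaced_px_per_s / progressive_px_per_s)  # 0.5
```

That 2:1 saving came at the cost of the twitter, line crawl, and reduced vertical resolution that progressive scan eliminates.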
Now, this is not to belittle the processing between the camera and the distribution codecs, but those developments have largely been a matter of increasing the bit rate: 270 Mb/s, 1.5 Gb/s, 3 Gb/s, and now 12 Gb/s. Non-trivial, but I like to think that the image sensors are the key. With the sensor limiting the system, it is the defining factor in the quality of the video system. The same argument also applies to the display: UHD would not have been possible with the cathode ray tube. It required the development of the LCD and OLED panel to make both HD and UHD possible and viable. And compression, why is that key? Well, how else could you deliver those data rates over affordable pipes?
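The scale of the compression job is worth spelling out. A rough sketch, with assumed figures: an uncompressed UHD feed of around 10 Gb/s (10-bit 4:2:2 at 60 fps) delivered as an HEVC stream at an illustrative 16 Mb/s, in the neighborhood of typical 4K OTT tiers:

```python
# Illustrative end-to-end compression ratio, camera feed to OTT stream.
# Both figures are assumptions for the sketch, not measured values.
uncompressed_bps = 3840 * 2160 * 2 * 10 * 60  # ~9.95 Gb/s, 10-bit 4:2:2
delivered_bps = 16e6                          # assumed 4K OTT rate

ratio = uncompressed_bps / delivered_bps
print(f"Compression ratio: ~{ratio:.0f}:1")   # roughly 600:1
```

A reduction on the order of hundreds-to-one, delivered with pictures good enough to raise the bar in the home, is why compression belongs alongside the sensor as a key technology.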