Over the century or so that we’ve been making moving images, a lot of improvements have been dreamed up. Some of them, like stereo 3D and high frame rate, have repeatedly suffered a lukewarm reception. Others, like HD, and even sound and color, enjoyed more or less universal acclaim.
This article was first published in 2019. It and the entire 'HDR Series' have been immensely popular, so we are re-publishing it for those who missed it first time around.
Really good high dynamic range images belong very much in the latter group: as much as HD was better than standard definition, HDR is almost universally liked. Still, understanding exactly how HDR works, how it’s shot, graded and shown to an audience, requires a bit more than simply realising it’s pretty.
The instinctive reaction is that it’s brighter. To put a number on it, display brightness is measured in candela per square metre, which is for some reason abbreviated “nits.” Standard television displays should, according to the ITU’s Recommendations BT.709 and BT.1886, aim to display whites at a little more than a hundred nits and cinema screens around half that. Most consumer TVs are brighter, up to 250 nits or so, because the standards assume a viewing environment darker than most people’s lounges.
Even the poorest HDR displays achieve at least 600 nits. That’s enough to make conventional displays look faulty by comparison, though some commentators consider anything under 1000 nits to be underpowered, and 4000-nit displays exist. To put that in perspective, 1000 nits is ten times brighter than a standards-compliant monitor, though in camera terms it’s only a little over three stops more, and perhaps two stops brighter than common consumer TVs.
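The stop arithmetic is easy to check: a photographic stop is a doubling of light, so the gap between two display brightnesses in stops is simply the base-2 logarithm of their ratio. A minimal sketch, using the nit figures from the discussion above:

```python
import math

def stops_between(nits_a, nits_b):
    """Difference in photographic stops between two luminance levels."""
    return math.log2(nits_b / nits_a)

# A 1000-nit HDR peak versus a 100-nit standards-compliant SDR display:
print(round(stops_between(100, 1000), 2))  # 3.32 stops

# Versus a typical ~250-nit consumer TV:
print(round(stops_between(250, 1000), 2))  # 2.0 stops
```

Ten times the light, in other words, is only three and a bit stops, which is why the numbers sound less dramatic in camera terms than in nits.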
More Than Just Brightness
HDR is not just brighter, though; it’s also darker. Even diagrams distributed by manufacturers have glossed over the fact that as white gets brighter, black should also get darker, or at least get no brighter. The inky black shadows of a good HDR picture probably contribute at least as much to the overall impression as the high-power peaks. The clue, after all, is in the name: it’s not a high brightness picture. It’s a high dynamic range picture, and it won’t be one if the blacks are actually greys.
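The same logarithmic arithmetic shows why black level matters as much as peak brightness. In this sketch the black levels are hypothetical figures, chosen only to contrast a greyish LCD-style black with a near-perfect OLED-style black at the same 1000-nit peak:

```python
import math

def dynamic_range_stops(peak_nits, black_nits):
    """Total displayed dynamic range, in stops, from black to peak white."""
    return math.log2(peak_nits / black_nits)

# Hypothetical black levels, same 1000-nit peak:
print(round(dynamic_range_stops(1000, 0.05), 1))    # greyish black: ~14.3 stops
print(round(dynamic_range_stops(1000, 0.0005), 1))  # inky black:    ~20.9 stops
```

Darkening the black a hundredfold buys over six extra stops of range without adding a single nit at the top end.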
As an aside, HDR almost invariably also means wide color gamut. Most of the standards used for HDR pictures are based on color primaries according to ITU Recommendation BT.2020, and so have access to redder reds, greener greens and bluer blues if they’re required. This particularly helps with greenish-blue colors such as the deep turquoise of tropical water, which is often rather desaturated in conventional images because the primary green is too yellowish.
Diagram 1: relative dynamic range of the human visual system, high dynamic range, and standard dynamic range.
Those, then, are the general goals. As to the specifics, in mid-2019 HDR is in the middle of a format war. There are three specifications in the fight, all of which define the way in which a signal level on a wire relates to the amount of light coming out of a display. For what it’s worth, that was never particularly well defined even for conventional pictures until Rec. BT.1886 in 2011 (seriously: 2011). Before that, the world relied on the general behavior of CRT monitors as a de facto standard.
More specifically, the three most popular HDR standards are HLG, HDR10 and Dolby Vision.
HLG is an initialism for hybrid log-gamma, an approach developed by the BBC and NHK to be compatible with both conventional and HDR displays. It would take pages to describe the fine detail of how these standards encode brightness, but HLG takes a conventional approach to the dimmest parts of the image, while using a logarithmic encoding, similar to that used by modern cameras, for the brighter parts. This is a compromise, but keeps things simple: one signal looks reasonable on both types of display.
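The split between the conventional and logarithmic portions of HLG can be sketched from the curve published in ITU-R BT.2100. This is a simplified scalar version of the encoding side only, ignoring the system gamma applied at the display:

```python
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """Map scene-linear light e in [0, 1] to an HLG signal value in [0, 1].

    Below 1/12 the curve is a square root, much like a conventional gamma
    curve; above it, a logarithmic segment handles the highlights.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

# The crossover at e = 1/12 lands at signal level 0.5, so the lower half
# of the signal range behaves like conventional television:
print(hlg_oetf(1 / 12))  # 0.5
```

That crossover is the backwards-compatibility trick: an SDR display interpreting the bottom half of the signal conventionally still shows a reasonable picture.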
HDR10 uses perceptual quantization (PQ), a representation of brightness designed to match the behaviour of the human eye. It’s arguably more capable, but sacrifices backward compatibility with old displays. The derivative HDR10+ works the same way, but includes metadata about the intended brightness level. The intention here is to allow “a wider range of displays” to create a “reasonable” image based on the same signal. Finally, Dolby Vision appears to do much the same thing: it includes metadata to allow displays of different characteristics to produce the best possible image.
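For comparison, the PQ curve maps absolute luminance up to 10,000 nits onto the signal range, using the constants from SMPTE ST 2084 / ITU-R BT.2100. A minimal sketch of the encoding direction:

```python
# PQ (SMPTE ST 2084) constants, as specified in ITU-R BT.2100.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits):
    """Map absolute luminance (0-10000 nits) to a PQ signal value in [0, 1]."""
    y = nits / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# The curve spends roughly half the signal range below 100 nits, mirroring
# the eye's greater sensitivity to changes at low luminance:
print(round(pq_encode(100), 2))  # ≈ 0.51
print(pq_encode(10000))          # 1.0 at the 10,000-nit ceiling
```

Because PQ encodes absolute nits rather than relative scene light, a legacy display has no sensible way to interpret it, which is exactly the backward-compatibility sacrifice described above.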
To indulge in a moment of cynicism, if it sounds as if HDR10+ is an attempt to mimic Dolby’s work at lower cost, that might be a reasonable assumption. Another reasonable assumption is that the variable processing in HDR10+ and Dolby Vision is intended to help hide the sins of, well, more economical TVs. That’s probably also true, but it’s perhaps more understandable, because display technology is currently the greatest concern in HDR.
There is currently only one display technology which is capable of displaying HDR material with near-ideal characteristics: the giant LED video walls used for outdoor advertising. Switch the LED off, and the display is quite literally black. Switch it on, and the display can be dazzlingly bright; they work happily outdoors in full summer sunshine. 4K displays of this type have been installed in screening rooms, but they are ruinously expensive and clearly no solution for smaller screens.
The best smaller option is OLED, where each display pixel is made up of an organic LED. As with the video walls, black levels are microscopically low, but OLEDs can struggle for brightness. The best have achieved a solid 1500 nits, but often not across the entire display simultaneously. The technology is being pushed hard to achieve even that, and can age rather quickly, a real concern for an asset worth tens of thousands. The only manufacturer of really bright OLED panels has effectively abandoned the technology, and replaced its high-end display with one using LCD.
LCD can achieve high brightness over the entire display simply with a brighter backlight. Cooling often means that LCD HDR displays are comparatively bulky and rely on noisy fans, but the big problem is black level. LCD pixels must absorb the light they do not pass, and there are limits to the maximum opacity of a pixel. One solution is to build the backlight from a large number of LED emitters, so that areas of high brightness can be backlit brightly, while areas of dense shadow are backlit more dimly. It works, but it’s not ideal. The LED emitter array is invariably of much lower resolution than the LCD panel itself, and bright points of light against a black background – stars, perhaps – may end up surrounded by a ghostly grey box.
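The grey-box effect is easy to reproduce in a toy model. This sketch uses entirely hypothetical numbers: a 64-pixel row lit by eight backlight zones, each zone driven by its brightest pixel, with a small leakage term standing in for the LCD’s limited opacity:

```python
ZONES, ZONE_W, LEAKAGE = 8, 8, 0.001  # hypothetical panel figures

# A single bright "star" (1.0) on an otherwise black row of pixels.
image = [0.0] * (ZONES * ZONE_W)
image[30] = 1.0

# Each backlight zone is driven by the brightest pixel it covers.
zone_level = [max(image[z * ZONE_W:(z + 1) * ZONE_W]) for z in range(ZONES)]

# The LCD can only attenuate its backlight, never block it completely,
# so every pixel passes at least LEAKAGE of whatever its zone emits.
displayed = [max(p, LEAKAGE) * zone_level[i // ZONE_W]
             for i, p in enumerate(image)]

# Around the star: faint grey (0.001) across its whole zone,
# true black (0.0) only where the neighbouring zones are switched off.
print(displayed[28:33])  # [0.001, 0.001, 1.0, 0.001, 0.0]
```

The leaked grey tracks the zone boundaries rather than the image, which is exactly the ghostly box a viewer sees around a star field.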
The better solution is often called dual-layer LCD, where a conventional color LCD is backed by a monochrome LCD. Set both layers to black, and the result is a very dense filter; good dual-layer LCDs can produce a black level almost indistinguishable from a high-power OLED. At the time of writing, prices were comparable with the best OLEDs, but the performance can be excellent. In the meantime, zoned-backlight LCDs are available in a variety of formats, including on-camera displays which may be useful simply as a high-brightness option outdoors in the sun.
Manipulate for Home TV
Happily, most capable cinema cameras are already capable HDR cameras. Shooting specifically for HDR robs the cinematographer of some of the room for exposure error that a modern cinema camera provides, but it has even been possible to re-grade material for HDR that was never shot with that end in mind. The traditional approach to digital cinematography has been to shoot a very high dynamic range original, with the ability to do so being a key technical specification of a camera. The colorist’s job is then to manipulate that original to look good in the much reduced brightness range achievable on a conventional home TV.
In HDR, that task may even be easier, since the reduction in range between camera and display is smaller. There are certainly a few things to be aware of: large, bright areas of frame – a blown-out sky, perhaps – may become overpowering, and abrupt cuts from dark to bright scenes may be more jarring than before. Deliberately making something clip to white can require more overexposure than before. Still, all of these things can also be addressed in the grade, to at least some degree.
If it’s tricky to create excellent HDR displays for the professional market, though, imagine the concerns over building cost-sensitive consumer TVs. There are some good options: the large OLEDs sold into this market have reasonable performance and are sometimes used as client monitors in grading suites, although they achieve their peak brightness using red, green, blue and white-emitting subpixels, and the white subpixel boosts peak brightness at the cost of desaturating bright features. Some low-cost TVs are barely better than a computer monitor, and suffer rather high black (or perhaps pale-grey) levels which call into doubt how high a dynamic range they really offer.
In mid-2019, HDR is very much in flux. Developments in display technology may change things quickly. Certainly, consumer OLEDs represent a massive advance on any previous domestic display, and OTT broadcasters can feed them with pictures that look, according to people who’ve side-by-sided them, closer than ever to what came out of the grade. That’s good, and as we said at the beginning, there’s no denying HDR is pretty. The question is precisely what sort of pretty it will one day turn out to be.