HDR: Part 4 - Surviving Modern Colorimetry
Most people are aware that colors can be mixed from red, green and blue light, and that we make color pictures out of red, green and blue images. The relationship between modern color imaging and the human visual system was recently discussed by John Watkinson in his series on color. In this piece, we’re going to look at something that comes up often in modern film and TV technique: color gamuts. It’s a term that suffers a lot of misuse, but the basics are simple: a color image uses red, green and blue, and the gamut describes which red, which green and which blue we’re using.
This article was first published in 2019. It and the entire 'HDR Series' have been immensely popular, so we are re-publishing it for those who missed it first time around.
It’s fairly intuitive why this matters. If we want a deep, powerful emerald green, for instance, we might turn on the green light and turn off the red and blue lights. If we want to push that green toward a lime green, toward yellow, we might start to add red. If we want to make it more turquoise, a blue-green, we might add blue. If we want to make the color paler, less saturated, we might add both red and blue. Either way, the greenest green we can achieve, or the most colorful version of any color, is defined by the colors the display emits, and that’s built in at manufacture.
For most HD video, the red, green and blue primary colors are defined (along with many other things) in the famous ITU Recommendation BT.709. It’s useful to talk about 709 because the color gamut has an interesting problem: the green is not very good. The blue is reasonably deep and powerful but 709 green is a rather feeble grassy color that lacks punch. This affects not only greens, but also deep cyans or yellows because both those colors require combinations including green. The red is perhaps slightly orange, making really powerful ruby hues inaccessible.
Actually, describing a color is tricky. Normally, on a computer, we might specify a color as proportions of red, green and blue. For instance, the combination [1, 0.25, 0.5] would create a pinkish magenta: full red, one quarter green and half blue. The problem is that those numbers only mean something once we know which red, which green and which blue we’re using in the first place. To specify colors absolutely, in the real world, it’s common to use the CIE 1931 chromaticity diagram. The reasons a CIE 1931 diagram looks the way it does are a bit outside the scope of this piece, but the useful part is that if we pick a point on that chart, we’ve picked a color that’s fixed and unchanging in the real world.
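As an illustration of the point, here’s a minimal Python sketch which takes that same [1, 0.25, 0.5] triple and converts it to a CIE 1931 chromaticity twice: once assuming Rec. 709 primaries, once assuming Rec. 2020 primaries. The matrices are the published linear-light RGB-to-XYZ conversions for each standard (D65 white point, values rounded); the point is simply that the same numbers land in two different places on the chart.

```python
# The same RGB triple describes two different real-world colors,
# depending on which primaries are assumed.

# Published linear-light RGB -> CIE XYZ matrices (D65 white, rounded).
RGB_709_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

RGB_2020_TO_XYZ = [
    [0.6370, 0.1446, 0.1689],
    [0.2627, 0.6780, 0.0593],
    [0.0000, 0.0281, 1.0610],
]

def chromaticity(rgb, matrix):
    """Convert a linear RGB triple to a CIE 1931 (x, y) chromaticity."""
    X, Y, Z = (sum(row[i] * rgb[i] for i in range(3)) for row in matrix)
    total = X + Y + Z
    return (X / total, Y / total)

color = (1.0, 0.25, 0.5)  # the example triple from above

print(chromaticity(color, RGB_709_TO_XYZ))   # one real-world color...
print(chromaticity(color, RGB_2020_TO_XYZ))  # ...a different one, same numbers
```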
Fig 1 - The CIE 1931 color chart is a de facto standard for specifying an absolute, real-world color.
That’s how the standards do it: each primary is specified as an x-y coordinate on the CIE diagram. Usefully, if we plot those three points and draw a triangle between them, that triangle encloses every color the gamut can represent. Draw that triangle for the Rec. 709 color gamut and the problem becomes very clear. The horseshoe shape of the colored area on the chart represents all the colors a human eye can see – within the limits of your monitor to display it, of course. There are clearly a lot of colors that humans can see but that Rec. 709 can’t, particularly in the cyans and turquoises. It isn’t quite as bad as it looks, because the 1931 chart isn’t perceptually uniform and exaggerates differences in that region, but it’s not great.
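Once the primaries are written down as x-y coordinates, checking whether a given chromaticity falls inside a gamut is just a point-in-triangle test. The sketch below uses the primary chromaticities published in BT.709; the two test points are illustrative choices, a saturated turquoise and a gentle blue-green near white.

```python
# Rec.709 primary chromaticities from ITU-R BT.709 (CIE 1931 x, y).
R = (0.640, 0.330)
G = (0.300, 0.600)
B = (0.150, 0.060)

def inside_triangle(p, a, b, c):
    """Return True if chromaticity p falls inside the triangle a-b-c."""
    def sign(p1, p2, p3):
        return (p1[0] - p3[0]) * (p2[1] - p3[1]) - (p2[0] - p3[0]) * (p1[1] - p3[1])
    d1, d2, d3 = sign(p, a, b), sign(p, b, c), sign(p, c, a)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

# A saturated turquoise: a visible color, but outside the 709 triangle.
print(inside_triangle((0.10, 0.40), R, G, B))  # False
# A desaturated blue-green near the white point sits comfortably inside.
print(inside_triangle((0.28, 0.33), R, G, B))  # True
```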
Fig 2 - By drawing three dots at the locations of the red, green and blue primaries of any color gamut, we can indicate the achievable colors as a triangle.
As it happens, most images don’t include much heavily saturated color, which is why 709 pictures are acceptable to most people. Things which do create saturated colors include disco lights, LED strobes on emergency vehicles and orange sodium-vapor street lights, but those are rare. In nature, high saturation is sometimes found on the wings of butterflies, the petals of certain flowers, and the blue tropical sea off a white sand beach. Try to shoot that beach on a camera built for Rec. 709 displays, and the sea might simply look a dull blue rather than a vibrant turquoise.
The solution, of course, is deeper primary colors, and the obstacle has historically been engineering. The feeble green of Rec. 709 was chosen because it matched the green glow of the phosphors used in cathode ray tubes: it was practical, not ideal. We’ve been making TVs and computer monitors which use that green ever since, but it’s much easier to control the colors of a TFT display, where the primaries are simply color filters, than those of an esoteric vacuum-tube technology.
It’s a similar story with projected displays, since projectors also filter light to create color images. Things become a little more complex with OLED displays, another rather esoteric technology which relies on special material formulations and techniques to create particular colors, though most manufacturers comfortably meet or exceed 709, and more is possible. Perhaps the most capable displays are the giant LED video walls often used as advertising hoardings, since standard LEDs can create a huge range of very saturated color, including a very deep and powerful emerald green.
Fig 3 - LED emitters are capable of huge feats of brightness and contrast and offer very saturated colors for a very capable system.
Even so, it’s pretty clear that we can’t create a conventional color gamut that covers every color visible to the human eye. Conventional gamuts have three primaries, which form a triangle, and the visible area on the CIE diagram is not a triangle. Human eyes do have three kinds of color-sensitive cells which correspond, loosely, to red, green and blue, but only loosely, and the chart’s shape is determined by the soft-edged, biological response of the average eye. A three-primary system can never cover it all.
Systems with more than three primaries have been built, but most practical systems use three, mainly for engineering practicality but also because a small amount of missing color coverage isn’t a huge problem. Ultra-saturated colors are rare in reality, and we don’t miss them when they aren’t there. One alternative color gamut that’s been proposed for future TV, and particularly for high dynamic range displays, is defined in ITU Recommendation BT.2020. The 2020 gamut uses primary colors which sit precisely on the edge of the visible colors in the CIE diagram, for maximum coverage.
Fig 4 – The Rec. 2020 gamut (black) uses monochromatic primaries at the edge of the CIE diagram, enclosing more colors than the (white) 709 triangle.
This has the interesting effect of implying that these primaries are completely monochromatic: each contains just one wavelength of light and can be specified as a point on the spectrum as well as an x-y coordinate on the CIE diagram. The intent of 2020, of course, is to create a gamut which covers a huge range of visible colors, and it certainly does that. There are problems, in that truly monochromatic light is difficult to make; strictly speaking it’s physically impossible, but it is possible to get very close, and the resulting color system is very capable.
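For the curious, the published 2020 primaries and their approximate wavelengths are easy to compare against 709 numerically. The Python sketch below computes the area of each gamut triangle on the 1931 chart; the ratio is only a rough indication, because the chart isn’t perceptually uniform, but it shows how much more of the diagram 2020 encloses.

```python
# Rec.2020 specifies monochromatic primaries; the (x, y) coordinates below are the
# published values, with approximate wavelengths noted in the comment.
REC2020 = {"R": (0.708, 0.292), "G": (0.170, 0.797), "B": (0.131, 0.046)}  # ~630, 532, 467 nm
REC709  = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}

def triangle_area(prims):
    """Area of a gamut triangle on the CIE 1931 chart (shoelace formula)."""
    (rx, ry), (gx, gy), (bx, by) = prims["R"], prims["G"], prims["B"]
    return abs(rx * (gy - by) + gx * (by - ry) + bx * (ry - gy)) / 2.0

# Not a perceptual measure, but a useful indication of coverage on the chart.
print(triangle_area(REC2020) / triangle_area(REC709))  # roughly 1.9
```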
2020, however, still doesn’t cover the entire human visual range. It is sort of possible to do so, but only using a bit of mathematical trickery that doesn’t let us build monitors. If we define three primaries which are way outside the horseshoe-shaped visible area on a CIE 1931 diagram, we can create a triangle which covers everything. That’s fine, but there’s a problem: what color is something that’s outside the visible part of a CIE diagram? Well, it’s not a real color at all, it’s a mathematical construct. About the best way to describe the non-real blue primary, for instance, would be as a magical blue light that somehow subtracts red and green light from anything it hits. Since there’s no such thing as negative light, it’s a theoretical idea only.
Fig 5 - ACES (red line) uses imaginary colors to create a gamut that encloses all visible colors, but as a result will always require processing before viewing.
We’re now discussing color gamuts as a way to store color in computer files rather than as a way to build a monitor, because we can’t actually make light of an imaginary color. The result, though, is a file that can be processed for display on a monitor using literally any real-world primaries. This is how the Academy Color Encoding System, ACES, works. There are downsides: the gamut is so huge that a lot of the available precision is spent on colors that can never practically exist, which makes files bigger than they strictly need to be. Still, it’s beloved of post-production and visual effects people because it is so capable.
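To make the “processing before viewing” step concrete, the sketch below builds RGB-to-XYZ matrices from the published ACES AP0 and Rec. 709 chromaticities and pushes a saturated AP0 green through to 709. It is deliberately simplified: a real ACES pipeline also applies chromatic adaptation (AP0 is referenced to a roughly D60 white, 709 to D65) and an output transform, and the example triple is just an arbitrary illustration. The point is that strongly saturated ACES values come out of the matrix with components below zero or above one, which is the math telling us a 709 display can’t show them.

```python
import numpy as np

def rgb_to_xyz(primaries, white):
    """RGB -> CIE XYZ matrix from (x, y) chromaticities of R, G, B and white."""
    def column(x, y):
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    m = np.column_stack([column(*p) for p in primaries])
    scale = np.linalg.solve(m, column(*white))  # make RGB = (1,1,1) hit the white point
    return m * scale

# Published chromaticities: ACES AP0 (note the non-physical blue, y < 0) and Rec.709.
AP0_TO_XYZ = rgb_to_xyz([(0.7347, 0.2653), (0.0, 1.0), (0.0001, -0.0770)],
                        (0.32168, 0.33767))
XYZ_TO_709 = np.linalg.inv(rgb_to_xyz([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
                                       (0.3127, 0.3290)))

saturated_green = np.array([0.1, 0.8, 0.1])        # a linear ACES AP0 triple
rec709 = XYZ_TO_709 @ AP0_TO_XYZ @ saturated_green
print(rec709)  # components outside 0..1 signal colors 709 cannot display
```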
If all this sounds like there are rather too many options available, and that a lot of complexity could arise from the situation – well, that’s true. Camera manufacturers frequently specify at least one color gamut designed specifically to store that camera’s pictures most effectively, and some offer more than one. Post-production software and monitoring equipment generally needs to be told what to expect, or colors will be distorted, sometimes in a subtle way that doesn’t immediately look wrong. It’s not too hard to get right – figure out what gamut the images were recorded in and set everything to expect it – but it does need to be got right.
In general, though, the fact that we’re even having this discussion is encouraging. Improvements to color in film and television are long overdue – as we’ve seen, we’re still largely working to standards defined by the practicalities of cathode ray tube displays. It’s not clear whether things will settle down quickly, or at all, and until then we’re more or less required to deal with the complexity.