Beyond RGB – Should We Use More Primaries?

Moving beyond the use of three primary colors could significantly increase the range of colors we can reproduce. There is no doubt it could improve the viewer experience, but what are the barriers to adoption?

For decades, we’ve been used to mixing color from three primaries. Printers (and film stocks) do it with cyan, magenta and yellow. Video systems use red, green and blue, or sometimes a brightness signal alongside color-difference signals covering the red-cyan and blue-yellow axes. Either way, it’s a system with three components, because we have eyes with (usually) three types of color-sensitive cells. We’ve got rather good at it: assuming we can control those components over 256 (or, better, 1024) levels, we can create pictures that feel like they represent reality pretty well.

There have always been variations on a theme, particularly in terms of which shades of red, green and blue we choose. Use dull, desaturated shades and we can’t achieve such saturated colors in the resulting image. The greenest green can’t possibly be any greener than what we get if we turn the green channel all the way on, and the red and blue channels all the way off. Recently, though, we’ve seen designs suggesting more than three primaries. The idea often appears in LED lighting, and now in experiments by an organization called 6P Color, which proposes video displays with up to six primary colors. We might call all of these multi-primary systems, though they are often developed for different purposes and with different goals.

Evaluating those different systems means finding a way to describe and compare them. Normally we do that by specifying a colorspace, which means describing what the primary colors are, and defining what we think white should look like when red, green and blue are all at maximum. That’s part of standards like the ITU’s Recommendation BT.709, or the more recent BT.2020, which specifies deeper, more saturated primaries so as to be able to handle a wider range of color. Other commonly-encountered colorspaces include sRGB and Adobe RGB, sometimes used on workstations, and the Digital Cinema Initiative’s P3 colorspace used for cinema exhibition.
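In practice, the primaries of those colorspaces are just pairs of chromaticity coordinates, and one crude way to compare them is the area of the triangle they enclose on the CIE xy plane (a rough proxy at best, since that plane is not perceptually uniform). A minimal sketch in Python, using the published primary coordinates for each standard:

```python
# Rough comparison of the CIE 1931 xy gamut triangles of common colorspaces.
# The chromaticities below are the published primary coordinates for each
# standard; triangle area on the xy plane is only a crude proxy for gamut size.

COLORSPACES = {
    # name: [(Rx, Ry), (Gx, Gy), (Bx, By)]
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
}

def polygon_area(points):
    """Shoelace formula; works for triangles or any simple polygon."""
    n = len(points)
    return abs(sum(points[i][0] * points[(i + 1) % n][1]
                   - points[(i + 1) % n][0] * points[i][1]
                   for i in range(n))) / 2.0

ref = polygon_area(COLORSPACES["Rec.709"])
for name, primaries in COLORSPACES.items():
    print(f"{name}: {polygon_area(primaries) / ref:.2f}x the Rec.709 triangle")
```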

The CIE 1931 human perception diagram with REC.709 and REC.2020.

Normally we’d describe colorspaces like these using the famous CIE 1931 diagram, which represents all of the colors detectable to the human eye on its D-shaped plot. The chart is designed rather like a hue-and-saturation color wheel, distorted by the biological nature of human eyes and brains. Each point inside the D-shaped plot represents a color as perceived by a human being, with white at the center and saturation increasing toward the edge. Crucially, not all light sources which appear at a given point on the chart have identical output. A light that looks yellow might emit only yellow light; it might also emit both red and green light simultaneously. The result might still look the same shade of yellow, and appear at the same point on a CIE chart.

One of the characteristics of a CIE chart is that, given two light sources of different colors, we can create any color on a straight line between them by mixing them together. In the same way, given three light sources of different colors, we can create any color in the triangular area they cover by mixing them together. That’s how color video pictures work, and that’s how we can compare colorspaces, but there’s a problem: the area of colors visible to humans is not a triangle.
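That line-and-triangle behavior follows from the fact that additive mixing is linear in CIE XYZ: convert each source’s chromaticity and luminance to XYZ, sum, and convert back, and the result always lies on the line (or within the polygon) joining the sources. A minimal sketch, with two illustrative sources:

```python
def xyY_to_XYZ(x, y, Y):
    """Convert CIE xyY (chromaticity plus luminance) to tristimulus XYZ."""
    return (x * Y / y, Y, (1.0 - x - y) * Y / y)

def mixture_xy(sources):
    """Chromaticity of an additive mix of (x, y, Y) sources: sum in XYZ, renormalize."""
    X, Y, Z = (sum(vals) for vals in zip(*(xyY_to_XYZ(*s) for s in sources)))
    return (X / (X + Y + Z), Y / (X + Y + Z))

# Two illustrative sources of equal luminance: a saturated red and a saturated green.
red   = (0.640, 0.330, 1.0)
green = (0.300, 0.600, 1.0)
print(mixture_xy([red, green]))   # lands on the straight line joining the two points
```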

That means no colorspace using three real primary colors can represent every color visible to humans. It’s possible to define a mathematical colorspace which represents color using three locations outside the D-shaped region of visible colors. That’s what the Academy’s ACES does, so it can represent any possible color. Unfortunately, it’s impossible to build a video display or lighting device using that approach, because colors outside the D-shaped area aren’t things which can exist in reality: nothing can be more saturated than a single wavelength of light, and a red light cannot actively delete green and blue light from the world.

A video system designed to reproduce a more-complete range of the colors visible to humans may therefore require more than three primaries. The 6P Color initiative is a project to develop video systems which include, as the name suggests, up to six primaries. Typically this includes red, green and blue, but also (roughly) amber, cyan and lime-green primary colors. The result is a hexagonal shape on a CIE chart, representing a system which can reproduce a much larger range of colors than a traditional three-primary system.

6P proposes the addition of Cyan, Lime-Green & Amber to RGB.
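To get a very rough sense of how much a hexagonal gamut could add, we can compare polygon areas on the xy plane. The amber, lime and cyan positions below are purely illustrative points near the spectral locus (roughly 590 nm, 560 nm and 490 nm), not 6P Color’s published primaries, and xy area remains only a crude proxy for perceived gamut:

```python
def polygon_area(points):
    """Shoelace formula for the area of a simple polygon on the CIE xy plane."""
    n = len(points)
    return abs(sum(points[i][0] * points[(i + 1) % n][1]
                   - points[(i + 1) % n][0] * points[i][1]
                   for i in range(n))) / 2.0

# Rec.709 primaries (published values).
rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

# Hypothetical six-primary hexagon: Rec.709 R/G/B plus illustrative amber, lime
# and cyan points near the spectral locus, listed in order around the boundary.
# These are NOT 6P Color's published primaries, just plausible placeholders.
hexagon = [(0.640, 0.330),   # red
           (0.575, 0.424),   # amber (~590 nm, approximate)
           (0.373, 0.625),   # lime  (~560 nm, approximate)
           (0.300, 0.600),   # green
           (0.045, 0.295),   # cyan  (~490 nm, approximate)
           (0.150, 0.060)]   # blue

print(f"hexagon / Rec.709 triangle area: "
      f"{polygon_area(hexagon) / polygon_area(rec709):.2f}x")
```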

6P’s most recent demonstration is based around a device with just four primaries: red, green, blue and cyan. It’s a single, comparatively low resolution LED panel broadly similar to the types used in video walls, presumably because that’s a much less expensive way to demonstrate the idea than a custom-made LCD or OLED panel and its associated drive electronics.

The choice of cyan makes sense because colorspaces such as Rec. 709, with which most people are familiar, infamously lack the ability to represent deep teal and turquoise colors such as the hue of tropical seawater over white coral sand. The CIE 1931 chart is deeply curved on its left edge, such that any straight line between the blue and green primaries will inevitably cut off much of the blue-green region. Add a cyan primary and that single straight edge becomes two, meeting at a corner in the middle, increasing the range of blue-greens available.

4P adds only Cyan to RGB.
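A quick way to see the gap Rec. 709 leaves is to test whether a deeply saturated blue-green, say roughly monochromatic 490 nm light (about x = 0.045, y = 0.295 on the 1931 chart), falls inside the Rec. 709 triangle. A minimal point-in-triangle sketch:

```python
def edge_sign(p, a, b):
    """Sign of the cross product telling which side of edge a->b the point p is on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def inside_triangle(p, tri):
    """True if p lies inside (or on the edge of) the triangle tri."""
    signs = [edge_sign(p, tri[i], tri[(i + 1) % 3]) for i in range(3)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B chromaticities
deep_cyan = (0.045, 0.295)   # roughly monochromatic 490 nm light

print(inside_triangle(deep_cyan, rec709))   # False: Rec.709 cannot reproduce it
```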

What does it look like? Well, it’s hard to describe the sensation of perceiving color in text, and impossible to recreate the behavior of 6P Color’s multi-primary display using the conventional, three-primary device you’re probably using to read this. Perhaps the best comparison is to think of the improved brightness and (particularly) contrast of HDR, which many people have seen. Adding primary colors isn’t the same thing at all, since it doesn’t affect brightness, but the comparison holds in one respect: just as HDR makes everything that isn’t HDR look faulty, a wider range of color makes conventional pictures look lacking once you’ve seen the difference.

On conventional monitors, tropical water just looks blue. On 6P’s demo, it shimmers a brilliant, deep emerald-turquoise in a way few people will realize they’ve been missing until they’ve seen it. Viewers do have to overlook the low resolution of the panel, but it demonstrates the principle. Improvements in the orange-red-yellow area of the chart, not embodied in 6P’s demo, are likely to be less spectacular, but still worth having.

The concept is sound, though it’s reasonable to ask how necessary all of this is. After all, the Rec. 709 colorspace used for HD television seems to work well, and there is no obvious audience clamor for improved color ranges in broadcast television (and the Rec. 2020 standard significantly improves on 709 in any case). HD looked clearly and obviously better than standard definition. 4K was a smaller perceived improvement, and uptake has been sluggish. Whether 6P Color is likely to be seen as a big step forward is not clear.

The big question is whether any of this is ever likely to become mainstream. In principle, a display capable of multi-primary operation could be entirely backward-compatible with old material prepared for less capable designs. It might even display material mastered for three-primary colorspaces more faithfully than conventional screens can. Getting the best out of a 6P display, though, would require material specifically mastered to take advantage of the new ranges of color. Most (effectively all) material is currently mastered for three-primary systems, and the ability of many modern, single-chip cameras with Bayer filter masks to record the colors that 6P can represent is in question. It’s not clear whether ACES, which can represent everything, is really suitable as a distribution format; it might require more bit depth than we’re used to broadcasting.
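One way to picture the backward-compatibility argument: legacy three-primary material can always be converted to CIE XYZ using the standard linear Rec. 709-to-XYZ matrix, and a multi-primary display then reproduces that XYZ with some non-negative combination of its emitters. The four-primary matrix below is entirely hypothetical, and the final clamp is a crude stand-in for real gamut mapping, but it sketches the shape of the problem:

```python
import numpy as np

# Standard linear Rec.709 RGB -> CIE XYZ matrix (D65 white point).
RGB709_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])

# Hypothetical XYZ output of a four-primary (R, G, B, cyan) display's emitters at
# full drive -- illustrative numbers only, not measurements of any real panel.
DISPLAY_PRIMARIES_XYZ = np.array([[0.45, 0.30, 0.15, 0.10],
                                  [0.22, 0.65, 0.08, 0.15],
                                  [0.02, 0.10, 0.95, 0.40]])

def rgb709_to_four_primary(rgb_linear):
    """Match the XYZ of a linear Rec.709 pixel with four emitter drive levels."""
    target_xyz = RGB709_TO_XYZ @ np.asarray(rgb_linear, dtype=float)
    # Minimum-norm solution for an under-determined system; a real display would
    # use proper gamut mapping rather than simply clamping negatives to zero.
    drive = np.linalg.pinv(DISPLAY_PRIMARIES_XYZ) @ target_xyz
    return np.clip(drive, 0.0, None)

print(rgb709_to_four_primary([0.2, 0.5, 0.8]))   # one drive value per emitter
```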

Existing cameras are probably capable of recording rather wider color gamuts than they are often asked to, although the situation is not as straightforward as it was with HDR. Production equipment is already capable of far more dynamic range than will be broadcast: unless we’re producing an HDR master, much of the dynamic range of modern cameras is reduced to the limited range of a BT.1886 master before transmission. Mastering HDR (to cruelly oversimplify) means reducing things less. Color is different. It’s probably not quite true to say that every piece of production equipment has to change to support something like 6P Color, but it might require significant and simultaneous changes to cameras, post production workflows and equipment, broadcast systems and consumer devices.

The pace of change in the twenty-first century contrasts sharply with the multi-decade life of standards and home TVs through the latter half of the twentieth. The improvements we’ve enjoyed have been entirely backwards compatible, as multi-primary color would be, although the sluggish uptake of 4K and the complete disdain for high frame rate and stereo 3D suggest that the market can be somewhat discerning. Perhaps multi-primary pictures might do well because, like HDR, they look better without being distractingly different, while maintaining backward compatibility. It’s not hard to say they’re objectively better; whether they’re sufficiently better to find a market is something only time will tell.
