Broadcast For IT - Part 8 - Color Representation

In this series of articles, we explain broadcasting for IT engineers. Television is an illusion: there are no moving pictures, and today's broadcast formats are heavily dependent on decisions engineers made in the 1930s and 1940s. In this article, we look at how color is represented.

In Part 7 – Color and Temperature, we looked at the human visual system (HVS) and in Part 3 – Video Lines, we looked at the distinct types of receptors in the human eye.

In 1913, the International Commission on Illumination (CIE) was founded to provide a forum for exchanging ideas and information, and to set standards for all things relating to light.

Color Matching Using Red, Green and Blue

Experiments conducted by William David Wright and John Guild in the 1920s helped us understand how color is interpreted by the HVS. By shining red, green and blue lamps onto a screen, Wright and Guild found that an observer could vary the lamps' intensities to match a test color shown on an adjacent screen.

These tests furthered our understanding of color temperature and how we perceive color, and Wright and Guild later demonstrated that most colors can be represented using a mixture of red, green and blue light.

Diagram 1 – Results of the Wright-Guild experiments show the distinctive red, green and blue peaks of the human visual system.

Screens used in broadcasting generate light and use a system of additive mixing of the primary colors. Printing is the complete opposite: it uses the secondary colors because light is reflected off the paper. The primary colors are red, green and blue, whereas the secondary colors are yellow, magenta and cyan.
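
To make the additive side concrete, here is a minimal sketch, assuming 8-bit channel values and nothing about any particular display: summing two primary light sources channel by channel yields a secondary color.

```python
# A minimal sketch of additive mixing, assuming 8-bit RGB values (0-255).
# Summing two primary light sources channel by channel gives a secondary color.

def additive_mix(color_a, color_b):
    """Mix two light sources by summing each channel, clipped to the 8-bit range."""
    return tuple(min(a + b, 255) for a, b in zip(color_a, color_b))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

print(additive_mix(RED, GREEN))   # (255, 255, 0) -> yellow
print(additive_mix(RED, BLUE))    # (255, 0, 255) -> magenta
print(additive_mix(GREEN, BLUE))  # (0, 255, 255) -> cyan
```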

Printing Uses CMYK System

A yellow pigment on paper absorbs blue light but reflects red and green, and a magenta pigment absorbs green but reflects red and blue. However, in television, to create yellow, the red and green phosphors are energized to make the light.

Hints to this color system can often be found in magazines: somewhere on the inside spine or edge of the back page you will see small squares of yellow, magenta, cyan and black, or the letters CMYK for each of the colors (K is used to represent black so the "b" is not confused with blue). Each square is added to the page during the printing process so the printer knows all the colors have been printed correctly.
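
The complementary relationship between the two sets of primaries can be sketched as a simple conversion. The example below is only an illustration, assuming normalized channel values and the common textbook approach of extracting the shared black component as K; real printing workflows use more sophisticated, device-specific conversions.

```python
# A simplified RGB-to-CMYK conversion, assuming normalized channel values (0.0-1.0).
# Each subtractive primary is the complement of an additive primary; the black (K)
# component is pulled out as the part shared by cyan, magenta and yellow.

def rgb_to_cmyk(r, g, b):
    c, m, y = 1.0 - r, 1.0 - g, 1.0 - b   # complements of red, green, blue
    k = min(c, m, y)                       # shared darkness, printed with black ink
    if k == 1.0:                           # pure black: avoid division by zero
        return 0.0, 0.0, 0.0, 1.0
    scale = 1.0 - k
    return (c - k) / scale, (m - k) / scale, (y - k) / scale, k

print(rgb_to_cmyk(1.0, 1.0, 0.0))  # yellow ink only: (0.0, 0.0, 1.0, 0.0)
```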

Screens Use RGB System

Computer screens and television monitors work in an analogous manner: both use small dots of red, green and blue light to provide the appropriate mix for each individual pixel. In traditional cathode ray tube (CRT) devices, the light sources were phosphors that emitted a specific color when energized by the electron beam inside the CRT. The brightness of each of the red, green and blue phosphors was proportional to the number of electrons hitting it from the three electron guns.

CRTs using the flying electron beam system were dominant for many years, right up until the late 1980s. During the 1990s, plasma screens started to appear in homes and used an X-Y matrix system to energize the red, green and blue plasma elements. An array of X and Y wires was placed behind the screen, and the junction of each X and Y wire pair uniquely represented a pixel, so its brightness could be set individually. Each pixel was further split into red, green and blue plasma sources so the correct color could be represented.
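
A toy model of this X-Y addressing is sketched below. The 4 x 3 panel and the function name are hypothetical, and this is not how any particular panel driver is implemented, but it shows how the junction of one column and one row uniquely selects a pixel and its red, green and blue elements.

```python
# A toy model of X-Y matrix addressing, assuming a small 4x3 panel where every
# pixel holds an (R, G, B) triple. The junction of column x and row y uniquely
# identifies one pixel, so its brightness can be set independently of the rest.

WIDTH, HEIGHT = 4, 3

# The "panel": one (R, G, B) triple per X-Y junction, initially black.
panel = [[(0, 0, 0) for _x in range(WIDTH)] for _y in range(HEIGHT)]

def set_pixel(x, y, rgb):
    """Drive the red, green and blue elements at the junction of column x and row y."""
    panel[y][x] = rgb

set_pixel(2, 1, (255, 255, 0))   # light the pixel at column 2, row 1 as yellow
print(panel[1][2])               # (255, 255, 0)
```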

Backwards Compatibility

Liquid Crystal Display (LCD) screens soon followed plasma displays. Although their method of generating the three primary colors is different, the X-Y matrix addressing is the same. A fluorescent white light source sits behind the screen, and the LCD elements are varied between on and off so that light passes through red, green and blue filter elements at each pixel.

Although we now use the X-Y matrix system to activate the RGB elements at each pixel, the system still relies on, and is backwards compatible with, the old line, frame and field system developed in the 1930s.

Diagram 2 – mixtures of red, green and blue primary colors provide secondary colors of yellow, magenta and cyan.

Color cameras work in the opposite way. Using a lens, light is focused onto sensors inside the camera. Each sensor detects either red, green or blue light, and provides the RGB source required in broadcast television systems.

Originally, color cameras used cathode ray tubes similar to those in a display CRT, but much smaller and working in reverse. The current in the scanning electron beam was determined by how much light fell on the face of the tube sensor, giving a voltage proportional to light intensity for each color.

Save Bandwidth

Consequently, a color television picture consists of three signals, one each for red, green and blue. But transmitting three signals is wasteful of radio frequency bandwidth and very expensive in cabling. If the cables are not identical in length, or the signals do not have the same delay, the red, green and blue images can be seen to separate, resulting in a registration problem.

To maintain the highest possible quality in analog stations, some areas of the broadcast facility distributed the television picture as individual red, green and blue signals. Analog distribution has now been superseded by digital distribution using SDI (serial digital interface), but the RGB legacy still has a vital influence on the television distribution system.

YUV to the Rescue

In analog systems, RGB was difficult to work with due to its heavy demands on bandwidth and timing. To maintain backwards compatibility with black-and-white television sets, a new format called YUV was created. "Y" represents the black-and-white compatible signal, and "U" and "V" represent the color difference signals. YUV will be discussed in greater depth in a later article.
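
As a preview, a simplified conversion from RGB to Y, U and V is sketched below. It assumes normalized RGB values and uses the classic PAL-era weighting factors; the exact scaling is defined by the relevant standard.

```python
# A minimal sketch of forming Y, U and V from normalized R, G, B (0.0-1.0),
# using the classic PAL-era weighting. Y carries the black-and-white compatible
# picture; U and V carry the color difference signals (B-Y and R-Y, scaled).

def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance, weighted to match the HVS
    u = 0.492 * (b - y)                      # scaled blue color difference
    v = 0.877 * (r - y)                      # scaled red color difference
    return y, u, v

print(rgb_to_yuv(1.0, 1.0, 1.0))  # white: Y = 1.0, U = 0.0, V = 0.0
print(rgb_to_yuv(1.0, 0.0, 0.0))  # pure red: Y = 0.299, non-zero U and V
```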

Although the frame and line rates are different in PAL and NTSC, their fundamental operation is very similar, and YUV led the way for both formats. The Y signal forms the black-and-white component, and U and V in PAL, or I and Q in NTSC, form the color part of the signal. The color components are modulated onto a color subcarrier and combined with the Y signal, as discussed in Broadcast For IT – Part 4 and Part 5.
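
In heavily simplified terms, the subcarrier modulation of a single sample can be sketched as below. This ignores the color burst and PAL's line-by-line phase inversion of V; the subcarrier frequency shown is the PAL value of approximately 4.43MHz.

```python
import math

# A heavily simplified sketch of how U and V ride on the color subcarrier in
# quadrature. Real PAL also inverts the V phase on alternate lines and adds a
# color burst; this only shows the basic idea of combining Y with modulated U and V.

F_SC = 4433618.75   # PAL color subcarrier frequency in Hz

def composite_sample(y, u, v, t):
    """One instantaneous sample of a composite signal: Y plus modulated U and V."""
    return y + u * math.sin(2 * math.pi * F_SC * t) + v * math.cos(2 * math.pi * F_SC * t)

print(composite_sample(0.5, 0.1, -0.05, t=1e-6))
```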

PAL and NTSC Are Not Interchangeable

As the subcarrier frequency was so high, black-and-white televisions effectively ignored the color coding and displayed the picture correctly. New color televisions of the time decoded the modulated color difference signals and displayed the color correctly. However, the two systems were not interchangeable: NTSC would not work with PAL, or vice versa.

Decisions on how we represent color in television broadcasting, and in computing in general, were made back in the 1960s. Although HD, 4K and 8K systems no longer use NTSC and PAL coding, they are still based on the RGB system and heavily influenced by NTSC and PAL; they really only differ in how many bits we use to represent each of the RGB signals and how we distribute them.
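
As a simple illustration of that bit-depth difference, the sketch below quantizes the same normalized signal level to 8-bit and 10-bit code values. Full-range quantization is assumed for simplicity; broadcast systems typically reserve headroom and footroom around these ranges.

```python
# An illustrative sketch of how bit depth changes the code value for the same
# normalized RGB signal level. Full-range quantization is assumed here.

def quantize(value, bits):
    """Map a normalized 0.0-1.0 signal level to an integer code value."""
    levels = (1 << bits) - 1
    return round(value * levels)

mid_gray = 0.5
print(quantize(mid_gray, 8))    # 128 on the 0-255 scale
print(quantize(mid_gray, 10))   # 512 on the 0-1023 scale
```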
