It’s all very well reading all this theory about colorimetry, but what can be done in practice? First of all, it is necessary to consider that imaging, be it still or moving, is a creative process that relies totally on technology. Wherever creativity meets technology, there are always two different ways of thinking about choices: objective and subjective.
At a given time of day in a given location under given weather conditions, the daylight will have a certain color temperature. Being objective, it is what it is and it’s not debatable. If accurately color balanced images are to result, the temperature of the daylight has to be taken into account. It is possible to say whether the images are correctly color balanced or not.
On the other hand, if color correction is being used to achieve some creative or subjective effect, such as making pictures look warmer, the concept of accuracy and correctness goes away as there is a different criterion for success, which is the subjective effect someone wanted. Others may not share the feeling of success.
Before making any colorimetric decision, it is important to determine which of those two possibilities is the case. In most cases, it is best to capture the images reasonably accurately and restrict the use of strong effects to post production, where they can be tried out and removed if not successful.
Film and video differ from photography in the requirement for editing. Whilst color constancy works well in real life, it doesn’t work over the timescale of an edit, and if two clips with different color balance are edited together, the resulting color jump will be visible to all.
When shooting under natural light originating from the sun, color temperature changes due to the filtering action of the atmosphere. The spectrum remains smooth and simply tilts one way or the other. Fig.1 shows the color temperature that can be expected for various conditions and it’s worth remembering it. It is also possible to measure color temperature using a chroma meter. This looks like a photographer’s spot meter but it contains an RGB sensor and can display the relative levels of the three primaries.
Fig.1 - This chart shows the color temperature of daylight that will be found under various conditions. It is worth remembering this as it speeds up white balancing if the best lens filter is selected.
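A chroma meter reads the relative levels of the three primaries, from which a correlated color temperature can be derived. As an illustration of the principle, McCamy’s cubic approximation estimates correlated color temperature from CIE 1931 xy chromaticity; the sketch below assumes chromaticity coordinates are already available (a real meter computes them from its sensor readings).

```python
def mccamy_cct(x, y):
    """Estimate correlated color temperature in kelvin from CIE 1931 xy
    chromaticity, using McCamy's cubic approximation (valid roughly
    2000 K to 12500 K)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point (x = 0.3127, y = 0.3290) should come out near 6500 K.
print(round(mccamy_cct(0.3127, 0.3290)))
```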
In the case of electronic cameras, the easiest way of straightening out the tilt is to modify the relative sensitivities of each of the color channels. This can be done using optical filters prior to the sensors, by adjusting the relative electronic gain of the channels in the camera or a combination of both. It can also be done in post production. Where a video camera is live to air, color balance has to be immediate, whereas shooting something that will pass through a production system allows more options.
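Adjusting the relative electronic gain of the channels amounts to measuring a neutral reference and scaling each channel so the reference comes out gray. A minimal sketch, with gains normalized to the green channel as cameras commonly do (the function names and the sample patch values are illustrative):

```python
import numpy as np

def white_balance_gains(neutral_rgb):
    """Per-channel gains that map a measured neutral patch to gray,
    normalized to the green channel."""
    r, g, b = neutral_rgb
    return np.array([g / r, 1.0, g / b])

def apply_gains(image, gains):
    """Apply channel gains to a linear-light RGB image (H x W x 3)."""
    return image * gains

# A neutral patch shot under warm light reads high in red, low in blue.
patch = (0.8, 0.6, 0.4)
gains = white_balance_gains(patch)
balanced = apply_gains(np.array([patch]), gains)
# After balancing, all three channels of the neutral patch are equal.
```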
Most professional video cameras include a filter wheel that allows common spectral corrections to be selected, whereas in photographic cameras the filters are physically installed on the front of the lens.
The filter to be used, if any, can be selected by reference to Fig.1. The appropriate filter will improve the chances of the camera being able to achieve a white balance electronically.
Fig.2 shows the spectral response of a couple of Wratten filters commonly used for color correction. The 81A appears pale orange and it favors longer wavelengths, lowering the effective color temperature of the illumination. Remember the psychological effect is the opposite of the physics. Lowering the color temperature makes the image look warmer. The 82A filter appears pale blue and shifts the color temperature of the illuminant up, making the image appear colder.
Fig.2 - Two spectral response curves of color correction filters are shown here.
At a), an 81A filter that warms the image by cutting short wavelengths. At b), an 82A filter cuts long wavelengths and has the opposite effect.
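Conversion filters like these are conventionally characterized by their mired shift, where mireds are a million divided by the color temperature in kelvin; a positive shift warms, a negative shift cools. The sketch below uses +18 mired for the 81A, a commonly quoted nominal figure rather than a measured one:

```python
def mired(kelvin):
    """Convert color temperature in kelvin to mireds
    (micro reciprocal degrees)."""
    return 1e6 / kelvin

def shifted_temperature(kelvin, mired_shift):
    """Color temperature seen through a filter of the given mired shift.

    A positive shift (81-series) lowers the color temperature and warms
    the image; a negative shift (82-series) raises it and cools the image.
    """
    return 1e6 / (mired(kelvin) + mired_shift)

# Nominal +18 mired shift of an 81A: 6500 K daylight drops to roughly 5800 K.
print(round(shifted_temperature(6500.0, 18.0)))
```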
Since filters are passive, all they can do is remove unwanted parts of the spectrum so that there appears to be more of the wanted part. This means that such filters cause a loss of exposure, which is often automatically compensated as the exposure is determined from the filtered light.
Many digital still cameras offer the option to change the color balance electronically, whereas a lot of photographers simply balance after the event by processing the pixel values. Some photographers still use Wratten filters even on digital cameras. On a true SLR where the viewfinder looks through the lens, the effect of the filter can be seen in the viewfinder. On a camera where the viewfinder is some sort of electronic screen, the colorimetry of the screen will often be so bad that no colorimetric decisions should be made from it.
There is an argument that a Wratten filter on a digital camera improves the noise performance because it doesn’t require the gain of the lowest channel to be increased. However, a look at Fig.2 shows that the 81A filter has a maximum transmission of 90 percent at the red end, so the red channel would still need increased gain even if a filter was used. The argument is not very strong and the best argument for filters on digital cameras is that the photographer with a film background can work in the usual way.
So much for natural light, which is the easy part. There are some other possibilities that are somewhat harder to deal with. One of these is artificial light. In the good old days, there was tungsten light, and because it was black body radiation, it had a color temperature and it could be measured and compensated.
Today tungsten light is a dinosaur and the range of artificial light types is huge. None of the modern lighting technologies rely on black body radiation because it wastes energy. In many cases yet more energy is wasted when the air conditioning takes away the waste heat.
Not being black body radiators, these lights don’t have a color temperature. Metamerism means that a light that appears white to the human eye can have a spectrum that looks like a punk crossed with a cactus and the camera filters don’t know what to do with it. Professional cameras can adjust, but even if a correct white can be obtained, this may result in saturated colors being wrong. The iPhone user takes pot-luck.
Some types of light are based on RGB triads and can independently control the primaries so the lighting can be tuned. In an act that I regularly light, I found that the lighting that appeared white was causing iPhone pictures to have a slight magenta cast. A small reduction in the R and B drives took care of that and audiences are delighted because they can take some nice shots.
Fig.3 - Color correction requires the original format to be converted to linear-light co-sited RGB before correction. This needs interpolation and re-matrixing, which must be reversed after the correction.
Non-incandescent lights intended for studio use can usually be balanced fairly well, but for the news gatherer, who has to shoot wherever and whenever, the lighting could be colorimetrically challenged to the extent that a white balance may not work, or if it does work the results look peculiar.
Telecine is simply not viable without color correction. Color film works by subtraction and is limited by optical characteristics of the film chemistry. Video works by addition and is limited by a different set of characteristics. Put those two in series and the chances of colors coming out right are remote.
Color correction is simply a matter of rebalancing the proportions of R, G and B, and it has to be carried out in that format. To be precise, at a given point on the screen, all three components must be available. Fig.3 shows that if starting with images in a color difference format, such as 4:2:2 or 4:2:0, the color information will not be co-sited with luma. In 4:2:2 alternate pixels have no color information at all and horizontal color difference interpolation is needed so that every pixel is in the Y,Cr,Cb format. In 4:2:0 the interpolation has to be in two dimensions.
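For 4:2:2, the horizontal interpolation can be as simple as linearly filling in the missing chroma samples. A sketch for one scan line, assuming chroma samples are co-sited with the even luma samples (real codecs define the exact siting, and production filters are longer than this two-tap linear one):

```python
import numpy as np

def upsample_422_chroma(cb, cr, width):
    """Horizontally interpolate 4:2:2 chroma to one sample per pixel.

    cb, cr: half-width chroma sample arrays for one scan line, assumed
    co-sited with even luma samples. Linear interpolation fills the odd
    pixels; the last pixel holds the edge value.
    """
    xs = np.arange(0, width, 2)      # positions of the chroma samples
    full = np.arange(width)          # every luma position
    return np.interp(full, xs, cb), np.interp(full, xs, cr)

cb = np.array([0.0, 1.0])            # two chroma samples covering 4 pixels
cr = np.array([0.5, 0.5])
cb_full, cr_full = upsample_422_chroma(cb, cr, 4)
# cb_full -> [0.0, 0.5, 1.0, 1.0]
```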
Once co-sited Y,Cr,Cb is available, it can be matrixed to RGB and the color correction can be performed. The ideal would be to remove the gamma correction and to work in linear light, but this is not always done.
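The matrixing itself follows directly from the luma coefficients of the format in use. A sketch using the BT.709 coefficients, with the simplifying assumptions that Cb and Cr are centered on zero and that signals are full range (broadcast video uses offset, limited-range codes), and note this operates on gamma-corrected values rather than linear light:

```python
# BT.709 luma coefficients.
KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB

def ycbcr_to_rgb(y, cb, cr):
    """Matrix Y,Cb,Cr back to R'G'B' (still gamma-corrected).

    Assumes zero-centered, full-range chroma.
    """
    r = y + 2.0 * (1.0 - KR) * cr
    b = y + 2.0 * (1.0 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

# A pure gray (Cb = Cr = 0) must matrix back to equal R, G and B.
r, g, b = ycbcr_to_rgb(0.5, 0.0, 0.0)
```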
Changing the color temperature is based entirely on black body physics. At any temperature the spectrum can be calculated and the difference between spectra at any two temperatures can also be calculated to produce a correction filter. The filter curves of Fig.2 are examples of that. As the primary wavelengths in any video format are known, if they are drawn on the correction filter, the amounts by which R, G and B need to be adjusted can be worked out to change the color balance. Once new values of R, G and B are computed, the images can be re-matrixed to a color difference format and re-interpolated and decimated to site the color samples correctly.
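The calculation above can be sketched directly from Planck’s law: evaluate the black body spectrum at the source and target temperatures, take the ratio at each primary wavelength, and normalize. The primary wavelengths below are illustrative single-wavelength stand-ins; real primaries have broad spectral responses, so this is an approximation of the method described, not a production correction:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Black body spectral radiance (Planck's law), arbitrary scale."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / math.expm1(H * C / (wavelength_m * K * temp_k))

def correction_gains(t_source, t_target, primaries_nm=(610, 545, 465)):
    """Per-primary gains that rebalance light of t_source kelvin as if it
    were t_target kelvin, normalized to the green primary.

    primaries_nm are assumed nominal R, G, B wavelengths.
    """
    ratios = [planck(nm * 1e-9, t_target) / planck(nm * 1e-9, t_source)
              for nm in primaries_nm]
    g = ratios[1]
    return [x / g for x in ratios]

# Warming 6500 K daylight toward 3200 K tungsten boosts red and cuts blue.
r_gain, g_gain, b_gain = correction_gains(6500.0, 3200.0)
```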
In telecine work the correction needed will be a format conversion and will not necessarily correspond to a change in color temperature. It may be that when a color balance is obtained on neutral or flesh tones a saturated color looks wrong. Some color correctors have chroma keying ability so that instead of applying global correction they can target a specific color and change it.