Is Gamma Still Needed?: Part 6 - Analyzing Gamma Correction In The Frequency Domain

To date, the explanations of gamma that are seen mostly restrict themselves to the voltage or brightness domain and very little has been published about the effects of gamma in the frequency domain. This is a great pity, because analysis in the frequency domain produces interesting results.

In all types of imaging, one of the criteria is the sharpness of the reproduced image. The technical term is resolution: the ability of the system to distinguish between closely spaced details in the picture rather than running them together.

Resolution, which is absolute and has units, should not be confused with definition, which is a marketing term: relative and meaningless. High definition is even more meaningless.

Fig.1 shows a simple imaging system, consisting of a film camera that produces transparencies for projection. As the image passes along the chain, imperfections cause resolution to be lost. The resolution seen on the screen is set by the worst of the two lenses and by the resolution of the film, which is limited by its finite grain size.

Fig.1 - A simple camera exposing transparency film for projection has three possible stages where resolution can be lost.

Television adds some complexity, because the image is scanned. In tube-type cameras, the equivalent of the grain size of film was the size of the scanning spot, which would be convolved with the detail in the image. Modern sensors use discrete photosites that should be comparable in size to the point spread function of the lens so that both have about the same resolution.

The scanning process in a television camera converts spatial frequencies into temporal frequencies. The frequency is the product of the scanning speed and the spatial frequency. Fig.2 shows that in a tube-type TV camera resolution can be lost in the lens, further lost in the finite size of the scanning spot and further lost if sufficient bandwidth is not made available for the video signal.

In a modern sensor, the rate at which the photosites are scanned becomes the sampling frequency, and according to sampling theory the highest signal frequency cannot exceed half the sampling frequency. For example, in 601, the luma sampling rate was 13.5MHz and that offered a video bandwidth of about 6MHz.
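
As a rough worked example of that arithmetic, consider the sketch below. The function and its units are purely illustrative; only the 13.5MHz figure comes from Rec. 601.

```python
# Sketch of the scanning and sampling arithmetic described above.
def temporal_frequency(scan_speed_m_per_s: float, spatial_freq_cycles_per_m: float) -> float:
    """Temporal frequency produced by scanning a pattern:
    the product of scanning speed and spatial frequency."""
    return scan_speed_m_per_s * spatial_freq_cycles_per_m

fs_luma = 13.5e6                 # Rec. 601 luma sampling rate, Hz
nyquist = fs_luma / 2            # sampling theory: highest usable signal frequency
print(f"Rec. 601 Nyquist limit: {nyquist / 1e6:.2f} MHz")
# A realisable anti-aliasing filter needs some roll-off room below 6.75MHz,
# which is why roughly 6MHz was the usable video bandwidth in practice.
```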

Fig.2 - In a traditional tube camera, the lens and the scanning spot size can impair resolution and any bandwidth limit applied to the video signal will also limit horizontal resolution.

TV changed from lines to pixel count as a metric, but a metric for what? We might think that lines or pixel count put a bound on the achievable resolution. In a linear system that is true. In a system that uses gamma, it is unfortunately not true.

Fig.3 shows that a test chart can be made with black and white stripes, and these would create a square wave at the output of an ideal camera sensor. However, the optical system of the lens and the aperture effect of the sensor combine to make the modulation transfer function that of a low-pass filter. As we move the chart further away, more of the harmonics of the square wave are lost and eventually we are left with only the fundamental. The frequency of that fundamental tells us the horizontal resolution of the camera.
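
The limiting case can be sketched numerically. The following Python fragment is only an illustration (the sample rate, stripe frequency and cut-off are arbitrary choices): a square wave whose harmonics are removed by a low-pass response collapses to its fundamental sine wave.

```python
import numpy as np

fs = 1000.0                                   # arbitrary sample rate for the illustration
t = np.arange(0, 1, 1 / fs)
f_chart = 50.0                                # frequency of the stripe pattern (arbitrary)

square = np.sign(np.sin(2 * np.pi * f_chart * t))   # ideal black/white stripes

# Crude stand-in for the MTF of lens plus sensor aperture: remove every
# component above an arbitrary cut-off in the frequency domain.
cutoff = 120.0
spectrum = np.fft.rfft(square)
freqs = np.fft.rfftfreq(len(square), 1 / fs)
spectrum[freqs > cutoff] = 0
filtered = np.fft.irfft(spectrum, n=len(square))

# With the third harmonic (150 Hz) and above gone, only the 50 Hz
# fundamental survives: the square wave has become a sine wave.
mag = 2 * np.abs(np.fft.rfft(filtered)) / len(filtered)
for f in (50, 150, 250):
    print(f"{f} Hz component: {mag[np.argmin(np.abs(freqs - f))]:.3f}")
```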

This is the basis of modulation transfer function testing used extensively for lenses. It's based on Fourier analysis, which in turn is based on the linear addition of sine waves of various frequencies to synthesize waveforms. In other words, all of the mathematical relationships between lens and sensor resolution, waveform and spectrum, and scanning speed and bandwidth only apply when all of the transfer functions concerned are linear. William Schreiber pointed that out as early as 1986.

Fig.3 - In Modulation Transfer Function testing, an ideal test card will produce a square wave if there is infinite resolution and bandwidth in the camera. In practice, as the spatial frequency increases the square wave will lose its harmonics and become a sine wave, but only in a linear system.

As is well known, gamma is a powerfully non-linear transfer function that is also asymmetrical: the effect at the black end of the luma gamut is quite different from the effect at the white end. This contrasts, for example, with a symmetrical non-linearity such as the clipping of an over-driven audio amplifier.

When gamma correction is applied in television, none of the assumptions of linear information theory apply any more. The non-linearity of gamma introduces harmonics and increases the video bandwidth needed for a given resolution.

Fig.4a) shows the effect of putting a sine wave into a gamma corrector. The output waveform has been changed dramatically. We would expect the non-linearity to introduce a third harmonic, and the asymmetry to introduce a second harmonic.

Fig.4b) shows, for comparison, a sine wave at twice the input frequency. The existence of a significant amount of second harmonic content in Fig.4a) is not in doubt.
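
A short numerical sketch makes the same point. Here a pure power law with an assumed exponent of 0.45 stands in for the gamma corrector (real correctors such as the Rec.709 OETF add a linear toe, but the harmonic structure is similar):

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
f0 = 50.0                                     # fundamental frequency (arbitrary)

# A sine wave swinging between black (0) and white (1), as in the
# limiting case of Fig.3.
linear = 0.5 + 0.5 * np.sin(2 * np.pi * f0 * t)

# Gamma correction modelled as a pure power law (assumed exponent 0.45).
corrected = linear ** 0.45

# Measure the harmonic content created by the non-linearity.
mag = 2 * np.abs(np.fft.rfft(corrected)) / len(corrected)
freqs = np.fft.rfftfreq(len(corrected), 1 / fs)
for harmonic in (1, 2, 3):
    f = harmonic * f0
    print(f"harmonic {harmonic} ({f:.0f} Hz): {mag[np.argmin(np.abs(freqs - f))]:.4f}")
```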

Once upon a time cameras used analog signal processing between the sensor and the video output, but digital processing cameras were soon introduced because of their greater stability. The output of the sensor was digitized and then all processing could be done by manipulating numbers. In addition to other steps, the gamma correction would be done in the digital domain. One would think this would be easy, requiring only a look-up table, a piecewise linear approximation or some other suitable algorithm.
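
The correction itself really is that simple in isolation. A minimal look-up-table sketch for 10-bit code values might look like this (the pure power law is again an assumption standing in for whichever transfer function a particular camera uses):

```python
import numpy as np

BITS = 10
CODES = 1 << BITS                                # 1024 code values for 10-bit video

# Build the table once: normalise each input code, apply the transfer
# function, and scale back to integer code values.
lut = np.round(((np.arange(CODES) / (CODES - 1)) ** 0.45) * (CODES - 1)).astype(np.uint16)

def gamma_correct(samples):
    """Apply gamma correction to an array of 10-bit luma samples."""
    return lut[samples]

# Example: mid-grey (code 512) comes out much brighter after correction.
print(gamma_correct(np.array([0, 256, 512, 1023])))
```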

Unfortunately it is not that easy, because the harmonics caused by the non-linearity cause problems. Fig.5a) shows the spectrum of a sampling system where aliasing has been prevented using a band-limiting filter prior to the ADC. Fig.5b) shows what happens when gamma correction is attempted. The harmonics are generated after the anti-aliasing filter, and the lower sideband that reflects from the sampling rate now folds back into the baseband.

Fig.4 - At a) the asymmetry of gamma correction crushes the positive excursions of a sine wave and sharpens the negative excursions. Note the similarity of those negative excursions with the second harmonic sine wave shown in b).

As a result, digital processing cameras that perform gamma correction have to use oversampling to extend the spectrum above the baseband, so that the harmonics due to the non-linearity fit within the available bandwidth instead of folding back into the baseband. Once that is done, a combined low-pass filter and decimator can return the signal to the desired output sampling rate, such as 13.5MHz in SDTV.
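
A minimal sketch of that oversampling structure is shown below; the interpolation factor, the use of scipy's polyphase resampler and the gamma exponent are all assumptions for illustration, not the design of any particular camera.

```python
import numpy as np
from scipy.signal import resample_poly

def oversampled_gamma(luma, factor=4, gamma=0.45):
    """Gamma-correct a sampled luma signal without folding the new
    harmonics back into the baseband.

    1. Upsample so the spectrum has headroom above the original Nyquist limit.
    2. Apply the non-linearity at the higher rate, where its harmonics
       still fit below the new Nyquist limit.
    3. Low-pass filter and decimate back to the original rate.
    """
    high_rate = resample_poly(luma, factor, 1)         # step 1: interpolate
    corrected = np.clip(high_rate, 0.0, 1.0) ** gamma  # step 2: non-linearity
    return resample_poly(corrected, 1, factor)         # step 3: filter + decimate

# Example: a full-amplitude sine near the original Nyquist limit.
fs = 27.0e6                                            # arbitrary rate for the illustration
t = np.arange(2048) / fs
luma = 0.5 + 0.5 * np.sin(2 * np.pi * 6.0e6 * t)
print(oversampled_gamma(luma).shape)
```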

The first reference I could find to oversampling gamma correctors is in a Sony patent dating from 1996, which explains the bandwidth extension caused by gamma. It's US 6,515,699 and makes interesting reading.

It is useful to consider the effect of gamma on a minimal black and white TV system. After the camera comes the gamma corrector, and then there is a low-pass filter that keeps the signal within the TV channel. The receiver has a gamma characteristic that opposes most of the gamma correction, leaving a small overall gamma.

If the sine wave emerging from the limiting case of Fig.3 represents the camera output, the gamma corrector will produce the waveform of Fig.4a), which contains second and third harmonics. If we have a test card that is near the limit for NTSC, the fundamental will be just below 4.2MHz. The second and third harmonic content due to gamma correction is filtered out by the band-limiting filter.

What arrives at the receiver cannot be the gamma corrected waveform of Fig.4a), because the necessary harmonics have been removed. All that can be left is the fundamental, so we are back to a sine wave again! The sine wave is reduced in level because the non-linearity transferred some of the fundamental energy into the harmonics, which are now gone.

At the receiver, that sine wave is subject to the gamma of the display, so the resultant light waveform cannot be a sine wave.

We must conclude that gamma is only reversible if the gamma corrected signal is not band limited. Band limiting a gamma corrected signal causes distortion as well as high frequency loss. The widely held assumption that gamma and inverse gamma cancel out is simply not true in television, where there is always a bandwidth limit, either due to channel width in analog broadcasts or to the anti-aliasing filters in digital systems.
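
The whole argument can be condensed into a short numerical check. Pure power laws are assumed for both ends, with the display exponent chosen as the exact inverse of the camera exponent so that, without the band limit, the two would cancel perfectly:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
f0 = 50.0

scene = 0.5 + 0.5 * np.sin(2 * np.pi * f0 * t)   # sine from the limiting case of Fig.3
corrected = scene ** 0.45                        # camera gamma correction (assumed exponent)

# Channel band limit: keep only the fundamental, as the broadcast
# low-pass filter would for a chart near the resolution limit.
spec = np.fft.rfft(corrected)
freqs = np.fft.rfftfreq(len(corrected), 1 / fs)
spec[freqs > 1.5 * f0] = 0
received = np.fft.irfft(spec, n=len(corrected))

# Display gamma: the exact inverse of the camera exponent.
displayed = np.clip(received, 0.0, 1.0) ** (1 / 0.45)

# If gamma and inverse gamma cancelled, 'displayed' would equal 'scene'.
print(f"peak error between scene and displayed light: {np.max(np.abs(displayed - scene)):.3f}")
```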

Fig.5 - In a normally operating sampling system, a), the anti-aliasing filter prior to the ADC prevents the lower sideband from interfering with the baseband signal. However, if the samples are gamma corrected, as in b), the second harmonic created by the non-linearity arises after the filter, and the extended bandwidth folds back and results in aliasing.

One common test applied to a video channel is the multi-burst signal, which consists of bursts of various frequencies that are produced in a generator and used to test the response of a video transmission channel. If the frequency response isn't flat, one or more of the bursts will fall in level.

However, it is important to realize that the multi-burst signal consists of sine waves that have been gated to make the bursts. They have not been gamma corrected; they remain linear. So if you think that the multi-burst signal is measuring the resolution of your video system, you are wrong. It is measuring the frequency response of the video signal path, and in the presence of gamma the two are different.

If you don't believe that, try passing multi-burst through a gamma convertor, a box that converts between HLG, PQ or SDR, and see what happens. As these formats have different gammas, the conversion must be non-linear and the result is not pretty. It's not a valid test, as a non-linear device doesn't have a conventional frequency response to measure.
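
As a rough sketch of why the reading is meaningless, the fragment below builds a simplified multi-burst and passes it through a single power law standing in for the converter's non-linearity (the burst frequencies and exponent are arbitrary assumptions). The burst amplitudes all change even though nothing frequency-selective has happened, so interpreting them as a frequency response tells you nothing; a real, band-limited converter distorts things further.

```python
import numpy as np

fs = 27.0e6
n = 8192
t = np.arange(n) / fs

# Simplified multi-burst: gated, linear (not gamma-corrected) sine bursts
# at increasing frequencies on a mid-grey pedestal.
burst_freqs = [0.5e6, 1.0e6, 2.0e6, 3.0e6, 4.0e6, 5.0e6]
segment = n // len(burst_freqs)
signal = np.full(n, 0.5)
for i, f in enumerate(burst_freqs):
    sl = slice(i * segment, (i + 1) * segment)
    signal[sl] += 0.4 * np.sin(2 * np.pi * f * t[sl])

# Stand-in for a non-linear format conversion (assumed exponent).
converted = np.clip(signal, 0.0, 1.0) ** 0.6

# Peak-to-peak of each burst before and after the non-linearity.
for i, f in enumerate(burst_freqs):
    sl = slice(i * segment, (i + 1) * segment)
    print(f"{f / 1e6:.1f} MHz burst: in {np.ptp(signal[sl]):.2f}, out {np.ptp(converted[sl]):.2f}")
```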

How significant is this problem in practice? Well, resolution is limited by the weakest link in the chain, and in today's television standards the weakest link is the inadequate frame rate that causes smear in the presence of motion and makes high pixel counts pointless. High Definition describes the TV set, not the picture. The damage gamma does to resolution is largely concealed beneath the greater damage elsewhere. Interlace further concealed the problem in legacy formats.

In a typical video signal, smeared images occupy the bottom of the spectrum, leaving plenty of room above for the harmonics introduced by gamma.
