High Dynamic Range (HDR) has been getting a lot of attention lately. Dynamic range is the ratio between the blackest black and the brightest white that a display can reproduce. High Dynamic Range is the next major step in improving television pictures.
We have been on a path to improving video pictures since television broadcasting began in the early 1930s. Black-and-white TV was followed by the introduction of color in 1953. Today's ATSC digital broadcast standard was established in 1998, adding the 720p and 1080i resolutions. This was well before HDR was proposed. The second-generation digital broadcast standard, ATSC 3.0, adds many new capabilities, including transmission of HDR, 1080p, UHD/4K and Wide Color Gamut.
Unfortunately, there are still misconceptions about HDR. Much of today’s HDR discussion is connected to UHD/4K. This is mostly due to the consumer television industry. If you want to buy a new TV with HDR, it will also be a 4K TV. However, HDR is independent of resolution and it applies equally to HD, 4K and 8K.
High Dynamic Range improves High Definition video just as it does higher resolutions. The Average Picture Level (APL) is similar to that of SDR; only the peak video is brighter. HDR is more than brighter pictures. The biggest benefit is more detail in the mid-tones and dark areas of the picture. This new technology is not another format war; it is a range of HDR standards that all accomplish similar things.
One of the important HDR profiles is Hybrid Log-Gamma (HLG), developed for live TV. HLG is more backward-compatible with SDR displays than PQ. Finally, some people assume video HDR is created like still-image HDR. Still-image HDR takes multiple exposures and combines them into one image with the best of each. Video runs at frame rates of up to 60 frames per second, so there is no time to combine multiple exposures. Instead, video HDR uses one of several transfer functions, or gamma curves.
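To make the transfer-function idea concrete, here is a minimal sketch of the HLG opto-electronic transfer function (OETF) as defined in ITU-R BT.2100. The constants and the two-segment curve are from the standard; the function names are our own for illustration.

```python
import math

# HLG OETF constants per ITU-R BT.2100.
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    """Map linear scene light E (0..1) to the non-linear HLG signal E' (0..1)."""
    if e <= 1.0 / 12.0:
        # Square-root segment: behaves like a conventional gamma curve,
        # which is why HLG dark tones stay compatible with SDR displays.
        return math.sqrt(3.0 * e)
    # Logarithmic segment: compresses highlights into the remaining signal range.
    return A * math.log(12.0 * e - B) + C

print(hlg_oetf(0.0))         # 0.0  -- black maps to zero signal
print(hlg_oetf(1.0 / 12.0))  # 0.5  -- crossover between the two segments
print(round(hlg_oetf(1.0), 4))  # 1.0 -- peak scene light maps to peak signal
```

The two segments meet smoothly at the crossover, which is the "hybrid" in Hybrid Log-Gamma: a gamma-like lower half for SDR compatibility and a log upper half for HDR highlights.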
Human Visual System
HDR is a major improvement in visible picture performance because it more closely matches what the human eye sees in real life. In a single scene, without adaptation, the eye can see about 12 to 15 f-stops, a contrast ratio of about 30,000:1, and that is what HDR is capable of.
However, the complete range of sensitivity of the Human Visual System (HVS) is non-linear and enormous, at about 24 stops (over 1,000,000:1). The combination of the eye and brain allows us to see everything from starlight to sunlight. We will never be able to produce that kind of dynamic range in TV displays.
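The stops and contrast-ratio figures above are two expressions of the same measurement: each f-stop is a doubling of light, so the ratio is simply two raised to the number of stops. A quick sketch of the conversion (the function names are ours):

```python
import math

def stops_to_ratio(stops: float) -> float:
    # Each f-stop doubles the light, so n stops => 2**n : 1 contrast.
    return 2.0 ** stops

def ratio_to_stops(ratio: float) -> float:
    # Inverse: how many doublings fit into a given contrast ratio.
    return math.log2(ratio)

print(round(stops_to_ratio(15)))            # 32768 -- close to the ~30,000:1 HDR figure
print(round(ratio_to_stops(30_000), 1))     # ~14.9 stops
print(round(ratio_to_stops(1_000_000), 1))  # ~19.9 stops
```

Note that 1,000,000:1 works out to roughly 20 stops, which is why published figures for the total range of the HVS vary between sources; the exact number depends on how adaptation is counted.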
The human visual system's raw image from the retina is actually rather poor. It is full-color and detailed in the center, but at the periphery the image becomes less detailed and monochromatic. However, the visual cortex does some very impressive processing. It is the brain that creates a 360-degree, 3D representation of the space around us, all in detailed color.
ITU-R BT.2100 Technical Standard For HDR
All profiles of HDR conform to this International Telecommunication Union (ITU) recommended standard. The HDR recommendation is not specific to any resolution or HDR profile. This recommendation specifies HDR video image parameters for use in production and international program exchange using the Perceptual Quantization (PQ) and Hybrid Log-Gamma (HLG) methods.
The preface of the ITU-R BT.2100 standard document is well-written. It says: “HDR provides substantially increased [peak] display brightness offering detail in highlights and reflecting objects, it also provides greater detail in dark areas. The HDR image formats should have, where appropriate, a degree of compatibility with existing workflows and infrastructure. Modern displays are capable of reproducing images at a higher luminance, greater contrast ratio and wider color gamut than is conventionally employed in program production.” It refers to the limited performance of camera and display technology based on an 85-year-old standard for CRT brightness. Today's SDR TVs and monitors can do much more. In an industry that has moved so quickly in other areas, it is a surprise that this old and obsolete specification is still our video production standard.
In the simulated pictures, the left SDR image is overexposed in the sky and there is little detail or color. The red circle shows the missing TV antenna that we can see in the HDR picture, along with the detail in the clouds. This has a much different feeling than the flat, blown-out sky on the left. Some detail is also lost in the shadows under the trees along the street. The actual HDR picture would be about 10 times brighter in the sky and peak highlights. HDR produces superior highlight handling and improved detail in mid-tones. The greater contrast produces a subjectively sharper picture with more saturated color, as seen in the blue parking sign.
An important consideration related to the value of HD and HDR is TV viewing distance. Worldwide, the average TV viewing distance is about 9 feet, or 3 meters. This is too far away to see the full resolution of UHD/4K. The distance of 9 feet was established in the 1970s by RCA Labs. What is a little surprising is that back then we were watching 21- to 27-inch CRT TVs, yet the viewing distance has not changed. For comparison, the maximum distance at which full UHD/4K resolution is visible is about 1.5 picture heights. That is about 4 feet from a 65-inch TV. The optimum viewing distance for UHD/4K at 9 feet would require a 105-inch TV. Some years ago, Samsung marketed a 105-inch 4K TV for $120K. As you might imagine, it was not a big seller and has been discontinued. It would be cheaper to sit closer! For some reason we don't, and probably never will.
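The 4-foot figure for a 65-inch TV follows directly from the 1.5-picture-heights rule and 16:9 screen geometry. A small sketch of that arithmetic (function names are ours, and the 1.5-heights figure is the rule of thumb quoted above):

```python
import math

# For a 16:9 display, picture height = diagonal * 9 / sqrt(16^2 + 9^2).
HEIGHT_FACTOR = 9 / math.sqrt(16**2 + 9**2)  # ~0.49

def picture_height_in(diagonal_in: float) -> float:
    """Picture height in inches for a 16:9 screen of the given diagonal."""
    return diagonal_in * HEIGHT_FACTOR

def max_4k_distance_ft(diagonal_in: float, heights: float = 1.5) -> float:
    """Farthest viewing distance (feet) at which full 4K detail is
    resolvable, using the 1.5-picture-heights rule of thumb."""
    return heights * picture_height_in(diagonal_in) / 12.0

print(round(max_4k_distance_ft(65), 1))  # ~4.0 ft for a 65-inch TV
```

Run the same calculation in reverse for a 9-foot couch and the required screen becomes far larger than any mainstream living-room TV, which is the article's point: at typical distances, HDR is visible where extra resolution is not.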
The significance of viewing distance is that HDR, unlike resolution, is easily visible at average viewing distances. In addition, HD-HDR does not require significant changes in workflow, storage or bandwidth. HD-HDR at 1080p requires just a fraction of the bandwidth of UHD/4K.
UHD - Increased spatial resolution. Mostly driven by Consumer Electronics. As resolution goes up, motion blur becomes more visible, requiring a Higher Frame Rate. Doubling the frame rate doubles the bit rate.
HDR - High Dynamic Range: More detail in blacks, better highlights without clipping and, most importantly, expanded mid-tones. Increased contrast gives the appearance of sharper pictures with more saturated colors.
Wide Color Gamut - BT.2020 & DCI-P3: Better color, especially greens. Wide Color Gamut makes a very noticeable difference in high-chroma, high-brightness content, like metallics.
Bit Depth - Reduction of contouring or banding artifacts in tone and color.
Remember, “Bits are Bucks”: the greater the bit rate, the more it costs, and HDR provides the biggest bang for the buck.
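The bit-depth point is easy to quantify: each extra bit doubles the number of code values per channel, so the step between adjacent levels shrinks and smooth gradients band less. A quick illustration:

```python
# Code values per channel at common video bit depths. More (and therefore
# finer) steps between black and peak white means less visible banding
# in smooth gradients such as skies -- which is why HDR uses 10 or 12 bits.
for bits in (8, 10, 12):
    levels = 2 ** bits
    step_pct = 100.0 / (levels - 1)  # size of one quantization step
    print(f"{bits}-bit: {levels} code values, {step_pct:.3f}% per step")
```

An 8-bit signal has only 256 levels per channel; 10-bit quadruples that to 1024, which is one reason HDR profiles specify a minimum of 10 bits.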
In the next installment, we will discuss all the details of HDR in Part 2, Advanced HDR.