Everyone shopping for a 4K UHD television set these days faces the same questions: what is high dynamic range (HDR), should I buy it, and what are the different flavors? What are the practical advantages? Here’s some guidance.
Even many of the lowest-cost television sets advertise HDR these days. Many TV dealers call it a “must have” feature, but rarely explain it well. If a set has HDR, it is usually HDR10. But there is another form of HDR, called Dolby Vision, which delivers a far superior picture.
Quite simply, HDR reproduces a greater dynamic range of luminosity than is possible with standard imaging. Its aim is to present a range of luminance closer to that experienced through the human visual system. When 4K television became mainstream, there was a need to extend its dynamic range to make home consumers feel they were getting a better entertainment experience. Higher resolution alone wasn’t enough to drive the sales of 4K TV sets.
For that reason, HDR10 was announced on August 27, 2015 by the Consumer Technology Association. It uses the wide-gamut Rec. 2020 color space, a bit depth of 10 bits and the SMPTE ST 2084 (PQ) transfer function – a combination that later HDR formats also adopted.
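For readers curious about what the PQ transfer function actually does: it maps a normalized signal value to an absolute luminance in nits, allocating most of the code values to darker tones where the eye is most sensitive. A minimal Python sketch, using the constants published in the ST 2084 specification:

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF: maps a normalized
# signal value in [0, 1] to absolute luminance in cd/m^2 (nits).
# Constants are those defined in the ST 2084 specification.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal value to luminance in nits."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)

# A full-scale signal corresponds to the 10,000-nit ceiling,
# while a half-scale signal sits down around 92 nits --
# PQ spends most of its code values on shadows and midtones.
print(round(pq_eotf(1.0)))   # 10000
print(round(pq_eotf(0.5)))   # 92
```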
It also uses SMPTE ST 2086 "Mastering Display Color Volume" static metadata to send color calibration data of the mastering display, as well as MaxFALL (Maximum Frame Average Light Level) and MaxCLL (Maximum Content Light Level) static values, encoded as SEI messages within the video stream. Because this metadata is static – fixed once for the entire program – it is an important limitation of the first standard.
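To make those two static values concrete: MaxCLL is the brightest single pixel anywhere in the program, while MaxFALL is the highest per-frame average brightness. A simplified illustration (real tools work on decoded video, not flat lists of nits, and the function name here is hypothetical):

```python
def max_cll_and_fall(frames):
    """Compute the two HDR10 static values for a program.

    `frames` is a list of frames, each a flat list of per-pixel
    luminance values in nits (a simplification for illustration).
    MaxCLL: the brightest pixel anywhere in the program.
    MaxFALL: the highest per-frame average light level.
    """
    max_cll = max(max(frame) for frame in frames)
    max_fall = max(sum(frame) / len(frame) for frame in frames)
    return max_cll, max_fall

# Two tiny 4-pixel "frames": the first has one bright highlight,
# the second is uniformly brighter on average.
frames = [[100, 100, 100, 1000], [400, 400, 400, 400]]
print(max_cll_and_fall(frames))  # (1000, 400.0)
```

Note that a single bright highlight in one frame sets MaxCLL for the whole program, which is exactly why static metadata forces the compromises described later in this article.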
HDR10 is an open standard supported by a wide variety of companies, which includes monitor and TV manufacturers such as Dell, LG, Samsung, Sharp, Sony, and Vizio. It is the most common HDR standard now found in new 4K television sets.
An improved standard, HDR10 Plus (HDR10+), was announced on April 20, 2017 by Samsung and Amazon Video. It updates HDR10 by adding dynamic metadata, based on Samsung's SMPTE ST 2094-40 application. The dynamic metadata is additional data that can be used to more accurately adjust brightness levels on a scene-by-scene or frame-by-frame basis. HDR10+ is also an open standard and is royalty-free.
For the best technology available today, there is Dolby Vision, an HDR format from Dolby Laboratories that can be optionally supported by Ultra HD Blu-ray discs and streaming video services. Dolby Vision is a proprietary format and the royalty cost for Dolby Vision is less than $3 per TV.
Dolby Vision includes the Perceptual Quantizer (SMPTE ST 2084) electro-optical transfer function, up to 4K resolution and a wide-gamut color space (ITU-R Rec. 2020). The two main differences from HDR10 are that Dolby Vision has a 12-bit color depth and dynamic metadata. The format supports a maximum brightness of up to 10,000 nits, although content is typically mastered to 4,000 nits in practice.
Dolby Vision can encode mastering display colorimetry information using static metadata (SMPTE ST 2086) but also provides dynamic metadata (SMPTE ST 2094-10, the Dolby format) for each scene. TVs that support Dolby Vision include models from LG, TCL and Vizio. Apple’s Apple TV 4K supports both HDR10 and Dolby Vision. The support of dynamic metadata is an important difference in Dolby Vision.
HLG, or Hybrid Log-Gamma, is another HDR standard, jointly developed by the BBC and NHK. It is backward-compatible with standard dynamic range (SDR) displays, although it requires 10-bit color depth.
HLG defines a non-linear opto-electrical transfer function (OETF) in which the lower half of the signal values use a gamma curve and the upper half of the signal values use a logarithmic curve. The HLG standard is also royalty-free.
HLG is supported by ATSC 3.0, Digital Video Broadcasting (DVB) UHD-1 Phase 2, HDMI 2.0b, HEVC and VP9. It is also supported by video services such as the BBC iPlayer, Freeview Play and YouTube.
No wonder HDR is so complicated for consumers trying to purchase a 4K TV set in stores. Unless you are an engineer, the differences are difficult to explain or even understand. Today, however, Dolby Vision quite simply produces the best image quality of any of the standards (that could change, of course, in a month or so).
This is because of that dynamic metadata mentioned earlier. Dolby Vision operates off a metadata stream – instructions that tell the TV how to render each frame of the image. HDR10, in its current form, sends only one set of instructions at the start of the program, which must serve as an average for the entire show. That means there’s a lot of compromise in HDR10 and it’s not nearly as accurate.
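The practical effect of per-scene metadata can be shown with a toy tone-mapping example (the linear scaling below is a hypothetical simplification, not Dolby's actual algorithm). With static metadata, the TV must compress every scene against one program-wide peak; with dynamic metadata, a scene mastered to a lower peak keeps its full range:

```python
def tone_map(pixel_nits, scene_peak_nits, display_peak_nits=600):
    """Toy tone mapper: linearly compress a pixel when the scene's
    mastered peak exceeds what the display can reproduce."""
    if scene_peak_nits <= display_peak_nits:
        return pixel_nits  # scene fits on the display; no compression
    return pixel_nits * display_peak_nits / scene_peak_nits

pixel = 300  # a 300-nit highlight in a dark scene

# Static metadata: only the program-wide peak (say 4,000 nits) is
# known, so even this dark scene gets compressed on a 600-nit TV.
print(tone_map(pixel, scene_peak_nits=4000))  # 45.0

# Dynamic metadata: this scene's own peak (say 500 nits) is known,
# and it fits on the display unchanged.
print(tone_map(pixel, scene_peak_nits=500))   # 300
```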
Dynamic metadata is what rules HDR today. Dolby Labs is currently working with each manufacturer that uses its technology to get the best performance from the device it’s used in. That makes a big difference.
Since Dolby Vision is not free of charge, HDR10’s new plus version could eventually give Dolby a run for the money. But today, Dolby Vision is really the only game in town.