What Is Advanced HDR By Technicolor?
Converting HDR metadata to ancillary data allows production, transmission and display in SDR and HDR. Courtesy Cobalt Digital.
In the beginning, there was Standard Definition, and we lived with it for a good many years. But we wanted more and better content. Then came High Definition, with six times the resolution, and an aspect ratio similar to the screen of a movie theater. It was good and we enjoyed it.
Technology does not stand still, and we want even better viewing experiences in the comfort of our living room. This time, many advances are competing for our attention: higher resolution (4K, 8K and beyond), more colors (Wide Color Gamut), higher frame rates (for fast-moving content), and High Dynamic Range (HDR), to create more vivid pictures.
These technologies do not actually compete with each other; instead, they address different aspects of the viewing experience. Many people in the broadcast business, including the author of this article, believe that HDR is the technology that makes the most difference for the average viewer. We will focus on HDR and discuss the technology behind Advanced HDR by Technicolor, which has been adopted as part of ATSC 3.0.
A Quick Overview Of HDR
Dynamic range is the ratio between the highest and lowest luminance values, or intensities of light, emitted from a display. Essentially, it is the ratio between the “whitest white” and the “blackest black.” Conventional standard displays have a dynamic range of approximately 100:1, which is well below what the human eye can perceive. High Dynamic Range is simply a means to increase this ratio and better match human perception, providing a more pleasant and realistic experience.
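To put those ratios in perspective, here is a small worked example; the nit values are illustrative assumptions, not measurements of any particular display.

```python
import math

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Dynamic range as the ratio of brightest to darkest luminance."""
    return peak_nits / black_nits

def stops(ratio: float) -> float:
    """Express a contrast ratio in photographic stops (doublings of light)."""
    return math.log2(ratio)

# Illustrative numbers: a 100-nit SDR display with a 1-nit black floor
# versus a 1000-nit HDR display with a 0.01-nit black floor.
sdr = contrast_ratio(100, 1.0)    # 100:1, about 6.6 stops
hdr = contrast_ratio(1000, 0.01)  # 100,000:1, about 16.6 stops
print(f"SDR: {sdr:.0f}:1 ({stops(sdr):.1f} stops)")
print(f"HDR: {hdr:.0f}:1 ({stops(hdr):.1f} stops)")
```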
One important difference between Standard Dynamic Range (SDR) and HDR is that, while in SDR the luminance is relative (100% means “give me your best shot at white”), in most HDR formats the luminance is absolute and indicates actual light intensity, using a non-linear transfer function based on what the eye can perceive. That transfer function is the SMPTE ST 2084 Perceptual Quantizer (PQ). Most native HDR signals are in 10-bit PQ format, referred to as PQ10.
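For the curious, the PQ curve itself is compact enough to sketch in a few lines. This follows the published ST 2084 formula; the quantization and range details of a full 10-bit PQ10 pipeline are left out.

```python
# SMPTE ST 2084 (PQ) EOTF constants.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code: float) -> float:
    """Map a normalized PQ code value (0..1) to absolute luminance in nits."""
    e = code ** (1.0 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1.0 / M1)

print(pq_eotf(0.0))    # 0 nits (black)
print(pq_eotf(0.508))  # ~100 nits, the traditional SDR peak luminance
print(pq_eotf(1.0))    # 10,000 nits, the PQ ceiling
```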
Since monitors may not be able to reach all luminance values (both dark and bright), some HDR formats include metadata to help the monitor perform the best match given its capabilities.
One HDR standard operates differently from the others. It is called “Hybrid Log Gamma” (HLG), developed by the BBC and NHK. Unlike PQ, it uses relative light intensity. It was designed in such a way that, when an HLG signal is applied to an SDR monitor, it will “look right,” and when applied to a monitor that understands the format, the HDR image comes out, without using any metadata. It is an interesting format, but the quality is not as good as the PQ-based HDR standards.
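A minimal sketch of the BT.2100 HLG opto-electronic transfer function shows where the name comes from: a gamma-like square-root segment for darker scene light and a logarithmic segment for brighter light, with relative rather than absolute input values.

```python
import math

# BT.2100 HLG OETF: maps normalized, relative scene-linear light (0..1)
# to a normalized signal value (0..1). No absolute nit values are involved.
A = 0.17883277
B = 1.0 - 4.0 * A                # 0.28466892
C = 0.5 - A * math.log(4.0 * A)  # 0.55991073

def hlg_oetf(e: float) -> float:
    """Hybrid curve: square-root below 1/12, logarithmic above."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

print(hlg_oetf(1.0 / 12.0))  # 0.5, where the two segments meet
print(hlg_oetf(1.0))         # 1.0, peak signal
```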
The issues listed in Figure 1 make live production in HDR a challenge – but there are solutions.
If you are working in HDR, the best approach is to produce everything in HDR, using PQ as your baseline signal. To do that, you must convert your legacy SDR signals and content to HDR, and then provide some solution to support both SDR and HDR emission. Advanced HDR by Technicolor has solutions for both these problems.
Ingesting SDR Into An HDR Workflow
Converting SDR into HDR is not as simple as it sounds. It is somewhat like taking black-and-white content and “converting” it to color: the information you would like to add is simply not there, and someone must make it up. That “someone” is normally a colorist, who looks at the scene and decides how best to display it.
The simplest way to convert SDR to HDR is a 3D Look-Up Table (3D-LUT). This simply maps each SDR pixel value using a fixed function to some other value that makes sense in HDR. It works, but the results are not that great.
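To make the limitation concrete, here is a minimal sketch of applying a 3D-LUT. Nearest-neighbor lookup is used for brevity (real converters interpolate, typically trilinearly or tetrahedrally), and the 17-point grid with identity contents is a placeholder assumption; the point is that every input color always maps to the same output, regardless of what is in the frame.

```python
import numpy as np

LUT_SIZE = 17  # a common grid size; contents here are a placeholder identity mapping
grid = np.linspace(0.0, 1.0, LUT_SIZE)
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

def apply_3dlut(rgb: np.ndarray) -> np.ndarray:
    """Map normalized RGB pixels (H, W, 3) through the LUT, nearest-neighbor."""
    idx = np.clip(np.rint(rgb * (LUT_SIZE - 1)).astype(int), 0, LUT_SIZE - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

frame = np.random.rand(4, 4, 3)  # stand-in for a normalized SDR frame
converted = apply_3dlut(frame)   # same mapping for every frame, bright or dark
```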
The better alternative is to use Technicolor HDR Intelligent Tone Management (ITM). Rather than applying a fixed conversion, ITM analyzes each frame and chooses the best possible mapping dynamically, taking into account not just the frame itself, but past frames as well.
In fact, the ITM algorithms were “trained” on the work of top colorists using AI and Machine Learning. The ITM process can run by itself, or it can be tweaked by directors who want to influence the process to match their artistic vision. Cobalt Digital has implemented the ITM processes in the 9904-UDX-4K card, where, unlike the colorist, it can run live and in real time, 24x7x365.
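The ITM algorithms themselves are proprietary, so the following is emphatically not ITM. It is only a toy illustration of what “dynamic” means in contrast to a fixed LUT: per-frame statistics, temporally smoothed over past frames, drive the expansion curve.

```python
import numpy as np

class AdaptiveExpander:
    """Toy frame-adaptive SDR-to-HDR expansion; not Technicolor's ITM."""

    def __init__(self, smoothing: float = 0.9, peak_nits: float = 1000.0):
        self.smoothing = smoothing
        self.peak_nits = peak_nits
        self.avg_luma = None

    def process(self, sdr_luma: np.ndarray) -> np.ndarray:
        """Expand normalized SDR luma (0..1) to linear luminance in nits."""
        frame_mean = float(sdr_luma.mean())
        if self.avg_luma is None:
            self.avg_luma = frame_mean
        else:
            # Recursive temporal smoothing, so past frames influence the mapping.
            self.avg_luma = self.smoothing * self.avg_luma + (1 - self.smoothing) * frame_mean
        # Darker scenes get a gentler expansion, brighter scenes a stronger one.
        exponent = 1.0 + 1.5 * self.avg_luma
        return self.peak_nits * np.power(sdr_luma, exponent)
```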
Outputting HDR And SDR Simultaneously
You have now produced your live HDR content. It looks amazing since you used HDR cameras, and you converted any SDR sources into HDR using ITM. Now you need to distribute it. If every receiver out there were capable of HDR playout, you would be done. However, they are not. You have a mix of SDR and HDR receivers. Moreover, some of the older SDR receivers are only capable of 8-bit decoding, not 10-bit. That is your next challenge.
The first step for any solution is to generate an SDR version of the HDR signal. Here, again, one could use a fixed 3D-LUT, and many do. While it works at some basic level, the results are only average. A good example is a sports match that starts during the day and concludes at night under artificial lighting: a dynamic algorithm that adapts the conversion to the content will certainly produce a better result than a fixed table.
The second step is to generate, if possible, a single signal that is compatible with both SDR and HDR receivers. This needs to take into account that SDR devices are probably older and do not understand anything other than SDR.
Both these steps are achieved by SL-HDR1, which is part of Advanced HDR by Technicolor and implemented in real-time in the Cobalt 9904-UDX-4K card. SL-HDR1 has been adopted as part of ATSC 3.0 (A/341) and is an ETSI standard (TS 103 433-1). It works as follows:
- The HDR signal is converted to an SDR representation, using a dynamic process that optimizes the quality.
- As part of the conversion process, metadata is generated that allows a compatible receiver to recover the original HDR content from the SDR content.
- An SDR device will not be aware of the metadata and will simply display the SDR content.
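Conceptually, and only conceptually (the real reconstruction parameters and math are defined in ETSI TS 103 433-1), the round trip can be sketched with a toy invertible tone map standing in for the actual conversion.

```python
import numpy as np

def encode(hdr_nits: np.ndarray, peak_nits: float = 1000.0):
    """HDR (linear nits) -> SDR (normalized 0..1) plus reconstruction metadata."""
    gamma = 2.4  # toy parameter; the real metadata varies dynamically with content
    sdr = np.power(np.clip(hdr_nits / peak_nits, 0.0, 1.0), 1.0 / gamma)
    metadata = {"peak_nits": peak_nits, "gamma": gamma}
    return sdr, metadata

def legacy_display(sdr, metadata):
    return sdr  # SDR device: the metadata is simply ignored

def sl_hdr1_display(sdr, metadata):
    """Compatible device: recover the HDR signal from SDR plus metadata."""
    return metadata["peak_nits"] * np.power(sdr, metadata["gamma"])

hdr = np.array([0.05, 100.0, 1000.0])              # illustrative pixel luminances
sdr, md = encode(hdr)
assert np.allclose(sl_hdr1_display(sdr, md), hdr)  # round trip recovers the HDR values
```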
Inserting Metadata
In SDI signals, this metadata is inserted as ancillary data using SMPTE ST 2108. Legacy devices will ignore the metadata and simply display the SDR signal. HDR devices with SL-HDR1 support will reconstruct the HDR signal and display it. Conceptually, this is similar to what was done when color TV was introduced – the black-and-white signal was still there, and the color information was in a separate subcarrier, which was ignored by the older sets.
In compressed signals, the same metadata is inserted as registered user data in the video elementary stream. Legacy 10-bit devices will again simply ignore it and decode the SDR content, while compatible receivers will have the full HDR experience. For 8-bit HEVC devices, one option is to use Scalable HEVC (SHVC) with an 8-bit SDR base layer and an SL-HDR1 enhancement layer. Some broadcasters are even trying 8-bit AVC with SL-HDR1 metadata. The final workflow diagram is at the top of this story.
Dr. Noronha is Executive Vice President of Engineering at Cobalt Digital.