What Is Advanced HDR By Technicolor?

In the beginning, there was Standard Definition, and we lived with it for a good many years. But we wanted more and better content. Then came High Definition, with six times the resolution, and an aspect ratio similar to the screen of a movie theater. It was good and we enjoyed it.

Technology does not stand still, and we want even better viewing experiences in the comfort of our living room. This time, many advances are competing for our attention: higher resolution (4K, 8K and beyond), more colors (Wide Color Gamut), higher frame rates (for fast-moving content), and High Dynamic Range (HDR), to create more vivid pictures.

These technologies do not actually compete with each other; rather, they address different aspects of the viewing experience. Many people in the broadcast business, including the author of this article, believe that HDR is the technology that makes the most difference for the average viewer. We will focus on HDR and discuss the technology behind Advanced HDR by Technicolor, which has been adopted as part of ATSC 3.0.

A Quick Overview Of HDR
Dynamic range is the ratio between the lowest and highest luminance values, i.e., the intensity of light, emitted from a display. Essentially, it is the ratio between the “whitest white” and the “blackest black.” Conventional SDR displays have a dynamic range of approximately 100:1, well below what the human eye can perceive. High Dynamic Range is simply a means to increase this ratio to better match human perception, providing a more pleasant and realistic experience.
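These ratios are easier to compare when expressed in photographic stops (doublings of light). A minimal sketch, with illustrative luminance figures that are assumptions, not measurements of any particular display:

```python
import math

def contrast_in_stops(white_nits: float, black_nits: float) -> float:
    """Express a display's contrast ratio in photographic stops
    (each stop is a doubling of light intensity)."""
    return math.log2(white_nits / black_nits)

# A conventional SDR display: roughly 100:1 contrast
# (e.g. 100 nits white, 1 nit black -- illustrative numbers).
sdr_stops = contrast_in_stops(100.0, 1.0)     # about 6.6 stops

# An example HDR display: 1000 nits white down to 0.05 nit black = 20000:1.
hdr_stops = contrast_in_stops(1000.0, 0.05)   # about 14.3 stops
```

The jump from roughly 7 to roughly 14 stops is what brings the display closer to the range the eye can handle in a single scene.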

One important difference between Standard Dynamic Range (SDR) and HDR is that, while in SDR the luminance is relative (100% means “give me your best shot at white”), in most HDR formats the luminance is absolute and indicates actual light intensity, using a non-linear transfer function modeled on what the eye can perceive: the SMPTE ST 2084 Perceptual Quantizer (PQ). Most native HDR signals are in 10-bit PQ format, referred to as PQ10.
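As a sketch of how this works, the ST 2084 encoding curve (the inverse EOTF) fits in a few lines. The constants come from the standard; the 10-bit quantization shown is the simple full-range case, for illustration only:

```python
def pq_encode(luminance_nits: float) -> float:
    """SMPTE ST 2084 inverse EOTF: map absolute luminance
    (0 to 10,000 nits) to a non-linear PQ signal value in [0, 1]."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = luminance_nits / 10000.0          # normalize to the 10,000-nit PQ ceiling
    y_m1 = y ** m1
    return ((c1 + c2 * y_m1) / (1 + c3 * y_m1)) ** m2

# SDR reference white (100 nits) lands near 0.51 -- about half the PQ range,
# leaving the upper half of the code values for highlights.
signal = pq_encode(100.0)
code_10bit = round(signal * 1023)         # simple full-range 10-bit quantization
```

Note how strongly non-linear the curve is: 1% of the maximum luminance consumes roughly half the signal range, mirroring the eye's greater sensitivity in the shadows.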

Since monitors may not be able to reach all luminance values (both dark and bright), some HDR formats include metadata to help the monitor perform the best match given its capabilities.

One HDR standard operates differently from the others: Hybrid Log-Gamma (HLG), developed by the BBC and NHK. Unlike PQ, it uses relative light intensity. It was designed so that an HLG signal applied to an SDR monitor still “looks right,” while a monitor that understands the format displays the full HDR image, all without using any metadata. It is an interesting format, but its quality does not match that of the PQ-based HDR standards.
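The HLG transfer characteristic itself is compact. A sketch of the BT.2100 HLG OETF shows the design: gamma-like in the shadows (which is what keeps it watchable on SDR displays) and logarithmic in the highlights (which provides the HDR headroom):

```python
import math

def hlg_oetf(e: float) -> float:
    """BBC/NHK Hybrid Log-Gamma OETF (ITU-R BT.2100): map relative scene
    light E in [0, 1] to a non-linear signal value in [0, 1]."""
    a = 0.17883277
    b = 0.28466892                          # = 1 - 4a
    c = 0.55991073                          # = 0.5 - a * ln(4a)
    if e <= 1 / 12:
        return math.sqrt(3 * e)             # gamma-like segment: SDR-compatible
    return a * math.log(12 * e - b) + c     # log segment: highlight headroom
```

The two segments meet smoothly at E = 1/12, where the signal value is exactly 0.5; everything above that point carries the highlight detail an SDR display would simply clip.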

Figure 1: In terms of HDR live production, we are in a transition stage.

The issues listed in Figure 1 make live production in HDR a challenge – but there are solutions.

If you are working in HDR, the best approach is to produce everything in HDR, using PQ as your baseline signal. To do that, you must convert your legacy SDR signals and content to HDR, and then provide some solution to support both SDR and HDR emission. Advanced HDR by Technicolor has solutions for both these problems.

Ingesting SDR Into An HDR Workflow
Converting SDR into HDR is not as simple as it sounds. It is somewhat like taking black-and-white content and “converting” it to color: the information you would like to add is simply not there, and someone must make it up. That “someone” is normally a colorist, who looks at the scene and decides how best to display it.

The simplest way to convert SDR to HDR is a 3D Look-Up Table (3D-LUT), which maps each SDR pixel value, through a fixed function, to a value that makes sense in HDR. It works, but the results are not great.
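To make the fixed-mapping idea concrete, here is a minimal trilinear 3D-LUT lookup. The two-node identity LUT at the end is purely illustrative; a real conversion LUT would have many more nodes and a carefully designed mapping:

```python
def apply_3d_lut(rgb, lut):
    """Apply a static 3D-LUT to one RGB pixel in [0, 1] using trilinear
    interpolation. `lut[r][g][b]` holds an (R, G, B) output triple; the
    table is fixed, so the mapping cannot adapt to the content."""
    n = len(lut)
    pos = [min(max(c, 0.0), 1.0) * (n - 1) for c in rgb]
    lo = [int(p) for p in pos]
    hi = [min(l + 1, n - 1) for l in lo]
    f = [p - l for p, l in zip(pos, lo)]    # fractional position in the LUT cell
    out = [0.0, 0.0, 0.0]
    for corner in range(8):                 # blend the 8 surrounding lattice nodes
        idx = [hi[k] if (corner >> k) & 1 else lo[k] for k in range(3)]
        w = 1.0
        for k in range(3):
            w *= f[k] if (corner >> k) & 1 else 1.0 - f[k]
        node = lut[idx[0]][idx[1]][idx[2]]
        for k in range(3):
            out[k] += w * node[k]
    return out

# Toy 2-node-per-axis identity LUT: output equals input.
identity = [[[(r, g, b) for b in (0.0, 1.0)] for g in (0.0, 1.0)]
            for r in (0.0, 1.0)]
```

Because the table is baked in ahead of time, every frame gets the same treatment, which is exactly the limitation the article describes.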

The better alternative is to use Technicolor HDR Intelligent Tone Management (ITM). ITM does not perform a fixed conversion; it analyzes each frame and dynamically chooses the best possible mapping, taking into account not just the current frame but past frames as well.

In fact, the ITM algorithms were “trained” from the work of top colorists using AI and Machine Learning. The ITM process can run by itself, or it can be tweaked by directors who want to influence the process to match their artistic vision. Cobalt Digital has mapped the ITM processes into the 9904-UDX-4K card, where, unlike the colorist, it can run live and in real-time 24x7x365.

Outputting HDR And SDR Simultaneously
You have now produced your live HDR content. It looks amazing since you used HDR cameras, and you converted any SDR sources into HDR using ITM. Now you need to distribute it. If every receiver out there were capable of HDR playout, you would be done. However, they are not. You have a mix of SDR and HDR receivers. Moreover, some of the older SDR receivers are only capable of 8-bit decoding, not 10-bit. That is your next challenge.

The first step for any solution is to generate an SDR version of the HDR signal. Here, again, one could use a fixed 3D-LUT, and many do. Again, while it works at some basic level, the results are average. One good example is a sports match that starts during the day and concludes at night under illumination. A dynamic algorithm that changes the conversion based on content will certainly produce a better result than a fixed table.

The second step would be to generate a single signal that would be compatible with both SDR and HDR receivers, if possible. This needs to take into account that SDR devices are probably older and do not understand anything other than SDR.

Both these steps are achieved by SL-HDR1, which is part of Advanced HDR by Technicolor and implemented in real-time in the Cobalt 9904-UDX-4K card. SL-HDR1 has been adopted as part of ATSC 3.0 (A/341) and is an ETSI standard (TS 103 433-1). It works as follows:

  • The HDR signal is converted to an SDR representation, using a dynamic process that optimizes the quality.
  • As part of the conversion process, metadata is generated that allows a compatible receiver to recover the original HDR content from the SDR content.
  • An SDR device will not be aware of the metadata and will simply display the SDR content.
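The three steps above can be sketched with a deliberately toy model. This is not the actual ETSI TS 103 433-1 math; the tone curve, the metadata fields, and the pixel values are all invented for illustration:

```python
def downconvert_frame(hdr_nits, sdr_peak=100.0):
    """Toy illustration of the SL-HDR1 idea (NOT the TS 103 433-1 algorithm):
    tone-map one HDR frame (absolute nits) to SDR with an invertible,
    per-frame curve, and emit the curve parameters as metadata."""
    frame_peak = max(hdr_nits)              # per-frame analysis step
    gamma = 2.4                             # illustrative tone-curve exponent
    sdr = [sdr_peak * (y / frame_peak) ** (1 / gamma) for y in hdr_nits]
    metadata = {"frame_peak": frame_peak, "gamma": gamma, "sdr_peak": sdr_peak}
    return sdr, metadata                    # legacy sets just display `sdr`

def reconstruct_frame(sdr_nits, metadata):
    """A compatible receiver inverts the curve using the metadata."""
    m = metadata
    return [m["frame_peak"] * (y / m["sdr_peak"]) ** m["gamma"]
            for y in sdr_nits]

# Round trip on a toy three-pixel frame: 0, 50 and 800 nits.
sdr, meta = downconvert_frame([0.0, 50.0, 800.0])
restored = reconstruct_frame(sdr, meta)
```

The key property this toy preserves from the real system is that the SDR picture is directly displayable on its own, while the metadata is small (curve parameters, not pixels) and sufficient to recover the HDR frame.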

Inserting Metadata
In SDI signals, this metadata is inserted as ancillary data using SMPTE ST 2108. Legacy devices will ignore the metadata and simply display the SDR signal. HDR devices with SL-HDR1 support will reconstruct the HDR signal and display it. Conceptually, this is similar to what was done when color TV was introduced – the black-and-white signal was still there, and the color information was in a separate subcarrier, which was ignored by the older sets.

In compressed signals, the same metadata is inserted as registered user data in the video elementary stream. Legacy 10-bit devices will again simply ignore it and decode the SDR content, while compatible receivers will have the full HDR experience. For 8-bit HEVC devices, one option is to use Scalable HEVC (SHVC) with an 8-bit SDR base layer and an SL-HDR1 enhancement layer. Some broadcasters are even trying 8-bit AVC with SL-HDR1 metadata. The final workflow diagram is at the top of this story.

Dr. Noronha is Executive Vice President of Engineering at Cobalt Digital.
