AJA & Colorfront Answer Questions on HDR Production

Many broadcasters and sports production companies are migrating to HDR production. However, this move is not straightforward. Just as the move from 4:3 to 16:9 raised many issues, the move to high dynamic range (HDR) and a wide color gamut (WCG), as defined in ITU standards Rec.2020 and Rec.2100, presents many issues, all the more so when simulcasting in the legacy Rec.709 HDTV standard. We all remember 3D, when it was immediately clear that two mobile units, one 2D and one 3D, were not financially viable. Likewise, with HDR, any production will have to output legacy HD for what will be the vast majority of viewers in the immediate future, plus a full UHD output with HDR and WCG for early adopters of the latest home displays.

AJA, known for a wide range of conversion products that can solve many a system problem, has released a new product in its FS line of universal frame synchronizer/converters. The FS-HDR provides real-time HDR/WCG conversion and incorporates the Colorfront Engine™ for video processing. The AJA team has amassed a great deal of knowledge on the knotty problems of HDR/WCG production, and The Broadcast Bridge recently spoke with the development team to answer some key questions.

Q. Can HDR be used in a regular 10-bit quad 3G or 12G-SDI system?

A. Yes. Even though HDR is defined for both 10- and 12-bit systems, the consumer will generally get 10-bit, which can provide stunning images of real-world scenes. While you may get a better picture with 12-bit, final delivery will generally be coded 10-bit. The Dolby Vision system supports 12-bit over HDMI; using 12 bits ensures that the quantization steps stay below the visual threshold of the Barten ramp.

[Figure: The Barten ramp indicates the contrast sensitivity of the eye at different luminance levels; coding above the threshold may exhibit visible quantization steps. The PQ curve follows the shape of the Barten threshold, sitting just above it for 10-bit coding and just below it for 12-bit (the ideal). Note how legacy standards are unsuited to 10-bit HDR coding (the BT.1886 line).]
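To make the bit-depth trade-off concrete, here is a minimal Python sketch (ours, not AJA’s or Colorfront’s) of the ST 2084 PQ curve using the constants published in the standard; it estimates the luminance step between adjacent code values around 100 cd/m² at 10-bit versus 12-bit quantization:

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) curve, using the published constants.

m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Absolute luminance (0..10000 cd/m^2) -> PQ signal (0..1)."""
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1.0 + c3 * y ** m1)) ** m2

def pq_decode(signal: float) -> float:
    """PQ signal (0..1) -> absolute luminance in cd/m^2."""
    e = signal ** (1.0 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1.0 / m1)

# Luminance step between adjacent code values around 100 cd/m^2:
for bits in (10, 12):
    steps = 2 ** bits - 1          # full-range code values, simplified
    s = pq_encode(100.0)
    delta = pq_decode(s + 1.0 / steps) - pq_decode(s)
    print(f"{bits}-bit: ~{delta:.3f} cd/m^2 per code step at 100 cd/m^2")
```

The 12-bit steps come out roughly four times finer, which is what pushes them below the Barten threshold.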

Q. What are the issues in using a camera-log input for HDR-10 output?

A. Some cameras don’t make paint controls available on their camera-log outputs, so there may be an issue with certain models, but generally there aren’t any major problems. Most cameras offer a log output that supports the full range of what the sensor can capture, and in mapping you can define the Opto-Optical Transfer Function (OOTF), which determines how you want the image to look when translated to the display or distribution format. Whether the output is PQ (ST 2084) or HLG (BT.2100), both will look the same on an HDR display because, essentially, they are just different ways of bending and mapping light levels to digital values and translating them for recovery at the TV or display.
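As an illustration of that "bending", here is a minimal sketch (ours) of the HLG OETF from Rec. ITU-R BT.2100, which maps normalized scene-linear light to an HLG signal; the constants are the published BT.2100 values, and a PQ encode is sketched above:

```python
import math

# Minimal sketch of the Rec. ITU-R BT.2100 HLG OETF.
A = 0.17883277
B = 1.0 - 4.0 * A                  # 0.28466892
C = 0.5 - A * math.log(4.0 * A)    # ~0.55991073

def hlg_oetf(e: float) -> float:
    """Normalized scene-linear light (0..1) -> HLG signal (0..1)."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)          # square-root segment for shadows
    return A * math.log(12.0 * e - B) + C  # log segment for highlights

print(hlg_oetf(1.0 / 12.0))  # -> 0.5 (crossover between the two segments)
print(hlg_oetf(1.0))         # -> ~1.0 (peak scene light)
```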

Q. Some early adopters are using consumer HDR displays in primary positions in the production monitor stack. Is a mix of SDR and HDR displays going to cause problems in production?

A. No, this won’t cause problems with the right combination of hardware and software; the workflow combining the Colorfront Engine™ with AJA’s FS-HDR was developed to address exactly this. Most early adopters will use SDR monitors for confidence and HDR monitors for getting the picture right. That is not a problem as long as both monitors show the correct picture. With the single master workflow, both pictures and the creative look will match, and that has been proven in a number of existing workflows. Anywhere in the production chain that you have an HDR monitor, the same picture can be matched on SDR monitors from a common master. For example, if your mezzanine working format is HLG, it is a basic translation inside a hardware interface such as AJA’s FS-HDR to produce matching Rec 709 SDR or ST 2084 HDR signals. The same applies in the control room where your master program monitor is HDR: if that monitor is fed through the FS-HDR, you can see what the HDR looks like and immediately click over to an SDR preview without changing monitors.
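As a rough sketch of that kind of translation (an illustration only, not the FS-HDR’s implementation), the following converts one HLG luminance sample to its PQ equivalent for a nominal 1000 cd/m² display, using the BT.2100 inverse OETF and OOTF; gamut conversion and the matching SDR down-map are deliberately left out:

```python
import math

# HLG master -> PQ output, luminance only, for a nominal 1000 cd/m^2 display.
A = 0.17883277
B = 1.0 - 4.0 * A
C = 0.5 - A * math.log(4.0 * A)

def hlg_inverse_oetf(signal: float) -> float:
    """HLG signal (0..1) -> normalized scene-linear light (0..1)."""
    if signal <= 0.5:
        return signal ** 2 / 3.0
    return (math.exp((signal - C) / A) + B) / 12.0

def pq_encode(nits: float) -> float:
    """Absolute luminance (cd/m^2) -> PQ signal (0..1), per ST 2084."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1.0 + c3 * y ** m1)) ** m2

def hlg_to_pq(signal: float, peak_nits: float = 1000.0) -> float:
    """One HLG luminance sample -> the PQ signal showing the same light."""
    scene = hlg_inverse_oetf(signal)
    display_nits = peak_nits * scene ** 1.2  # HLG OOTF, gamma 1.2 at 1000 nits
    return pq_encode(display_nits)

print(hlg_to_pq(0.75))  # HLG reference white (~75%) -> ~0.58 PQ (~203 cd/m^2)
```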

Q. Shading positions are likely to have grade 1 HDR monitors. Will the vision engineers need an HDR and an SDR monitor for shading?

A. No. With the FS-HDR’s Colorfront Engine you can derive beautiful HDR and SDR through a single master workflow while shading on either an HDR or an SDR monitor, though an HDR monitor would be ideal. With a single master you can easily toggle between the HDR and the SDR 709 feed and optimize camera shading for one or the other. The FS-HDR’s Colorfront Engine was designed to facilitate a single master workflow so that vision engineers can see both SDR and HDR from a single monitor and arrive at a common look that will match and serve both optimally.

Q. In post, the SDR deliverable goes through a trim pass in grading. How will live production operate when creating simultaneous HDR and SDR outputs?

A. There is no trim pass in live production, so the single master workflow delivers significant efficiencies in these scenarios. It is always best to master in HDR to ensure that color looks optimal whether final broadcast delivery is SDR or HDR.

Q. There are several ways to convert from a wide color gamut to the Rec 709 color gamut: perceptual, colorimetric, saturation-preserving, and so on. In an automated process, can this be algorithmically managed?

A. Yes, this can be algorithmically managed. The Colorfront Engine uses gamut mapping and light-level remapping to translate colors properly into a perceptual space that matches the original creative intent. The engine remaps to the closest match possible, and if substitute colors have to be generated to fit the target color volume, they will retain the original creative intent or look.
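For contrast with such perceptual remapping, here is a minimal sketch (ours) of the purely colorimetric approach: a standard 3×3 matrix from linear BT.2020 RGB to linear BT.709 RGB followed by hard clipping. The clip is exactly where a naive conversion discards the distinctions a perceptual engine preserves:

```python
# Colorimetric BT.2020 -> BT.709 conversion with hard clipping to [0, 1].
# A perceptual engine replaces the clip with gamut and light-level
# remapping that keeps out-of-gamut colors closer to the original intent.

# Linear BT.2020 RGB -> linear BT.709 RGB (derived from the primaries).
M_2020_TO_709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def bt2020_to_bt709(rgb):
    """Colorimetric conversion with hard clipping of out-of-gamut values."""
    return tuple(
        min(max(sum(m * c for m, c in zip(row, rgb)), 0.0), 1.0)
        for row in M_2020_TO_709
    )

# A fully saturated BT.2020 green falls outside BT.709 and simply clips,
# losing the distinction between it and a true BT.709 green:
print(bt2020_to_bt709((0.0, 1.0, 0.0)))  # -> (0.0, 1.0, 0.0)
```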

You can also turn this around for the equally common scenario of starting with a 709 feed and up-converting to HDR. You can achieve top-quality HDR from an original SDR feed, using perceptual expansion to increase dynamic range while preserving the original artistic intent of the picture.
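As a simple illustration (not the Colorfront algorithm), the most basic SDR-to-PQ up-map decodes a gamma-2.4 SDR signal to display light with reference white placed at 203 cd/m², the level suggested in ITU-R BT.2408, and then PQ-encodes it; a genuine perceptual expansion would additionally stretch highlights and manage saturation:

```python
# Minimal SDR -> PQ up-map sketch (ours): a simplified BT.1886-style
# gamma-2.4 decode with SDR reference white at 203 cd/m^2 (ITU-R BT.2408),
# followed by a PQ (ST 2084) encode. No highlight expansion is applied.

def sdr_to_pq(sdr_signal: float, ref_white_nits: float = 203.0) -> float:
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    nits = ref_white_nits * max(sdr_signal, 0.0) ** 2.4  # simplified BT.1886
    y = nits / 10000.0
    return ((c1 + c2 * y ** m1) / (1.0 + c3 * y ** m1)) ** m2

print(sdr_to_pq(1.0))  # SDR peak white -> ~0.58 PQ signal (203 cd/m^2)
```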
