AJA & Colorfront Answer Questions on HDR Production

Many broadcasters and sports production companies are migrating to HDR production. However, this move is not straightforward. Just as the move from 4:3 to 16:9 raised many issues, the move to high dynamic range (HDR) and a wider color gamut (WCG), as defined in the ITU standards Rec.2020 and Rec.2100, presents many issues, all the more so when simulcasting in the legacy Rec.709 HDTV standard. We all remember 3D, when it quickly became clear that running two mobile units, one 2D and one 3D, was not financially viable. Likewise with HDR, any production will have to output legacy HD for what will be the vast majority of viewers in the immediate future, plus a full UHD output with HDR and WCG for early adopters of the latest home displays. AJA, known for a wide range of conversion products that can solve many a system problem, has released a new product in its FS line of universal converter/frame synchronizers. The FS-HDR provides real-time HDR/WCG conversion and incorporates the Colorfront Engine™ for video processing. The AJA team has amassed a great deal of knowledge on the knotty problems of HDR/WCG production, and The Broadcast Bridge spoke recently with the development team to answer some key questions.

Q. Can HDR be used in a regular 10-bit quad 3G or 12G-SDI system?

A. Yes. In fact, even though HDR is defined for 10- and 12-bit systems, the consumer will generally get 10-bit, which can provide stunning images of real-world scenes. While you may get a better picture with 12-bit, final delivery will generally be coded 10-bit. The Dolby Vision system supports 12-bit over HDMI; using 12 bits ensures that the quantization steps stay below the visual threshold of the Barten ramp.
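The PQ curve referenced here (SMPTE ST 2084) can be sketched in a few lines of Python to show how close 10-bit quantization steps sit to the visual threshold. The constants come from the published standard; the function names and the 100 cd/m² demonstration point are illustrative choices, not part of the standard.

```python
# Sketch of the SMPTE ST 2084 (PQ) transfer function, using the
# constants published in the standard. Function names are illustrative.

m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Normalized PQ signal [0, 1] -> display luminance in cd/m^2."""
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

def pq_inverse_eotf(luminance: float) -> float:
    """Display luminance in cd/m^2 -> normalized PQ signal [0, 1]."""
    y = (luminance / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# Luminance step between two adjacent 10-bit code values near 100 cd/m^2:
code = round(pq_inverse_eotf(100) * 1023)
step = pq_eotf((code + 1) / 1023) - pq_eotf(code / 1023)
print(f"code {code}: step of about {step:.3f} cd/m^2")
```

The step stays under roughly 1% of the local luminance across the range, which is why 10-bit PQ tracks just above the Barten threshold while 12-bit (four times finer) sits below it.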

The Barten ramp indicates the contrast sensitivity of the eye at different luminance levels. Coding above the threshold may exhibit visible quantization steps.
The PQ standard follows the shape of the Barten threshold: just above it for 10-bit coding, just below it for 12-bit (the ideal). Note how legacy standards such as BT.1886 are unsuited to 10-bit coding.

Q. What are the issues in using a camera-log input for HDR-10 output?

A. Some cameras don’t make paint controls available on their camera-log outputs, so there may be an issue with certain models, but generally there are no major problems. Most cameras offer a log output that supports the full range of what the sensor can capture, and in mapping you can come up with the opto-optical transfer function (OOTF) to define how you want the image to look for translation to the display or distribution format. Whether the output is PQ (ST 2084) or HLG (BT.2100), both will look the same on an HDR display because, essentially, they are just different ways of bending and mapping pixel values to digital code levels and translating them back at the TV or display stage.
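That "bending" of light into code values can be made concrete with the HLG reference OETF from BT.2100, sketched below. The constants are from the standard; the function name is an illustrative choice.

```python
import math

# Sketch of the BT.2100 HLG reference OETF: scene-linear light E in
# [0, 1] -> non-linear signal E' in [0, 1]. Constants from the standard.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """BT.2100 HLG reference OETF (per channel, normalized input)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # square-root segment for shadows
    return A * math.log(12 * e - B) + C  # logarithmic segment for highlights
```

The two segments meet continuously at E = 1/12 (where E' = 0.5), and peak scene light maps to full signal, which is what makes HLG backward-compatible with conventional gamma displays in the shadows while compressing highlights logarithmically.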

Q. Some early adopters are using consumer HDR displays in primary positions in the production monitor stack. Is a mix of SDR and HDR displays going to cause problems in production?

A. No, this won’t cause problems with the right combination of hardware and software, and the workflow combining the Colorfront Engine™ with AJA’s FS-HDR was developed to address this. Most early adopters will use SDR monitors for confidence and HDR monitors for getting the picture right. This is not a problem as long as both monitors show the correct picture. With the single master workflow, both pictures and the creative look will match, and that has been proven in a number of existing workflows. Anywhere in the production chain that you have an HDR monitor, the same picture can be matched on SDR monitors from a common master. For example, if your mezzanine working format is HLG, then it’s a basic translation inside a hardware interface such as AJA’s FS-HDR to come up with matching Rec.709 SDR or ST 2084 HDR signals. The same can occur in the control room where your master program monitor is HDR: if that monitor is fed through FS-HDR, you can see what the HDR looks like and immediately click over to an SDR preview without changing monitors.
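One piece of such a translation is the change of color primaries. The linear-light matrix below (from ITU-R BT.2087) converts Rec.2020 RGB to Rec.709 RGB; this is only the colorimetric core as a rough sketch, and omits the tone mapping and out-of-gamut handling a device like the FS-HDR also performs.

```python
# Sketch of the linear-light Rec.2020 -> Rec.709 primary conversion
# (matrix values from ITU-R BT.2087). Colorimetric core only; real
# converters also apply tone mapping and out-of-gamut handling.

M_2020_TO_709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def rec2020_to_rec709(rgb):
    """Linear Rec.2020 RGB -> linear Rec.709 RGB (may leave [0, 1])."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in M_2020_TO_709]

# White is preserved; a fully saturated Rec.2020 green, however, falls
# outside the Rec.709 gamut and produces negative R and B components.
print(rec2020_to_rec709([1.0, 1.0, 1.0]))
print(rec2020_to_rec709([0.0, 1.0, 0.0]))
```

Those negative components are exactly why the gamut-mapping step discussed later in this interview is needed.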

Q. Shading positions are likely to have grade 1 HDR monitors. Will the vision engineers need an HDR and an SDR monitor for shading?

A. No, with the FS-HDR Colorfront engine you can derive beautiful HDR and SDR with a single master workflow from either an HDR or an SDR monitor, though an HDR monitor would be ideal. With a single master, you can easily toggle between the HDR and the SDR Rec.709 feed and optimize camera shading for one or the other. The FS-HDR Colorfront engine was designed to facilitate a single master workflow so that vision engineers can see both SDR and HDR from a single monitor and come up with a common look that matches and serves both optimally.

Q. In post, the SDR deliverable goes through a trim pass in grading. How is live production going to operate when creating HDR and SDR simultaneous outputs?

A. There is no trim pass in live production, so the single master workflow delivers a lot of efficiencies in these scenarios. It is always best to master in HDR to ensure the color looks optimized whether the final broadcast delivery is SDR or HDR.

Q. There are several ways to convert from a wide color gamut to the Rec.709 color gamut: perceptual, colorimetric, saturation-preserving, and so on. In an automated process, can this be algorithmically managed?

A. Yes, this can be algorithmically managed. The Colorfront engine uses gamut mapping and light level remapping to translate colors properly into a perceptual space that matches the original creative intent. The engine will remap to the closest match possible, and if substitute colors have to be generated based on the target color volume, they will retain the original creative intent or look.
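To illustrate the general idea of remapping out-of-gamut colors while keeping the picture's intent, the toy sketch below pulls an out-of-range linear Rec.709 color toward its own luminance just far enough to fit, preserving luminance and hue while reducing saturation. This is a minimal textbook-style approach for illustration only, not Colorfront's proprietary algorithm.

```python
# Toy gamut compression: desaturate an out-of-range linear Rec.709
# color toward its own luminance until every channel fits in [0, 1].
# Preserves luminance and hue angle; reduces saturation. This is an
# illustration of the general idea, not Colorfront's algorithm.
# Assumes the color's luminance itself lies within [0, 1].

def luminance(rgb):
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec.709 luma weights

def compress_to_gamut(rgb):
    y = luminance(rgb)
    t = 1.0  # fraction of the original saturation we can keep
    for c in rgb:
        if c > 1.0:
            t = min(t, (1.0 - y) / (c - y))
        elif c < 0.0:
            t = min(t, y / (y - c))
    return [y + t * (c - y) for c in rgb]

print(compress_to_gamut([0.2, 0.5, 0.3]))   # in gamut: unchanged
print(compress_to_gamut([1.4, 0.3, -0.1]))  # compressed into [0, 1]
```

In-gamut colors pass through untouched, so only the colors that genuinely cannot be represented are altered, which is the behavior you want from any automated mapping.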

Conversely, you can turn this around for the equally common scenario of starting with a Rec.709 feed and scaling up to HDR. You can end up with top-quality HDR from an original SDR feed, with perceptual expansion increasing the dynamics while preserving the original artistic intent of the picture.
