How to Quality Control for HDR

What are the impacts of High Dynamic Range on the Quality Control process? Given the confusion in the industry surrounding the different formats, the impact is greater than you’d think and still being worked through. The Broadcast Bridge sought the advice of leading QC tool developers.

HDR, of course, all starts with filming in the correct format and flows from there: material needs to be high dynamic range (10-bit is the minimum but realistically 12-bit) with a wide colour gamut and the right colour transform, and in some cases requires provision of data about the display device used for mastering, the maximum content light level (MaxCLL) and the maximum frame-average light level (MaxFALL).
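As a rough illustration of what MaxCLL and MaxFALL measure, the sketch below computes both over a sequence of frames. This is a simplified assumption-laden example: real QC tools derive per-pixel luminance from the decoded signal via the transfer function, whereas here the nit values are assumed to be given directly.

```python
def max_cll_fall(frames):
    """Return (MaxCLL, MaxFALL) over a sequence of frames.

    MaxCLL  = brightest single pixel anywhere in the content (nits).
    MaxFALL = highest frame-average light level of any frame (nits).
    frames  = list of 2D lists of per-pixel luminance in cd/m^2.
    """
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        pixels = [nit for row in frame for nit in row]
        max_cll = max(max_cll, max(pixels))                  # peak pixel
        max_fall = max(max_fall, sum(pixels) / len(pixels))  # frame average
    return max_cll, max_fall

# Two tiny 2x2 "frames" with made-up luminance values:
frames = [
    [[100.0, 400.0], [50.0, 50.0]],   # peak 400 nits, average 150 nits
    [[1000.0, 10.0], [10.0, 10.0]],   # peak 1000 nits, average 257.5 nits
]
print(max_cll_fall(frames))  # (1000.0, 257.5)
```

Note that MaxCLL tracks the single brightest pixel across the whole programme, while MaxFALL is constrained by the average of an entire frame, which is why a brief specular highlight raises one but not necessarily the other.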

In general, the earlier in a workflow you can set up and check HDR, the better. In particular, HDR set-up at acquisition is important and can be complex in situations where a mix of cameras with different specifications and characteristics is being used.

Following acquisition, post production offers the next best opportunities to adjust for HDR and color balance.

Some of these parameters can only be calculated once the content is finished (after editing and grading), so the final check for HDR compliance can only happen at the end, when the file is ready for delivery; depending on the format, HDR can require that this metadata is inserted into the media file.
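That final delivery check amounts to comparing the light-level metadata declared in the file against values recomputed from the decoded frames. The sketch below assumes both sets of values have already been extracted; the dictionary keys and the tolerance are illustrative assumptions, not any particular tool's API.

```python
def metadata_matches(declared, measured, tolerance_nits=1.0):
    """True if declared and measured light levels agree within tolerance.

    declared: light levels read from the container/stream metadata.
    measured: light levels recomputed from the decoded frames during QC.
    """
    return all(
        abs(declared[key] - measured[key]) <= tolerance_nits
        for key in ("MaxCLL", "MaxFALL")
    )

declared = {"MaxCLL": 1000, "MaxFALL": 400}      # as written into the file
measured = {"MaxCLL": 999.5, "MaxFALL": 400.2}   # as recomputed at QC
print(metadata_matches(declared, measured))  # True
```

A mismatch here would indicate either that the content was altered after the metadata was written, or that the metadata was never updated for the final grade.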

According to Thomas Dove, Director of Technology, QC Products, Telestream, most people in the broadcast industry have an overall understanding of what HDR means and can recite the buzzwords, “however actual implementation and generation of HDR content is truly understood by very few and there is hardly any HDR content available right now.”

This is for four reasons, believes Dove. Firstly, HDR requires new concepts to be understood, and some aspects of these concepts are perhaps counterintuitive, particularly during post production (as just one example: increasing the overall brightness of a scene may have no visible effect, because the display device will limit it).

Secondly, HDR production is not well understood, he says. “The theory is fine but what are the actual practices that must be adopted on-set for filming and viewing rushes?

Thirdly, post production is actually a lot more complex - it really requires a thorough understanding of colour spaces and grading, and needs different grades for HDR and SDR.”

Lastly, the tools are just not there yet. Currently there is no single post production software package that can take camera footage and produce an HDR-compliant output.

“This now requires the use of at least three different software packages just to produce a graded HDR output with the correct metadata, and the most commonly-used NLE software does not provide all the tools required for HDR media file production. There is only one professional monitor feasible to use for HDR grading and this costs well over $30,000; there are only a few software packages that will calculate MaxFALL and MaxCLL (and not the ones that might be expected). This situation will doubtless improve in six months’ time, but this is where we are now.”

Tektronix agrees that there’s a degree of confusion about the various HDR standards and how these interact with the various characteristics of available cameras.

For example, says Ian Valentine, Director Business Development, Video Test at Tektronix, the PQ (Perceptual Quantizer) transfer function has been used extensively in packaged consumer products (e.g. Blu-ray) and only works with PQ-compatible displays. HLG (Hybrid Log-Gamma) is used for live production and is backwards compatible with Rec. 709. HLG can be shown on a standard display, but for best results you’ll need a display that supports HLG.
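The PQ curve mentioned above is defined by SMPTE ST 2084 as an absolute mapping between luminance and signal level. The sketch below implements the inverse EOTF (nits to normalised signal) using the constants from that specification; it is a minimal illustration, not production colour-science code.

```python
# SMPTE ST 2084 (PQ) constants, as defined in the specification.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Normalised PQ signal value for an absolute luminance in cd/m^2.

    Valid over 0..10000 nits; 10000 nits maps to exactly 1.0.
    """
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# SDR reference white (100 nits) lands just above mid-signal, leaving
# roughly the top half of the PQ code range for specular highlights.
print(round(pq_encode(100), 3))    # ~0.508
print(round(pq_encode(10000), 3))  # 1.0
```

This absolute mapping is why PQ content is graded for a specific mastering display, whereas HLG's relative, scene-referred curve adapts to the display's capabilities.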

“Since there are so many HDR standards (HDR10/HLG/PQ/Dolby Vision, etc.) what often becomes critical is the OETF (optical electrical transfer function) which will ultimately govern the appearance of the image on the screen. Multiple OETFs have to be managed in mastering and post production for consistency in color and scene matching where multiple cameras have been used in acquisition. This is an important area of focus for Tektronix.”

Indeed, Tektronix WFM8300 waveform monitors and rasterizers include a series of special HDR graticules that allow users to correctly set camera white points and adjust specular highlights to create stunning-looking content.

One of the key graticules is specifically set up for 18% grey (20 nits) and 90% reflective white (100 nits), allowing easy adjustment of the grey and white points of content, which is essential in HDR applications, explains Valentine. Other HDR capabilities include support for both the SMPTE ST 2084 standard and HLG as per the ARIB STD-B67 standard. HDR capabilities can easily be added to existing WFM / WVR8000 series products with the purchase of Option PROD (a field-upgradeable, software-only option).
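The check those graticules make visual can be sketched as a simple tolerance comparison against the 18% grey and 90% white targets. The patch names, measured values and 5% tolerance below are illustrative assumptions, not figures from any Tektronix product.

```python
# Reference targets from the article: 18% grey card at 20 nits,
# 90% reflective white at 100 nits.
REFERENCES = {"18% grey": 20.0, "90% white": 100.0}

def check_reference_levels(measured, tolerance=0.05):
    """Return per-patch pass/fail against the HDR reference targets.

    measured:  dict of patch name -> measured luminance in nits.
    tolerance: allowed relative deviation (5% by default, an assumption).
    """
    results = {}
    for name, target in REFERENCES.items():
        nits = measured[name]
        results[name] = abs(nits - target) / target <= tolerance
    return results

# Hypothetical chart measurements: grey is within 3%, white is 12% hot.
print(check_reference_levels({"18% grey": 20.6, "90% white": 112.0}))
# {'18% grey': True, '90% white': False}
```

Pinning grey and white to fixed nit levels like this is what keeps diffuse scene content consistent across cameras while the extended HDR range above 100 nits is reserved for highlights.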
