Momentum Builds Behind HDR+

Ultra HD (UHD) has gone through various iterations since it emerged from the shadow of 3D TV to become the broadcasting industry’s standard bearer for immersive or next generation TV. At first it was all about the higher resolution of 3840 x 2160, four times the pixel count of 1080p “full HD”, but high frame rate (HFR) and Wide Color Gamut (WCG) at 10 bit sampling, rather than 8 bit as before, were also in the picture.

Subsequently High Dynamic Range (HDR) was added to the mix and quickly became one of the most prominent ingredients, because in consumer tests it appeared to deliver a greater improvement in perceived picture quality than the higher resolution. The higher resolution on its own is now usually referred to as 4K, distinct from UHD, which embraces all the other technology improvements, including object based audio.

A point quickly picked up by a number of vendors, broadcasters and operators was that HDR coupled with WCG increases the bit rate only modestly, by 25% at most, whereas 4K normally quadruples it in the absence of new compression methods such as HEVC. Even with advanced coding, 4K generates about 2.5 times as many bits as full HD.
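The relative overheads quoted above can be made explicit with a rough back-of-envelope sketch. The multipliers below are the article’s own figures; the 8 Mbps full HD baseline is purely an illustrative assumption.

```python
# Back-of-envelope bit-rate comparison, using the multipliers quoted above.
# The full HD baseline of 8 Mbps is an assumed, illustrative figure.

FULL_HD_MBPS = 8.0

# HDR coupled with WCG adds at most ~25% to the bit rate.
hdr_plus = FULL_HD_MBPS * 1.25

# 4K quadruples the pixel count; with legacy codecs the bit rate
# roughly quadruples too.
uhd_4k_legacy = FULL_HD_MBPS * 4.0

# Even with advanced coding (e.g. HEVC), 4K still needs ~2.5x the bits.
uhd_4k_hevc = FULL_HD_MBPS * 2.5

print(f"Full HD baseline:        {FULL_HD_MBPS:.1f} Mbps")
print(f"HDR+ (HDR/WCG only):     {hdr_plus:.1f} Mbps")
print(f"4K, legacy codec:        {uhd_4k_legacy:.1f} Mbps")
print(f"4K with HEVC:            {uhd_4k_hevc:.1f} Mbps")
```

The gap between 10 Mbps and 20–32 Mbps in this sketch is the bandwidth argument for HDR+ in a nutshell.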

Another key component is the signaling used to carry HDR from the camera through the delivery infrastructure to the TV or client device, the emphasis being on conveying as much contrast information as possible. The aim of HDR is to simulate the human eye’s ability to distinguish between huge contrast ranges by making blacks darker and bright colors brighter on a screen in a way that looks natural. But this requires an agreed way of converting light to electrical signals and transmitting them faithfully through the ecosystem. This conversion is referred to by various names, such as the Electro-Optical Transfer Function (EOTF) or the perceptual HDR transfer function. HDR, WCG and the agreed transfer function have been bundled together as HDR+, which is being positioned as “UHD Lite” without the 4K or HFR, since both of those impose a heavy bandwidth penalty.

The most recent development is agreement within the ITU, the main body for standards related to HDR, over two recommendations for the EOTF. This will enable distribution of HDR+ programming over existing or emerging workflows, without substantial change, although HDR compliant TVs will be required to enjoy the full benefits. These standards in turn are being incorporated within the first Phase A of the Ultra HD Forum’s guidelines for end-to-end workflows involved in creating and delivering live and pre-recorded UHD content, published in April 2016.

This more or less coincided with the parallel launch of the Ultra HD Premium logo from the UHD Alliance, the other main standards development body for UHD. While the Alliance focuses on the two ends of the delivery chain, that is the cameras and CPE such as TVs, the Forum deals with all the infrastructure in between, including encoding, video transport and aspects of security. The Ultra HD Premium logo is supposed to guarantee that a compliant TV can display UHD pictures including HDR. For example Panasonic’s DX902 4K TV carries the badge, while its UB900 Blu-ray player is one of the first devices other than a television to be approved.

Then the Ultra HD Forum’s guidelines ensure that those pictures are delivered at the agreed quality to the TVs. The incorporation of the two EOTF standards is a major milestone because it opens the door to commercial UHD services that can harness the infrastructure and do justice to the growing number of TVs carrying the Ultra HD Premium logo.

HDR+ offers the greatest bang for the bit, according to Matthew Goldman, Senior Vice President Technology, TV Compression at Ericsson.

Of the two standards for encoding and transmitting HDR pictures, the one most broadcasters would prefer to use is Hybrid Log-Gamma (HLG10), developed jointly by the BBC and NHK, because it is backwards compatible with existing standard dynamic range (SDR) displays. This means that current TVs can benefit to some extent from the greater contrast, but the problem is that it will not work with many existing workflows, as Matthew Goldman, Senior Vice President Technology, TV Compression at Ericsson, explained.

HLG10 includes metadata that enables the backwards compatibility and maximizes quality, but this metadata is not accommodated by all existing workflow processes such as transcoding, file conversion and content management. For this reason a stripped-down version called PQ10 (Perceptual Quantization) was developed.

“PQ10 is the core subset of HLG10 without the metadata,” said Goldman. “Then it can survive through existing live workflows. Both these are in the process of being standardized as a new ITU-R recommendation and are now in draft form without a number.” They are currently referred to as Draft Rec ITU-R BT.{HDR-TV}, with final ratification expected in July 2016. “The significance is that now we will have a standard way of distributing HDR content,” said Goldman. “It is unfortunate we have two ways of doing it, but there were originally six.”

There are still some issues to be resolved for HDR, such as how to deal with variations in ambient lighting. When viewing in a dark room the human eye can pick up subtler distinctions between shades of black than can currently be displayed even by Ultra HD Premium compliant TVs, especially LED displays. The point here is that there are in fact two variants of HDR, one for LED displays and the other for OLED models which have different picture characteristics. While LED TVs can display HDR images with better peak brightness, OLED TVs can display deeper blacks. To cater for these differences, LED TVs have to be capable of reaching 1,000 nits or more peak brightness and less than 0.05 nits black levels to be deemed HDR capable, while OLED TVs have the less stringent target of 540 nits brightness but more demanding 0.0005 nits black level.
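The two sets of thresholds described above can be summarized as a simple check. The sketch below uses the figures quoted in this article; the Ultra HD Alliance’s own certification program defines the authoritative criteria, and the function name is purely illustrative.

```python
# Sketch of the two Ultra HD Premium display paths described above.
# Thresholds are those quoted in the article, not an official definition.

def meets_premium_hdr(peak_nits: float, black_nits: float, panel: str) -> bool:
    """Check a display against the two luminance paths for the logo."""
    if panel == "LED":
        # Higher peak brightness, less stringent black level.
        return peak_nits >= 1000 and black_nits <= 0.05
    if panel == "OLED":
        # Lower peak brightness, much deeper black level.
        return peak_nits >= 540 and black_nits <= 0.0005
    raise ValueError(f"unknown panel type: {panel}")

print(meets_premium_hdr(1100, 0.04, "LED"))    # True
print(meets_premium_hdr(600, 0.0005, "OLED"))  # True
print(meets_premium_hdr(800, 0.05, "LED"))     # False: peak too low
```

The two paths reflect the trade-off in the text: LED panels qualify on peak brightness, OLED panels on black depth.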

This suggests that over time the standard for TV displays in general may become more rigorous but there are constraints, such as the fact that higher brightness levels require more energy. This may make it harder to meet environmental requirements, as well as draining batteries too quickly in portable devices like tablets.

Coming back to 4K resolution, there are issues around consumer education as well as bandwidth. While Ericsson in particular is pushing HDR+ as a practical subset of full UHD and an interim solution, there is still the expectation that 4K will follow as TV ecosystems evolve the required capacity and capability. But as Goldman noted, 4K only improves the viewing experience at wider viewing angles, since otherwise the extra pixels are not appreciated.

“We have to educate the viewer to sit at the right viewing distance, otherwise the eye can’t see it,” said Goldman. “The human visual system can resolve roughly one arc minute (one 60th of a degree). That is roughly three picture heights back for HD resolution, but we have to sit twice as close to see 4K in its glory. To appreciate HDR+ you can be at the far end of the room and still appreciate the extra contrast between bright and dark. Because the eye is so sensitive to contrast we think that is the real wow factor for TV, more than higher resolution.”
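Goldman’s figures follow directly from the one-arc-minute acuity limit: the optimal distance is where a single pixel subtends one arc minute at the eye. A minimal sketch of that calculation, assuming only the acuity figure quoted above:

```python
import math

# The human visual system resolves roughly one arc minute (1/60 degree).
ARC_MINUTE = math.radians(1 / 60)

def optimal_distance_in_picture_heights(vertical_pixels: int) -> float:
    """Distance at which one pixel subtends one arc minute,
    expressed in multiples of the picture height."""
    # One pixel is (1 / vertical_pixels) of the picture height;
    # it subtends ARC_MINUTE when viewed from this distance.
    return 1.0 / (vertical_pixels * math.tan(ARC_MINUTE))

hd = optimal_distance_in_picture_heights(1080)   # ~3.2 picture heights
uhd = optimal_distance_in_picture_heights(2160)  # ~1.6: twice as close
print(f"1080p: {hd:.1f}H  2160p: {uhd:.1f}H")
```

This reproduces the quote: about three picture heights back for HD, and half that distance before the extra 4K pixels become visible.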

If that is the case many operators and broadcasters may be tempted to shelve plans to include 4K in their plans for immersive TV and concentrate on HDR+. There may then be a case for deploying higher frame rates before 4K as capacity allows, since that does improve the viewing experience for fast moving action in live sports in particular.

Comments:

It is unclear to me which three (3) standards, together, comprise HDR+ (UHD Lite).  Is it WCG, HDR, and EOTF that comprise HDR+ (UHD Lite)?

July 5th 2016 @ 20:54 by Jonathan Abrams

Since the above story was written the ITU formally released its standard for HDR, which has been referred to as HDR+ because it also includes WCG. So now HDR and WCG are both part of a common standard defining color range, depth and contrast. The two EOTF options are part of this new ITU HDR standard, which is called BT.2100. In 2015 the ITU ratified BT.2020 as the standard for WCG, the expanded color space for Ultra HD. BT.2100 is essentially BT.2020 enlarged by adding the specifications for HDR in addition to WCG.

July 7th 2016 @ 10:06 by Philip Hunter Editor
