LG debuted its Signature range of UHD LED receivers, supporting HDR10 and Dolby Vision for HDR.
Thinking back to the recent CES show, the event is where you see the future of consumer technology. OK, 3D didn't work out, but it's not the first time 3D has come and gone. CES 2016 may have headlined wearables and virtual reality, but the humble television was not forgotten. Last year it was 4K; this year it was better pixels: high dynamic range (HDR), wide colour gamut (WCG) and high frame rate (HFR).
When the silver light of the CRT first flickered into living rooms, the dynamic range was poor, just about acceptable. Tube cameras also had limited dynamic range, and the early solid-state cameras and flat screen displays were no better.
However, sensor technology has advanced in leaps and bounds, so that modern cameras can equal the performance of the former gold standard for dynamic range—film negative. Displays too have improved. The latest displays have higher peak brightness and deeper blacks. For the CE vendor, a bright and saturated display is essential. It stands out in the showroom against competitors, and gives better daylight viewing.
OLED and LCD
Display technology in the sets on sale falls into two camps. The more conventional designs build on existing LED-backlit LCDs, where backlight modulation increases the dynamic range beyond the capabilities of the LCD light valve alone. Some vendors are using quantum dots to enlarge the colour space, with many displays claiming the P3 gamut, although the full Rec. 2020 colour space remains a way off. The current round of LCDs can produce impressive peak brightness (around 1 000 nits), but the black-level and viewing-angle limits that come with the LCD valve remain.
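The gap between these gamuts can be made concrete from the CIE xy chromaticity coordinates of each standard's primaries. A rough sketch, using triangle area in xy space as a crude proxy for gamut size (it ignores luminance; the primary coordinates are the published Rec. 709, DCI-P3 and Rec. 2020 values):

```python
# Crude comparison of colour gamut sizes via triangle area in CIE xy space.
# Primary coordinates are the published values for each standard; treating
# xy-triangle area as "gamut size" is a simplification.

def triangle_area(points):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

gamuts = {
    "Rec. 709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

ref = triangle_area(gamuts["Rec. 2020"])
for name, primaries in gamuts.items():
    area = triangle_area(primaries)
    print(f"{name}: {100 * area / ref:.0f}% of Rec. 2020 (xy area)")
```

By this rough measure the P3 triangle covers noticeably more of Rec. 2020 than Rec. 709 does, which is why "P3 inside a 2020 container" is the common stepping stone.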
The other camp is the OLED, an emissive display with naturally black blacks, not unlike the old plasma panels. OLED vendors claim better performance than LCD, but the technology is still expensive. Current OLEDs are not as bright as LCDs, but with a lower black level can achieve a similar dynamic range.
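The trade-off is easy to sketch in numbers: dynamic range on screen is the ratio of peak white to black level, so a much lower black level can offset a lower peak. The figures below are assumed for illustration, not measurements of any particular panel:

```python
# Illustrative (assumed) figures: a bright LED-backlit LCD versus a dimmer
# OLED. On-screen dynamic range here is simply peak white / black level,
# expressed in stops (doublings of light).
import math

def contrast_stops(peak_nits, black_nits):
    """Contrast ratio expressed in stops."""
    return math.log2(peak_nits / black_nits)

lcd_stops = contrast_stops(1000, 0.05)    # assumed LCD peak and black level
oled_stops = contrast_stops(600, 0.0005)  # assumed OLED: dimmer, near-black

print(f"LCD:  {lcd_stops:.1f} stops")
print(f"OLED: {oled_stops:.1f} stops")
```

With these assumed numbers the emissive panel's near-black floor more than compensates for its lower peak brightness.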
It seems consumer displays can now reproduce better pixels, but how do HDR, HFR and WCG images get from the camera and edit bay to the consumer? 24 fps 4K content is starting to become freely available, but the data rates of 50/60 fps content still pose a challenge. Add to that, HDMI formats are barely keeping pace with the increases in data rates.
However, televisions are nothing without content. Broadcasters largely saw 3D as a cost with very little reward. Viewers were turned off by having to wear ‘sunglasses’ to watch television (mind you, they are nothing compared to donning a VR headset). So what about UHD?
What Do Viewers Want?
The big question is “what do viewers want?” There is no doubt that displays are getting bigger. 60-inch diagonals are commonplace. However, displays over 60 inches really need UHD resolution to avoid the pixels being visible at normal viewing distances. So there is a clear case for increased resolution. But what about better pixels? If there is a demand for better pixels, then broadcasters can see a business case.
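The resolution argument can be checked with simple geometry, using the common rule of thumb that normal acuity resolves detail down to about one arcminute (the function and figures below are illustrative, not from any standard viewing-distance recommendation):

```python
# At what distance do individual pixels become resolvable on a 16:9 display?
# Assumes the 1-arcminute visual-acuity rule of thumb, a simplification.
import math

ARCMIN = math.radians(1 / 60)  # one arcminute in radians

def pixel_visible_within(diagonal_in, h_pixels, aspect=16 / 9):
    """Distance (metres) inside which a single pixel subtends > 1 arcminute."""
    width_m = diagonal_in * 0.0254 * aspect / math.hypot(aspect, 1)
    pitch = width_m / h_pixels  # horizontal pixel pitch in metres
    return pitch / ARCMIN       # small-angle approximation

print(f"60-inch HD : pixels visible inside {pixel_visible_within(60, 1920):.1f} m")
print(f"60-inch UHD: pixels visible inside {pixel_visible_within(60, 3840):.1f} m")
```

On these assumptions a 60-inch HD panel shows its pixel structure at typical sofa distances, while the UHD version halves that threshold, which is the case for more resolution at large sizes.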
Since the inception of motion pictures, we have put up with motion aliasing from too low a sampling rate: 24 fps. In television, 50- and 60-field interlace has its own set of artefacts: interline twitter, jaggies and reduced vertical resolution. HFR solves both issues.
For sports, HFR is going to be a boon. Viewers can see an immediate benefit. Whether the film guys ever move from 24 fps is another matter. Many, especially in the Studio community, consider the artefacts of 24 fps an essential part of the film look.
Where it gets complicated is HDR and WCG. How will receivers render the video? What will a legacy set do with a WCG, HDR picture, and what will a future television do with a standard-gamma, Rec. 709 picture? The analogous case of 4:3 versus 16:9 was handled by widescreen signalling embedded in the transport stream.
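One plausible shape for that signalling already exists: HEVC's VUI carries a transfer_characteristics code (1 for BT.709 gamma, 16 for PQ per SMPTE ST 2084, 18 for hybrid log-gamma). A hypothetical sketch of receiver fallback logic keyed on that code; the capability flag and the rendering choices are invented for illustration, not from any real product:

```python
# Hypothetical receiver-side fallback logic keyed on the HEVC VUI
# transfer_characteristics code (1 = BT.709 gamma, 16 = PQ / SMPTE ST 2084,
# 18 = hybrid log-gamma). The capability flag and rendering strings are
# illustrative only.

TRANSFER_BT709, TRANSFER_PQ, TRANSFER_HLG = 1, 16, 18

def choose_rendering(transfer_code, display_supports_hdr):
    if transfer_code == TRANSFER_PQ:
        # PQ encodes absolute luminance; a legacy set must tone-map it down.
        return "display PQ natively" if display_supports_hdr else "tone-map PQ to SDR"
    if transfer_code == TRANSFER_HLG:
        # HLG was designed so an SDR set can show it with acceptable results.
        return "display HLG natively" if display_supports_hdr else "treat HLG as SDR gamma"
    return "display as SDR"  # BT.709 and anything unrecognised

print(choose_rendering(TRANSFER_PQ, display_supports_hdr=False))
```

The open question the article raises is precisely which of these paths legacy and future sets will actually take, and whether the metadata will be carried reliably end to end.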
Many of these questions will be answered over the next year or so, but right now there are gaps in the standards.
CE vendors tend to lead from the front. You can go out and purchase a receiver that claims to be UHD with HDR, but the standards are still being resolved. How will those receivers work with real HDR broadcasts?
The first receivers dubbed ‘UHD’ offer just increased resolution. Rec. 2020 specifies the options of HFR, WCG and increased bit depth; HDR remains to be added. What will happen to early adopters who find their TVs cannot handle the final HDR standards? Buy another one, I guess.
Recently I had the opportunity to view HDR and WCG images. I find WCG somewhat subtle, seen best in side-by-side comparisons, but HDR is obviously better than what we have now. However, there are competing standards for the electro-optical transfer function (EOTF): perceptual quantisation (PQ) and hybrid log-gamma (HLG). There are also arguments about peak white levels. Is 1 000 nits sufficient, or 2 000, or should you aim for 10 000 nits? Right now consumer-priced displays can only achieve around 1 000 nits, so the arguments are more about future-proofing the standards.
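The two curves can be compared directly. The PQ EOTF is fully specified in SMPTE ST 2084 (the constants below come from that specification); the gamma curve here is a simple 2.4 power law scaled to an assumed 100-nit SDR peak:

```python
# SMPTE ST 2084 (PQ) EOTF versus a conventional 2.4 power-law EOTF.
# PQ maps code values to absolute luminance up to 10 000 nits; the gamma
# curve is relative, scaled here to an assumed 100-nit SDR display.

# PQ constants from SMPTE ST 2084
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    """Normalised signal (0..1) -> absolute luminance in nits (cd/m^2)."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def gamma_eotf(signal, peak_nits=100):
    """Simple 2.4 power law scaled to an assumed peak luminance."""
    return peak_nits * signal ** 2.4

for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f}: PQ {pq_eotf(s):8.2f} nits, gamma {gamma_eotf(s):6.2f} nits")
```

A mid-scale signal of 0.5 maps to roughly 92 nits under PQ: most of the 10 000-nit range is reserved for highlights, which is the point of a perceptually spaced curve.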
How will consumers react to all this new technology, and how do you explain the new EOTFs against CRT gamma? The simplest way: brighter whites and blacker blacks.
All this has been about the consumer, but where do these ‘better’ pixels come from? A few cameras can already deliver WCG and HDR raw data at higher frame rates. Getting the images through the production pipeline is predominantly a matter of more processing power, wider bandwidth and more storage, all of which costs money. But it can all be done once transmission formats are standardised.
Some may remember 3D: little content, and those glasses, the pair they just found down the back of the sofa. Right now there is very little UHD content: some 4K movies OTT, and some pioneering sports broadcasts (BT in the UK). When 3D had its brief foray, some productions used two sets of trucks, 3D and 2D, a big cost for very few 3D viewers. Dual 4K/HD production is going to be another expensive route.
So there is much to resolve. The UHD Alliance (UHDA) has started by publishing a specification for a premium UHD receiver, which includes HDR. However, the ball really lies in the court of the content creators. It is already possible to shoot better pixels. It’s the bit in the middle that has to catch up. How do production techniques change — more wide shots, less cutting between cameras? And what of grading, when most viewers are watching legacy dynamic range?
In the playout arena, will brightness replace loudness as an area that will need legislation to prevent ad breaks being much brighter than the programmes? And what about handling all the signalling metadata to control dynamic range, colour volume and frame rate? Even now, many receivers need a manual override to display the correct aspect ratio, and that is after a couple of decades of 16:9.