Better Pixels - Where Did It Start?

Industry watchers will be analysing the latest product releases at CES to see where consumer technology is heading. For the broadcast sector, UHD and HDR featured heavily amongst the TV-related releases at CES 2017. Still wary after the failure of 3D to take TV viewing by storm, will UHD prove more successful? Early indications are that it will, especially when there are no worries about losing expensive eyewear down the back of the sofa. Through this series I take a look at acquisition through to delivery for UHD content.

At CES 2017 Samsung showed their new metal quantum dot technology featured in the QLED series of LCD receivers.

At CES 2017 receiver manufacturers were showing their latest offerings, 4K of course, with 'better pixels' a key theme. Display technology split between OLED, with deeper blacks but peak white at around 500 nits, and LCD, with higher peak luminance and enhancements like quantum dots to improve colour performance.

Samsung launched the new QLED series, based on a new metal quantum dot material. Samsung claim the LCD displays cover 100% of the DCI-P3 colour volume, with peak luminance between 1,500 and 2,000 nits.

LG had models with OLED and LCD displays. The SIGNATURE OLED TV W series majors on HDR and for audio buffs includes Dolby Atmos. The 'SUPER UHD' LCD line-up features their proprietary Nano Cell technology, which promises wider viewing angles, improved colour fidelity and reduced reflectivity for those viewing in high ambient light. LG HDR receivers support Dolby Vision, HDR-10 and HLG.

LG continues to enhance their OLED receivers.

Sony showed new LCD displays, the X930E/X940E, with Slim Backlight Drive+ to improve the dynamic range yet retain a slim profile; these again support Dolby Vision and HDR10. Sony are known for professional OLED monitors but have concentrated on LCD for consumer displays. CES saw the debut of the new Bravia A1E series of 4K HDR displays using an OLED panel.

Panasonic announced the TH-EZ1000 OLED, which is claimed to deliver twice the brightness of earlier displays and to cover the DCI-P3 colour space. It is not expected until June 2017.

To support 4K and future 8K displays, the HDMI Forum announced HDMI 2.1 at CES 2017. HDMI 2.1 adds support for 8K60, as well as 4K120 with dynamic HDR metadata. Supporting these picture formats are new 48Gb/s cables, which allow transmission of uncompressed video at these higher resolutions and frame rates.

It is clear that the CE guys see 4K with expanded colour volume and extended dynamic range as essential in a premium receiver. 

Where Did It Start?

Super Hi-Vision kicked off the biggest change in TV formats since the move to HD. Japanese broadcaster NHK started research into an ultra-high definition video system back in 1995, much in the way they pioneered HD. By 2000 they were working on a 4000-line video system with 3D audio.

In a quest for more immersive television, the initial premise was to increase the viewing angle to give more peripheral vision. That required an increase in pixel count from the 1920 × 1080 of HD to an interim 3840 × 2160 and the final target of 7680 × 4320. The format included frame rates of 60 and 120, wider colour gamut (WCG) than BT.709, and increased bit depths of 10 and 12 bits per sample. Other organisations followed the project, and the ITU eventually released BT.2020, embracing much of NHK's concepts.
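To give a sense of the data volumes these formats imply, a quick back-of-the-envelope calculation of the uncompressed bit rates is illustrative. The 4:2:2 chroma subsampling assumed here (two samples per pixel) is my own illustrative choice, not something specified in the text:

```python
def raw_bitrate_gbps(width, height, fps, bit_depth, samples_per_pixel=2):
    """Uncompressed video bit rate in Gb/s.

    samples_per_pixel=2 assumes 4:2:2 subsampling: one luma sample
    plus an average of one chroma sample per pixel (illustrative only).
    """
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

# The formats mentioned in the text
print(raw_bitrate_gbps(1920, 1080, 60, 10))   # HD 60p, 10-bit: ~2.5 Gb/s
print(raw_bitrate_gbps(3840, 2160, 60, 10))   # 4K 60p, 10-bit: ~10 Gb/s
print(raw_bitrate_gbps(7680, 4320, 120, 12))  # 8K 120p, 12-bit: ~95.6 Gb/s
```

Each step up from HD to 4K to 8K quadruples the pixel count, and the higher frame rates and bit depths multiply the data rate further, which is why compression and new interface standards become essential at these formats.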

HDR

Viewing tests by Hollywood studios and others established that although viewers appreciate high frame rate (for some genres), WCG, and increased pixel count, the improvement that stood out was brighter pictures with high dynamic range (HDR).

Since the inception of moving pictures, viewers have sat in the dark to see motion pictures, and indoors for television. The peak white levels of television were limited by CRT technology, and for cinema, the projection of 35mm film.

Once flat screen displays ousted the CRT, technologies like backlit LCDs could achieve much higher brightness levels. Peak white in a movie theatre is around 40 nits, a CRT 100 nits or so. Smartphones can achieve 700 or more nits, and LCDs designed for daylight viewing, such as digital signage, can reach as high as 5,000 nits. The displays of today are much brighter than those around when television standards first appeared. Because legacy displays had limited brightness, and viewing conditions limited black level, television had a very limited dynamic range compared to the human eye.

Companies, notably Dolby, saw that brighter displays would allow a wider dynamic range to be reproduced, which in turn would create more life-like images. Any process reproducing real images has a constrained dynamic range: printed paper, photographic prints, the projected image, or direct-view displays.

Leading photographers like Ansel Adams formalised the process of viewing a natural scene, and visualizing where the tonal values would sit in the final print. Adams could manipulate exposure and development of the negative to achieve different contrast ratios. But ultimately the photochemistry of the film emulsion has limits to the brightest highlight versus the darkest shadow that can be recorded and reproduced on photographic paper.

Tube television cameras had a limited dynamic range compared with film, and the solution was to reduce the brightness range of the scene. Powerful lights flooded the shadows to create a limited range that could be captured by the equipment of the day.

Advances in camera technology, with solid-state sensors like CCD and CMOS, now mean that the dynamic range of electronic cameras equals or even exceeds film negative: 15 stops or so.
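A stop is a doubling of light, so dynamic range in stops is simply the base-2 logarithm of the contrast ratio: 15 stops corresponds to a contrast ratio of 2^15, roughly 32,768:1. A minimal sketch of the conversion, where the 5,000 nit peak white and 0.05 nit black level are illustrative figures, not measurements from any display mentioned above:

```python
import math

def stops_to_contrast(stops):
    """Contrast ratio (brightest:darkest) spanned by a given number of stops."""
    return 2 ** stops

def contrast_to_stops(ratio):
    """Number of stops spanned by a given contrast ratio."""
    return math.log2(ratio)

print(stops_to_contrast(15))  # 32768, i.e. ~32,768:1 for a 15-stop camera

# Hypothetical display: 5,000 nit peak white, 0.05 nit black level
print(round(contrast_to_stops(5000 / 0.05), 1))  # ~16.6 stops
```

This is why the pairing matters: a camera capturing 15 stops only pays off when the display and the signal chain between them can carry a comparable range.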

With the CRT dead, brighter displays, and wide dynamic range cameras, the scene was set to join up the dots and connect the two together for an imaging system with a wider dynamic range that better matches the range of the human visual system (HVS).

In part 2 of this series, coming soon, I look at the capabilities of the HVS.
