With The NAB Show 2019 on the horizon, just for readers of The Broadcast Bridge, a top Sony expert helps us understand what will be behind their HDR display technology at this year’s techno-gala.
Now that football’s Super Bowl is in the books, it is time to look forward to our industry's annual Big Show, and Hugo Gaggioni, CTO for imaging products and professional solutions America at Sony, was good enough to give us a background explanation of some of the technologies that will be key to Sony’s 2019 presentations.
Nothing exists in our exploding world of video creation unless you can see it on a monitor, and HDR (High Dynamic Range) is on everyone’s horizon. But how does one manage HDR in a production environment that has to output both standard dynamic range and high dynamic range programming simultaneously?
To begin with, Gaggioni warns us that to fully understand the challenge of high dynamic range, we need to differentiate between HDR for production and HDR for distribution.
“Our professional group deals with both HDR for live production material and HDR for file-based cinematographic applications,” he began. “We have been dealing with the live production of HDR in Europe and Asia for five years, where it is in much more demand than here stateside. The technology we use for live production is called ‘SR Live’, for ‘Scene Referred’ Live.”
Put very simply, a camera set to “scene referred” will record middle grey at 18% of peak luminance, but our eyes, and a “display referred” monitor, will present it at roughly 45%. If the display were completely linear, middle grey would look nearly black to our eyes.
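That gap is easy to check numerically. A minimal sketch, assuming a simple power-law display gamma (the exact percentage depends on which gamma you assume):

```python
# Scene-referred vs display-referred middle grey.
# A scene-referred camera records an 18% reflectance card as 0.18 of
# peak (linear). A display-referred pipeline gamma-encodes that value.

def display_encoded(linear: float, gamma: float = 2.2) -> float:
    """Encode a linear scene value with a simple power-law display gamma."""
    return linear ** (1.0 / gamma)

middle_grey = 0.18
print(f"linear (scene referred): {middle_grey:.0%}")
print(f"encoded, gamma 2.2:      {display_encoded(middle_grey, 2.2):.0%}")  # ~46%
print(f"encoded, gamma 2.4:      {display_encoded(middle_grey, 2.4):.0%}")  # ~49%
```

With a display gamma between 2.2 and 2.4, middle grey encodes to roughly 46–49% of full scale, in the neighborhood of the 45% figure quoted above.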
Ever wonder why grey scale cards are so complicated? There is nothing simple about grey.
HDR on SDR
Our industry has a whole generation of veteran camera shaders who have perfected their skills on Standard Dynamic Range (SDR) monitors, and SDR is still the signal that pays the bills, because that is what most customers can receive.
“So broadcasters have been asking for a single workflow layer that simultaneously produces SDR and HDR,” Gaggioni said, “and we cannot afford to affect the SDR signal when creating HDR. That is why Sony developed an intermediate production process called S-Log3 which is the most appropriate OETF (Opto-Electrical Transfer Function) curve for the creation of an intermediate production format matching the sensors of our studio cameras. The utilization of S-Log3 in the production trucks and studio facilities enables a dual post production path."
That way, operators can look at a standard dynamic range monitor while processing everything in high dynamic range.
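For reference, Sony publishes the S-Log3 transfer curve; the sketch below follows the commonly published formula (constants taken from Sony’s public S-Log3 description — treat the exact figures as illustrative rather than authoritative):

```python
import math

def slog3_oetf(x: float) -> float:
    """S-Log3 OETF per Sony's published formula: scene-linear
    reflectance (0.18 = middle grey) to a normalized code value in [0, 1]."""
    if x >= 0.01125000:
        return (420.0 + math.log10((x + 0.01) / (0.18 + 0.01)) * 261.5) / 1023.0
    return (x * (171.2102946929 - 95.0) / 0.01125000 + 95.0) / 1023.0

# Middle grey lands at code 420/1023, roughly 41% of full scale --
# far brighter than the 18% a linear encoding would give it, which is
# why the signal needs correct interpretation on the monitor.
print(round(slog3_oetf(0.18), 4))  # -> 0.4106
```

This log shape is what lets one intermediate signal carry enough range to derive both SDR and HDR outputs downstream.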
U. S. broadcasters actually lag behind Europe and Asia in the use of HDR
“Here in the U. S., all of the production trucks have standard dynamic range (SDR) monitors with Rec. 709 gamma as part of their standard equipment,” Gaggioni explained, “and the S-Log3 approach will not look correct in the standard Rec.709 monitors.”
So in response to requests from several networks, Sony created a hybrid HDR signal for their camera control units called HLG Live that will display properly on a Rec.709 monitor.
“This signal conforms to the international standard ITU-R BT.2100 that describes the HLG HDR OETF,” he said, “with no need for production metadata, which accelerated its adoption in live production applications.”
At the NAB Show 2019 Sony will display the high-end BVM-X300 4K OLED master monitor that has all of the formats in its palette of display capabilities.
“We don’t use metadata in professional monitors at all,” Gaggioni said, “so it is simply a matter for the color grader to select the HDR to be graded for, from high end 4K Blu-Ray to various levels of streaming.”
Consumer HDTVs don’t have that luxury. Therefore, at the output of the production chain for delivery, Sony has to convert to one of the two HDR standards in international use for distribution: PQ (Perceptual Quantizer), also known as SMPTE ST 2084, whose several variations require their own competing implementations of metadata, and HLG (Hybrid Log Gamma), jointly developed by the BBC and NHK, which involves no additional metadata.
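The two distribution curves can be sketched directly from their published definitions — PQ maps absolute luminance to signal, while HLG maps relative scene light to signal (constants below are from SMPTE ST 2084 and ITU-R BT.2100; this is an illustrative sketch, not a reference implementation):

```python
import math

def pq_inverse_eotf(nits: float) -> float:
    """SMPTE ST 2084 (PQ): absolute luminance in cd/m^2 -> signal in [0, 1]."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def hlg_oetf(e: float) -> float:
    """ITU-R BT.2100 (HLG) OETF: scene-linear signal in [0, 1] -> signal in [0, 1]."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    return math.sqrt(3 * e) if e <= 1 / 12 else a * math.log(12 * e - b) + c

print(round(pq_inverse_eotf(100), 3))    # SDR peak, 100 nits -> ~0.508
print(round(pq_inverse_eotf(10000), 3))  # PQ peak luminance -> 1.0
print(round(hlg_oetf(1.0), 3))           # HLG peak -> 1.0
```

Note the practical difference: PQ is anchored to absolute nits (hence its reliance on mastering metadata), whereas HLG is relative and backward-compatible enough to need none.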
“Fortunately,” as Gaggioni explained, “whether the home display is produced by Sony, Samsung, LG or TCL, all of which support both HDR PQ and HLG, each has its own internal circuitry that selects which metadata is needed for that specific brand of display, or even no metadata at all.”
Sony’s Latest Flagship Monitor
This year at The NAB Show 2019, Sony will be showcasing their remarkable BVM-HX310 monitor.
This uncompromising evaluation display is designed to be the ultimate screen for critical color grading either on-set or in the studio, supporting wide color gamuts (WCG) including DCI‑P3, ITU‑R BT.2020, S‑Gamut3.cine and S‑Gamut3.
However, it should be noted that the BVM-HX310 does not cover the ITU-R BT.2020, S-Gamut/S-Gamut3 and S-Gamut3.cine color spaces in full, as is the case with all displays built with today’s technologies.
The BVM-HX310 achieves 1,000 nits of full-screen brightness with 1,000,000:1 contrast ratio.
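Taken together, those two figures imply an extremely deep black level; a quick back-of-the-envelope check:

```python
# Implied black level from the quoted peak brightness and contrast ratio.
peak_nits = 1000.0          # full-screen brightness, cd/m^2
contrast_ratio = 1_000_000  # quoted contrast ratio (1,000,000:1)

black_level = peak_nits / contrast_ratio
print(black_level)  # 0.001 cd/m^2 -- effectively black in a grading suite
```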
This monitor supports standardized Electro-Optical Transfer Functions (EOTFs) for HDR such as SMPTE ST 2084 and ITU-R BT.2100 (HLG), along with additional EOTF tables for live and post production environments, including gamma 2.4 (HDR), S-Log2 (HDR), S-Log3 (HDR) and S-Log3 (Live HDR), enabling workflows close to that of film while still delivering 4K wide dynamic range.
The Quad View Display mode of the BVM-HX310 permits instant comparison of up to four sets of custom display settings, including EOTF, Color Space, Transfer Matrix and Color Temperature, Contrast, Brightness and Chroma.
I hope this overview enhances your appreciation of these technologies, which will be on display at NAB 2019.