The Sponsors Perspective: Super FPGAs - A Cost-Effective Solution To Simultaneous HDR/SDR Production

The emergence of high dynamic range (HDR) acquisition in live and studio production has brought numerous possibilities for improved picture quality and a more immersive experience for viewers. But it has also added cost and complexity to production workflows that broadcasters and content distributors are reluctant to absorb.

This article was first published as part of Essential Guide: Live HDR Down-Conversion

Consumers are increasingly seeking HDR titles on Netflix and similar streaming services, prompting content producers to look for less costly ways to create more HDR content.

Many broadcasters’ OB fleets are now 4K UHD capable and have been used in a number of broadcasts, but many professionals are still looking for cost-effective, practical ways to shoot and distribute HDR and standard dynamic range (SDR) versions of the same 4K UHD content simultaneously. Indeed, another major challenge when implementing HDR is maintaining good backward compatibility with existing SDR displays and receivers, which is what most consumers still watch TV on.

Therefore, to support the overwhelming number of SDR TVs in the world, many productions have tested deploying separate but simultaneous workflows, shooting in SDR as well as in HDR. Besides the extra cost of a dedicated HDR crew and equipment, having two paths for the same content is a challenge for the technical director who operates the OB production switcher and now has to adjust the image for both audiences using a single camera iris control. Producers also face difficulties when they have to deliver content in multiple color spaces simultaneously.

In addition, those who produce in HDR and need to deliver both HDR and SDR content have struggled with the plethora of hardware necessary to deliver the two video streams.

Native HDR Workflows

There are several production methods that can be used to deploy HDR, but one workflow that is gaining traction is to produce content natively in HDR, whereby all the switching is accomplished in HDR and the signal is then down-converted to SDR to feed legacy displays. Most graphics and logos are created in SDR and up-converted to HDR (without any color changes) for use in the HDR program stream. This ensures the graphics look the same in both the SDR and HDR outputs.
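
The article does not say which mapping is used for the graphics up-conversion, but a common industry convention for this kind of "direct mapping" (described in ITU-R BT.2408 for HLG production) places SDR reference white at 75% of the HLG signal range while leaving hue and saturation untouched. The sketch below is purely illustrative of that convention and is not Lynx Technik's implementation:

```python
import numpy as np

def sdr_graphic_to_hlg(sdr_rgb: np.ndarray) -> np.ndarray:
    """Direct-map a normalized SDR R'G'B' graphic (0..1) into HLG signal
    range, placing SDR peak white at 75% HLG (BT.2408 convention).
    All three channels are scaled equally, so hue is preserved."""
    return np.clip(sdr_rgb, 0.0, 1.0) * 0.75

logo = np.array([1.0, 1.0, 1.0])   # SDR 100% white logo pixel
print(sdr_graphic_to_hlg(logo))    # -> [0.75 0.75 0.75]
```

Because the same fixed scale is applied on the way up, the graphic occupies the SDR-compatible part of the HLG range and survives the later HDR-to-SDR down-conversion with its colors intact.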

This native approach needs to be carefully thought out. Looking to reduce the complexity and number of external devices required for two separate signal paths, Lynx Technik has developed a new FPGA hardware-based general-purpose audio and video processing platform that can perform many different functions using a series of signal processing components (called “constellations”) in different configurations, depending on the application at hand.

The greenMachine multipurpose 4K/UHD or 3G/HD/SD quad channel video and audio processing platform.

The greenMachine

This platform is called the greenMachine, and it allows customers to configure the hardware for a specific application for as long as it is needed and then reconfigure it for a different purpose. One piece of hardware can thus serve multiple applications, each launched with a simple mouse click. Specially designed greenGUI software, available for both Windows and Mac operating systems, can control an entire greenMachine system from a central location.

The greenMachine hardware is configured by the customer using one of the pre-defined constellations. A constellation is a template, or package of features: a factory-configured combination of processing tools such as HDR conversion, frame synchronization, up/down/cross conversion, audio embedding/de-embedding, and color adjustment.

For example, the greenMachine titan hardware constellation processes four 3G/HD/SD-SDI video streams or a single 4K UHD video input. It supports up to 12G processing (3840 x 2160 UHD at 60 Hz) and can convert between single-link 4K video (12Gbps) and quad-link 4K video (2SI, 4x 3G).
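
In the two-sample interleave (2SI) quad-link format mentioned above, each 3G link carries a half-resolution picture built from alternating pairs of samples on alternating lines, rather than one quadrant of the image. A minimal NumPy sketch of that sample structure, for illustration only (real 12G/quad-link conversion as performed by the hardware also involves SDI framing, payload IDs, and timing that are omitted here):

```python
import numpy as np

def split_2si(frame: np.ndarray):
    """Split a UHD frame (H, W) into four 2SI sub-images.
    Each sub-image is half resolution in both dimensions, formed from
    alternating pairs of horizontal samples and alternating lines."""
    h, w = frame.shape[:2]
    assert h % 2 == 0 and w % 4 == 0
    # Group horizontal samples into pairs: shape (H, W//2 pairs, 2)
    pairs = frame.reshape(h, w // 2, 2)
    even_pairs = pairs[:, 0::2].reshape(h, w // 2)  # pairs 0, 2, 4, ...
    odd_pairs  = pairs[:, 1::2].reshape(h, w // 2)  # pairs 1, 3, 5, ...
    sub1 = even_pairs[0::2]   # even lines, even sample-pairs
    sub2 = odd_pairs[0::2]    # even lines, odd sample-pairs
    sub3 = even_pairs[1::2]   # odd lines, even sample-pairs
    sub4 = odd_pairs[1::2]    # odd lines, odd sample-pairs
    return sub1, sub2, sub3, sub4
```

Because every sub-image samples the whole picture, any one 2SI link can serve as a viewable quarter-resolution proxy, which is one practical advantage of 2SI over the older square-division quad-split.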

By installing different constellations (configuration of processing features) the greenMachine can be used for many different applications.

Look To The Constellations

Two other greenMachine constellations are HDR Static and HDR Evie (Enhanced Video Image Engine), which handle high dynamic range and wide color gamut conversion. They present viewers at home with more dynamic images than previously seen, even without an up-to-date HDR display.

The HDR Static and HDR Evie constellations help streamline HDR workflows. HDR Static is recommended for productions where lighting conditions remain constant, making it best suited to studio and indoor environments. HDR Evie matches HDR Static in a studio environment but delivers markedly better results outdoors, where lighting conditions change dynamically, such as in outside broadcasting.

The two solutions have their own use cases: HDR Static handles SDR-to-HDR conversion as well as HDR-to-SDR, while HDR Evie is one of the world’s first frame-by-frame HDR-to-SDR converters, applying color and contrast parameters to each individual frame. Both constellations also include features such as video adjustment, embedding/de-embedding, audio processing, and shuffling.
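
The difference between the static and frame-by-frame approaches can be illustrated with a toy tone-mapping sketch. The curve, the 99.5th-percentile peak measurement, and the nit values below are all invented for illustration; they are not Lynx Technik's algorithms:

```python
import numpy as np

def static_tonemap(frame_nits: np.ndarray, peak_nits: float = 1000.0):
    """Static conversion: one fixed curve, tuned once for an assumed
    peak level, applied identically to every frame."""
    x = frame_nits / peak_nits
    return np.clip(x / (x + 0.1) * 1.1, 0.0, 1.0)  # simple knee curve

def dynamic_tonemap(frame_nits: np.ndarray):
    """Frame-by-frame conversion: measure this frame's bright content
    and adapt the curve to it before mapping (the HDR Evie idea,
    not its actual implementation)."""
    frame_peak = max(np.percentile(frame_nits, 99.5), 1e-6)
    x = frame_nits / frame_peak
    return np.clip(x / (x + 0.1) * 1.1, 0.0, 1.0)
```

On a dim frame (say a 50-nit night scene), the static curve leaves the SDR output dark, while the per-frame version re-normalizes to the frame's own peak and preserves contrast, which is why per-frame adaptation pays off when lighting changes shot to shot.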

Hardware Outperforms Software Processing

While many have considered software-centric systems that can process signals and perform tasks in parallel, Lynx Technik argues that a software-centric approach is often expensive and time-consuming. In live production, content must be distributed in real time with no perceptible delay, and HDR/SDR processing demands high computational throughput with very low latency. Software running on GPUs offers slower, less deterministic processing than FPGAs, and FPGAs provide ample fast on-chip memory, which reduces the bottlenecks associated with external memory access.

The greenMachine hardware devices use the latest high-speed programmable Xilinx FPGA technology and dual ARM processors. The overall HDR/SDR processing delay is one frame in both the HDR Static and HDR Evie constellations, making the platform an ideal solution for simultaneous SDR/HDR live event production.

Adding to the flexibility of the greenMachine for customers, the constellations are sold as licenses that can be activated or scaled back as needed. When more than one greenMachine is connected on a network, a license can be shared among them; however, only one machine can have a given license active at a time. This licensing model has proven invaluable to content distributors in other areas of the content chain and looks to have equal value for live production.
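
This is essentially a floating-license scheme. A toy model of the check-out/release semantics described above (illustrative only; the vendor's actual licensing protocol is not documented here):

```python
class LicensePool:
    """Toy floating-license pool shared by networked machines:
    each license may be active on at most one machine at a time."""

    def __init__(self, licenses):
        self.free = set(licenses)   # licenses not currently in use
        self.active = {}            # license name -> machine holding it

    def activate(self, name: str, machine: str) -> bool:
        """Try to check a license out to a machine."""
        if name not in self.free:
            return False            # unknown or already active elsewhere
        self.free.remove(name)
        self.active[name] = machine
        return True

    def release(self, name: str, machine: str) -> None:
        """Return a license to the pool (only by its current holder)."""
        if self.active.get(name) == machine:
            del self.active[name]
            self.free.add(name)
```

Only when the first machine releases the constellation license can a second machine activate it, which matches the one-active-machine rule described above.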

The Future Of HDR In Live Production Is Certain

There’s no doubt that HDR delivers a “wow” factor in live production such as sports; some think an HD HDR image looks equal to a 4K UHD picture. That’s why most experts expect to see more HDR video projects going forward, especially HD HDR, since operators can reuse their already deployed 3G SDI infrastructures. Due to the extra cost involved, larger mobile production companies will move to UHD HDR sooner than smaller firms, but it is coming to projects of all sizes. It is only a matter of when business models and technology strategies combine in a practical way.
