Cinegy Explains Its 16K Video Codec

Moore’s Law has finally hit the broadcast industry, where IT technologies are taking over and making 4K a breeze. At the IBC 2015 show in Amsterdam it was equally apparent that working in 8K for postproduction and even broadcast streaming is inevitable, with tools from cameras to editing software and HEVC encoding already primed or in development to handle it. No one, though, was envisioning a world beyond 8K—except German software developer Cinegy. It demonstrated a codec that it claimed could decode a Hollywood movie in a second and that is built to manage 16K data rates today.

"We realised that existing codecs are pretty much useless in environments beyond 4K," explained Cinegy CEO and co-founder Jan Weigner. "You need a ridiculous box of kit to service just a single channel. Yet the industry is already talking 8K, and the bandwidth of existing devices that we can envision in 4-5 years wouldn't accept that.

"So we decided to confront this problem by designing a mezzanine codec for acquisition and production from scratch which would render this vision today with existing off-the-shelf hardware. This is the only way to play professional-quality 8K streams on commodity hardware or even a consumer laptop today."

Jan Weigner, CTO, Cinegy

The DANIEL2 codec can decode up to 1,100 frames per second at 8K (7680x4320, or 16x HD resolution), which translates into over 4,300 fps at 4K or 17,000 frames of full HD per second. It is specified to handle 16K at 280 fps. The compression ratio is stated as between 1:10 and 1:20 when working with 8K.
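As a rough sanity check, those quoted frame rates all correspond to roughly the same pixel throughput, so the 4K, HD and 16K figures are essentially the 8K decode rate rescaled by pixel count. The short Python sketch below works that out; the 16K dimensions of 15360x8640 are an assumption rather than a figure stated by Cinegy.

```python
# Back-of-the-envelope check that the quoted decode rates describe a single
# pixel-throughput budget. Resolutions are standard 16:9 figures; the 16K
# dimensions are assumed, not stated in the article.
RESOLUTIONS = {
    "HD":  (1920, 1080),
    "4K":  (3840, 2160),
    "8K":  (7680, 4320),
    "16K": (15360, 8640),
}
QUOTED_FPS = {"HD": 17000, "4K": 4300, "8K": 1100, "16K": 280}

for name, (width, height) in RESOLUTIONS.items():
    gigapixels_per_second = width * height * QUOTED_FPS[name] / 1e9
    print(f"{name:>3}: {QUOTED_FPS[name]:>6} fps -> {gigapixels_per_second:4.1f} Gpixel/s")

# All four land in the same ~35-37 Gpixel/s band.
```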

"The performance secret is that this is architected and developed from the ground up to be GPU-based," explained Weigner. "It is very conservative with GPU memory bandwidth, leaving compute resources for other tasks."

The demo at IBC showed the codec decoding multiple 8K streams along with multiple 4K streams while performing realtime compositing, colour correction, scaling and titling, with the results displayed in realtime in 8K. The hardware platform was an Intel quad-core i7-6700K processor with an Nvidia GTX 980 Ti or Quadro M6000 graphics card.

"A problem faced when designing 4K, 8K—or soon 16K systems—that need to handle multiple streams and that need to manipulate them in real time, is that even if you could decode the streams using the CPU—which you cannot—then you'd probably still want to use the power of the GPU for effects and filters," explained Weigner. "Now you face the bottleneck of the system bus to transfer the decoded streams into the GPU's memory.

"This where DANIEL2 comes in," he continued. "Streams a fraction of the size of their uncompressed counterparts are read from disk or via the network and passed to the GPU to be decompressed faster than the uncompressed frames can be copied. So we can achieve less bandwidth of the system bus being used, less space or bandwidth consumed on disk or the network. I could decode dozens of 8K streams and still have enough power left for all the video processing like chroma keying and effects. This power means I can work with dozens of 4K stream on a laptop today."

Cinegy may be targeting its DANIEL2 codec at cloud applications. Said Cinegy CTO Weigner, "I can spin up a channel from AWS not in days, hours, or minutes, but in seconds."

DANIEL2's main use is for recording from camera sources, editing, and postproduction as well as playout. "We have had interest from camera manufacturers particularly where slow motion cameras need to capture hundreds or thousands of frames a second," said Weigner.

"We are aiming for the same space as AVID DnxHR, Apple's ProRes, or Sony XAVC," he said. "We could put this in a MXF wrapper and standardise it. We are not after the HEVC distribution codec. DANIEL2 could go all the way to playout where finally you turn the stream into a distributed channel and H.264 and HEVC can kick in."

The first-generation DANIEL codec was developed with the specific purpose of being an RGBA codec—the A standing for alpha. "The aim was to provide a better, easier way to deal with video with an alpha mask for overlays and keying," said Weigner. "This can be done with other codecs like ProRes or DNxHD, but these always consume a fixed bitrate even if there is actually not much to encode. We found people were using the DANIEL codec for other purposes such as 4K encoding and playback as it is much lighter on the CPU than comparable codecs."

This, he said, prompted Cinegy to develop a GPU-focussed second iteration. DANIEL2 is being made available as an SDK as well as AVI and QuickTime codecs to permit integration with Adobe Premiere, After Effects, Avid Media Composer, Vizrt and other popular applications.

"Eventually we are looking at powering this with a server the size of a cigarette box," he said.

The Munich-based developer's messaging at IBC targeted Imagine Communications. "Don't Imagine Cloud Playout—It's Real," screamed the posters.

"Two years ago Imagine had no solutions in this area at all—they had to completely rewrite everything," said Weigner. "We have been doing cloud playout for years. I can spin up a channel from AWS not in days, hours, or minutes, but in seconds."

Its Cinegy Air PRO provides a broadcast automation front-end and a real-time video server for SD, HD and/or 4K playout in an integrated software suite.

Weigner proceeded to demonstrate playout of a video encoded in H.264 using Nvidia hardware, launched from AWS and streamed back to the Cinegy booth in, indeed, a matter of seconds.
