Haivision Upgrades Low Latency Video Encoder

Haivision has upgraded its flagship ultra-low latency video encoder, the Makito X4, which incorporates the SRT (Secure Reliable Transport) protocol that the company invented and later open sourced.

This is the third major update since the encoder's launch. It combines software-defined features with a dedicated hardware platform to shave up to 25% off latency, and adds frame-accurate multi-stream delivery for remote broadcast transmission over IP.

“Today’s broadcast networks depend on the transmission of real-time video from the field to production centers or cloud production services even when faced with limited bandwidth,” said Peter Maag, Haivision’s CMO. “By squeezing the highest picture quality out of every bit transmitted and enabling creatives to produce multiple streams in real-time, Haivision is addressing this core industry challenge.”

The specific improvements in the newest release of the Makito X4 encoder include enhanced video quality through support for High Dynamic Range (HDR) and Wide Color Gamut (WCG) encoding for video contribution. There is also support for the HEVC codec over the RTSP protocol, which can further reduce bandwidth.

Latency is lowered by up to 25% by exploiting the latest slice-based encoding techniques. Slicing is a technique introduced with the H.264/MPEG-4 codec and enhanced subsequently, in which each frame is divided into spatially distinct regions called slices that can be encoded separately in parallel, reducing processing time. There is also improved timing precision, enabling synchronized multi-camera streaming for real-time on-premises or cloud-native live broadcast production.
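The parallelism that slicing enables can be illustrated with a minimal sketch. This is not Haivision's implementation: the frame data, the slice count, and the use of `zlib` as a stand-in for a real H.264/HEVC slice encoder are all assumptions for illustration only. The point is the structure: each slice is a spatially distinct band of the frame, encoded independently on its own worker.

```python
# Illustrative sketch only -- zlib stands in for a real video slice
# encoder, and the frame is synthetic test data, not actual video.
import zlib
from concurrent.futures import ThreadPoolExecutor

NUM_SLICES = 4  # hypothetical slice count per frame


def make_frame(size: int = 32400) -> bytes:
    """Stand-in for the raw pixel data of one video frame."""
    return bytes((x * 7 + 13) % 256 for x in range(size))


def encode_slice(slice_data: bytes) -> bytes:
    """Stand-in for encoding one slice; each slice is independent."""
    return zlib.compress(slice_data)


def encode_frame_sliced(frame: bytes, num_slices: int) -> list[bytes]:
    """Split a frame into row bands ('slices') and encode in parallel."""
    step = len(frame) // num_slices
    bands = [frame[i * step:(i + 1) * step] for i in range(num_slices)]
    # Because slices have no dependencies on each other, they can be
    # dispatched to parallel workers, cutting per-frame encode time.
    with ThreadPoolExecutor(max_workers=num_slices) as pool:
        return list(pool.map(encode_slice, bands))


frame = make_frame()
encoded = encode_frame_sliced(frame, NUM_SLICES)
print(len(encoded))  # one independently decodable payload per slice
```

A decoder can likewise decompress the slices independently and reassemble the frame, which is what makes slice-level parallelism useful at both ends of a low-latency link.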

The stage was set for this upgrade by the publication, just over a month ago, of Haivision’s second report on the state of IP and cloud adoption, two years after the first in 2019. Both surveys reported the transition to IP as the greatest broadcast challenge, cited by 42% as a major issue in the second report. This time, however, it was almost matched by the challenge of enabling remote collaboration, cited by 41% and reflecting the impact of the global Covid-19 pandemic. As Haivision’s Content Marketing Manager Lina Nikols pointed out in a blog, this most likely reflects the speed at which workflows for newly distributed workforces had to be designed and adapted.

Original REMI (Remote Integration Model) and at-home live production models were designed to send fewer personnel and less equipment to the field, concentrating talent and production resources at a central location instead, Nikols added. “Although the field contribution component is still relevant, production is no longer centralized as production staff, editors, directors, operators, and on-screen talent remain at home yet still need to be able to collaborate in real-time.”

IP video streaming combined with software-defined and cloud-based video production tools has enabled broadcast professionals to continue to do their job no matter where they are, Nikols asserted. “IP video streams can be used for all types of live production workflows including broadcast contribution, return feeds, bi-directional interviews, and broadcast monitoring. As a result, production facilities around the world are being decentralized.”

Haivision is best known for the SRT protocol, which is now managed by the SRT Alliance. The Alliance’s advisory group comprises Microsoft, Telestream and Avid, supported by numerous other major members such as Comcast, AWS Elemental, Alibaba Cloud, Harmonic, MediaKind, Wowza, Bitmovin and Grass Valley.
