Haivision Upgrades Low Latency Video Encoder

Haivision has upgraded the Makito X4, its flagship ultra-low-latency video encoder, which incorporates the SRT (Secure Reliable Transport) protocol the company invented and then open sourced.

This is the third major update since launch. It combines software-defined features with a dedicated hardware platform to shave up to 25% off latency, and adds frame-accurate multi-stream delivery for remote broadcast transmission over IP.
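For readers who want a feel for what field contribution over SRT looks like in practice, here is a minimal sketch that pushes a synthetic test stream over SRT by driving FFmpeg from Python. It assumes an FFmpeg build with libsrt, and the address, port and encoder settings are placeholders rather than anything specific to the Makito X4.

```python
# Minimal sketch: push a synthetic test picture over SRT using FFmpeg.
# Requires an FFmpeg build with libsrt; the address, port and encoding
# parameters below are illustrative placeholders only.
import subprocess

SRT_URL = "srt://203.0.113.10:9000?mode=caller"  # caller connects to a listener at the production center

cmd = [
    "ffmpeg",
    "-re",                                                   # read input at native frame rate, like a live source
    "-f", "lavfi", "-i", "testsrc2=size=1280x720:rate=50",   # synthetic test pattern as the "camera"
    "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
    "-b:v", "4M",
    "-f", "mpegts",                                          # SRT commonly carries an MPEG-TS multiplex
    SRT_URL,
]

subprocess.run(cmd, check=True)
```

In caller mode the sender initiates the connection to a listener at the receiving site; SRT then handles packet retransmission and encryption so the feed can cross the public internet.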

“Today’s broadcast networks depend on the transmission of real-time video from the field to production centers or cloud production services even when faced with limited bandwidth,” said Peter Maag, Haivision’s CMO. “By squeezing the highest picture quality out of every bit transmitted and enabling creatives to produce multiple streams in real-time, Haivision is addressing this core industry challenge.”

The specific improvements in the newest release of the Makito X4 encoder include enhanced video quality through support for High Dynamic Range (HDR) and Wide Color Gamut (WCG) encoding for video contribution. There is also support for the HEVC codec over the RTSP protocol, which reduces bandwidth requirements further.
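As a rough illustration of what HDR and WCG signalling involves at the codec level, the sketch below tags an HEVC encode with BT.2020 primaries and the PQ (SMPTE ST 2084) transfer function using FFmpeg's libx265 encoder, driven from Python. The file names, bitrate and pixel format are illustrative assumptions; this is not a description of the Makito X4's own pipeline.

```python
# Illustrative only: encode a file to HEVC and tag it with BT.2020 / PQ (SMPTE ST 2084)
# colour metadata via FFmpeg's libx265 encoder. Paths and bitrate are placeholders.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "input_hdr_source.mov",        # hypothetical 10-bit HDR master
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le",           # 10-bit pixel format needed for HDR
    "-b:v", "10M",
    "-color_primaries", "bt2020",        # Wide Color Gamut primaries
    "-color_trc", "smpte2084",           # PQ transfer characteristic (HDR)
    "-colorspace", "bt2020nc",
    "output_hevc_hdr.mp4",
]

subprocess.run(cmd, check=True)
```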

Latency is lowered by up to 25% by exploiting the latest slice-based encoding techniques. Slicing, introduced with the H.264/MPEG-4 codec and enhanced subsequently, divides each frame into spatially distinct regions called slices that can be encoded separately in parallel, reducing processing time. There is also improved timing precision, enabling synchronized multi-camera streaming for real-time on-premises or cloud-native live broadcast production.
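The parallelism that slicing buys can be shown with a toy sketch: the Python below divides a frame buffer into independent horizontal slices and compresses them concurrently, with zlib standing in for a real video encoder. It is purely conceptual; the Makito X4's hardware slice pipeline is not publicly documented at this level of detail.

```python
# Toy illustration of slice-based encoding: split a frame into independent horizontal
# slices and compress them in parallel. zlib stands in for a real video encoder here.
import zlib
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT, SLICES = 1920, 1080, 4
BYTES_PER_ROW = WIDTH * 3  # assume 8-bit RGB for simplicity

def encode_slice(slice_bytes: bytes) -> bytes:
    # Each slice is self-contained, so it can be processed without waiting for the others.
    return zlib.compress(slice_bytes, level=1)

def encode_frame(frame: bytes) -> list[bytes]:
    rows_per_slice = HEIGHT // SLICES
    slice_size = rows_per_slice * BYTES_PER_ROW
    slices = [frame[i * slice_size:(i + 1) * slice_size] for i in range(SLICES)]
    with ProcessPoolExecutor(max_workers=SLICES) as pool:
        return list(pool.map(encode_slice, slices))

if __name__ == "__main__":
    dummy_frame = bytes(WIDTH * HEIGHT * 3)   # black frame as placeholder pixel data
    encoded = encode_frame(dummy_frame)
    print([len(s) for s in encoded])          # per-slice compressed sizes
```

Because each slice is self-contained, work on one region of the picture can begin, and in a real encoder be transmitted, without waiting for the rest of the frame, which is where the latency saving comes from.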

The stage was set for this upgrade by the publication of Haivision’s second report on the state of IP and cloud adoption over a month ago, two years after the first in 2019. Both surveys identified the transition to IP as the greatest broadcast challenge, cited by 42% of respondents as a major issue in the second report. However, this time it was almost matched by the challenge of enabling remote collaboration, cited by 41% and reflecting the impact of the global Covid-19 pandemic. As Haivision’s Content Marketing Manager Lina Nikols pointed out in a blog, this most likely reflects the speed at which workflows for newly distributed workforces had to be designed and adapted.

The original REMI (Remote Integration Model) and at-home live production models were designed to send fewer personnel and less equipment to the field, instead concentrating talent and production resources at a central location, Nikols added. “Although the field contribution component is still relevant, production is no longer centralized as production staff, editors, directors, operators, and on-screen talent remain at home yet still need to be able to collaborate in real-time.”

IP video streaming combined with software-defined and cloud-based video production tools has enabled broadcast professionals to continue to do their job no matter where they are, Nikols asserted. “IP video streams can be used for all types of live production workflows including broadcast contribution, return feeds, bi-directional interviews, and broadcast monitoring. As a result, production facilities around the world are being decentralized.”

Haivision is best known for the SRT protocol, which is now managed by the SRT Alliance. Its advisory group comprises Microsoft, Telestream and Avid, supported by numerous major companies in the lower membership tiers, such as Comcast, AWS Elemental, Alibaba Cloud, Harmonic, MediaKind, Wowza, Bitmovin and Grass Valley.
