Integrating NDI And ST 2110 For Internet Streaming

The focus of much of the latest broadcast TV R&D is the Remote Integration Model (REMI). From millions of Skype meetings over consumer ISPs to the recent Winter Olympics TV broadcasts, REMI is significantly changing the internal dynamics of live, glass-to-glass, remote TV production and viewing.

The digital race between higher video resolutions, frame rates, deeper bit depths and COTS computers and network speeds has been neck and neck since before NewTek announced its first Video Toaster in 1987. In terms of streaming IP video, the broadcast industry is constantly pushing the limits of opposite extremes simultaneously.

At one end of the spectrum is uncompressed IP video requiring 1GbE or better network bandwidth, such as SMPTE ST 2110 and NewTek's royalty-free NDI protocol. At the other end is compressed video, encoded with codecs such as H.264 and HEVC and carried over protocols such as SRT, RIST, and RTMP, generally intended for home TV viewing and remote production venues with public ISP bandwidth issues and data caps.

Haivision released the Secure Reliable Transport (SRT) open source protocol at the 2017 NAB Show. SRT is supported in OBS Studio and VLC media player. Reliable Internet Stream Transport (RIST) was developed by the Video Services Forum as an open, interoperable alternative to proprietary technologies from Zixi, VideoFlow, QVidium, and DVEO. RIST Simple Profile was published in 2018, and RIST Main Profile followed in March 2020. SRT and RIST were both developed to improve the quality of video streamed over the internet.

Both bandwidth extremes, from the >1 Gb/s flows of ST 2110 and full NDI down to compressed streams under 10 Mb/s, are important to the future of broadcasting because strong markets exist for each.

Uncompressed IP Protocols
One of the first SMPTE high bit rate, uncompressed IP video specifications was ST 2022-6, published in 2012. It packetizes a full SDI raster and sends it to one IP destination. The newer ST 2110 standards are built on ST 2022-6 and the Video Services Forum (VSF) Technical Recommendation for Transport of Uncompressed Elementary Stream Media Over IP (TR-03), which VSF agreed to make available to SMPTE. The AIMS Alliance of manufacturers and engineers formed to support the standardization of TR-03 and ST 2110.

The bandwidth of the original SMPTE ST 2022-6 at 1080p50 is approximately 3 Gbps. The bandwidth of a similar video signal processed in ST 2110 is approximately 30% less, or approximately 2.2 Gbps. The key to the lower figure is that ST 2110 sends only the active part of the image. Both are uncompressed video.
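The roughly 30% saving can be checked with back-of-envelope arithmetic, here sketched in Python. The figures ignore RTP and IP packet overhead, which is why published numbers run slightly higher than the raw payload rate:

```python
# Why ST 2110-20 needs ~30% less bandwidth than ST 2022-6 for
# 1080p50 10-bit 4:2:2 video (payload only, packet overhead ignored).

BITS_PER_PIXEL = 20  # 10-bit 4:2:2: 10 bits luma + 10 bits chroma per pixel
FPS = 50

# ST 2022-6 packetizes the full SDI raster, blanking included.
total_samples_per_line, total_lines = 2640, 1125  # 1080p50 3G-SDI raster
full_raster_bps = total_samples_per_line * total_lines * FPS * BITS_PER_PIXEL

# ST 2110-20 carries only the active picture.
active_bps = 1920 * 1080 * FPS * BITS_PER_PIXEL

print(f"ST 2022-6 (full raster): {full_raster_bps / 1e9:.2f} Gb/s")  # 2.97 Gb/s
print(f"ST 2110-20 (active only): {active_bps / 1e9:.2f} Gb/s")      # 2.07 Gb/s
print(f"Savings: {1 - active_bps / full_raster_bps:.0%}")            # 30%
```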

ST 2110 uses the Networked Media Open Specifications (NMOS) to make 2110-based infrastructure manageable. The specifications describe how devices on a network can discover each other and the available streams. Automating the connectivity configuration of devices makes system integration easier, though still not as easy as NDI.

The SMPTE ST 2110 suite includes several specific technical elements. ST 2110-20/21 is for video: -20 transports uncompressed active video, while -21 defines the traffic-shaping and timing model that prevents high-bitrate video streams from congesting IP network "pipes".

ST 2110-30/31 is for audio. It transports uncompressed PCM and compressed or uncompressed AES3 audio over IP networks. ST 2110-40 maps ancillary data packets into Real-Time Transport Protocol (RTP) packets that are transported via UDP/IP, and it enables those packets to be moved in sync with the associated video and audio essence streams. ST 2110-10 is for system timing. It uses PTP (IEEE 1588-2008) to distribute precise, GPS-referenced time over an IP network. PTP is used to synchronize and align devices, media signals, and media types at any point along the broadcast chain.
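Receivers learn a 2110-20 stream's parameters from an SDP description. The illustrative fragment below shows the typical shape of one; the addresses, ports, and session values are examples, not from any real system:

```
v=0
o=- 123456 123456 IN IP4 192.168.1.10
s=Example ST 2110-20 video
t=0 0
m=video 5004 RTP/AVP 96
c=IN IP4 239.1.1.1/64
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; depth=10; exactframerate=50; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017
a=ts-refclk:ptp=IEEE1588-2008:traceable
a=mediaclk:direct=0
```

Note how the `ts-refclk` attribute ties the stream back to the ST 2110-10 PTP timing described above.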

ST 2110 benefits include flexible workflows that enable independent work on video, audio, and data streams. It is agile because it isolates the essence flows, and it scales because it relies on IP infrastructure that does not need multiple network stacks. Separating essences removes the need to transmit and demux components not required for a given stage of processing, which would add unnecessary latency.

Its format-agnostic technology also assures interoperability of multi-vendor IP solutions, and it uses bandwidth efficiently when transporting uncompressed video.

Network Device Interface (NDI) provides a de facto industry standard for lightly compressed video transport across a GbE LAN, the same class of network on which SMPTE ST 2110 commonly runs. The difference is that NDI was the vision and design of NewTek and is royalty-free to encourage its adoption across the industry. Much of NDI's robustness, as with ST 2110, depends on the network and switch.

The NDI encoding algorithm is resolution and frame rate independent, supporting up to 4K and beyond, as well as multi-channel, floating-point audio up to 16 channels and beyond. NDI also includes tools to implement video access and grouping, bi-directional metadata, tally, and more.

The newest suite of tools for NDI 5 is downloadable for free, and it provides several items that make NDI more useful. Tools include NDI Studio Monitor, NDI Test Patterns, and NDI Screen Capture to make a computer screen an NDI source. It also includes NDI Access Manager to control who sees the NDI sources on the network, and NDI Audio Direct for connection to some digital audio workstations. NDI Remote shares or receives an NDI source over the internet, and NDI Bridge, still in beta, makes NDI and NDI|HX work across the internet. NDI 5 also provides NDI tools for Adobe After Effects, Premiere and VLC.

Five NDI 5 updates have followed since NDI 5 debuted in early June. The current NDI 5 version is 5.0.5. Work continues on NDI 5 Bridge.

New Compression Schemes
In a real world without universal >1 Gb/s internet service, video compression can make video easier to transport on the public internet. Compression is about trade-offs, and compressed video degrades with each generation of encoding and decoding. In many one-time REMI production budgets, compression can't be avoided. Both NDI and ST 2110 have taken steps to provide compression when necessary.

NDI|HX is the latest version of compressed NDI, enhanced to open NDI to a whole new world of devices using built-in H.264 compression chips. NDI|HX will eventually allow more hardware devices to support the NDI standard. The highlight of NDI|HX is its low bandwidth requirement. In addition to suiting public internet connections, it lets users on simple Gigabit networks host multiple video streams without flooding the network with traffic.
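The difference in stream counts is easy to estimate. The sketch below assumes ballpark per-stream rates (roughly 125 Mb/s for full NDI at 1080p and roughly 20 Mb/s for NDI|HX; neither is a published specification) and reserves headroom for other traffic:

```python
# Rough capacity check: how many streams fit on one GbE link,
# using assumed (not official) per-stream bitrates.
link_mbps = 1000           # Gigabit Ethernet
usable = link_mbps * 0.7   # keep ~30% headroom for other traffic

for name, per_stream_mbps in [("NDI (full)", 125), ("NDI|HX", 20)]:
    count = int(usable // per_stream_mbps)
    print(f"{name}: ~{count} streams per GbE link")
```

With these assumptions, a single link carries a handful of full NDI streams but dozens of NDI|HX streams.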

NDI held a developer’s conference in July 2021 to announce the NDI Plus Developer Hub, a support portal tailored to the community of software developers who use NDI in their products. Many developers are facing the same kinds of challenges during their development process. NewTek wants to apply the same community-focused thinking that made NDI a global standard. The Hub allows developers to ask questions in a threaded format, moderated and answered by the NDI support team, and it creates a community around the NDI Developer Kit.

ST 2110-22 is an IP transport format for constant-bitrate compressed video that defines the key requirements for carrying compressed video essence: a constant bitrate, a defined RTP payload, and latency low enough to satisfy the needs of live TV production.

The majority of SMPTE ST 2110-22 implementations use the JPEG XS lightweight low latency compression standard. The XS stands for eXtra Small and eXtra Speed, and it is supported by the TICO Alliance. According to TICO, "The JPEG XS mezzanine codec standard can be applied wherever uncompressed video is currently used."

According to Evertz Microsystems, the JPEG XS codec is visually lossless at a compression ratio of 8:1 and has only a few video lines of latency. The first implementations of JPEG XS appeared at the 2019 NAB Show.
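Applying that quoted 8:1 ratio to the roughly 2.07 Gb/s of active 1080p50 10-bit 4:2:2 video shows why JPEG XS fits comfortably on a GbE link, as this quick calculation sketches:

```python
# Effect of the quoted 8:1 visually lossless JPEG XS ratio on 1080p50 video.
uncompressed_bps = 1920 * 1080 * 50 * 20   # active picture, 10-bit 4:2:2
ratio = 8
compressed_bps = uncompressed_bps / ratio
print(f"~{compressed_bps / 1e6:.0f} Mb/s after 8:1 compression")  # ~259 Mb/s
```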

World Of New Solutions
Currently, NewTek has nearly 80 partners on its developer's list. Many are introducing, or have recently introduced, new NDI products that bridge NDI and internet WANs for a growing market of needs and budgets. An example is AJA Video Systems' BRIDGE LIVE with the latest v1.12 firmware, available now with both NDI and HLS support.

BRIDGE LIVE is for encoding, decoding and transcoding in real time, to and from a range of codecs including H.264, HEVC, MPEG-2, MPEG-TS, and JPEG 2000, and protocols such as SRT, HLS, RTMP and UDP. It has 4x 12G-SDI I/O ports for signals up to one UHD stream for real-time transport. The recently released v1.12 adds bi-directional NDI input, output and transcode, HTTP Live Streaming (HLS) output, video preview, and UI updates. It also has dual built-in 10GbE network connections.

AJA Video Systems also makes solutions compliant with SMPTE ST 2110. Nick Rashby, President, AJA Video Systems recently said, “This is our first NDI-enabled product, and we couldn’t be more excited to bring the unique capabilities of BRIDGE LIVE to the huge range of NDI users around the world."

The worlds of SDI, IP, and streaming are rapidly merging. A tool like BRIDGE LIVE will massively simplify the conversions for this video transport, all in real time, opening a plethora of pipelines and workflows for video professionals across a wide range of verticals.
