NDI For Broadcast: Part 1 – What Is NDI?
This is the first in a series of three articles examining NDI and its place in broadcast infrastructure.
This is not a series supported by NDI and doesn’t seek to promote the use of NDI. NDI is a proprietary technology and a commercial product (NDI™ is a registered trade mark of Vizrt NDI AB) and as a rule The Broadcast Bridge leaves vendors to promote proprietary technologies and the use of specific products. NDI is also not a standard.
For the nine years or so since its release the go-to-market strategy for NDI seems to have been all about making it free and easy to use for end users, and making the basic features free for developers. It’s been a worthwhile tactic and has led to a massive amount of adoption, especially by AV, Houses of Worship, YouTube Streamers and other non-broadcast media production environments.
Like so many other proprietary technologies in the history of media production, NDI has seen pragmatic adoption by a number of broadcasters despite its non-standards-based approach, because it seems to offer some benefits. An increasing number of high-profile, large-scale broadcast vendors are also now incorporating NDI as a format worth exchanging essence and data with, alongside other, standards-based formats. Some vendors also seem to be using NDI as the IP infrastructure when creating products aimed at new non-broadcast markets.
NDI seems to have found its place in broadcast, so it feels timely for The Broadcast Bridge to discuss it. This series will describe the technology, its features, potential benefits and potential limitations. It will discuss how broadcasters are – and are not – using it, and its place in the wider broadcast infrastructure.
A Brief History Of NDI
The broadcast industry loves cables.
It loves video cables, audio cables, power cables, control cables, multicore cables, ethernet cables. But it doesn’t love their limitations; the sheer physicality of running so many different cables is one thing, but signal degradation over longer cable runs can be a crippling limitation, and running cables through couplers or extenders can ultimately make things worse.
NDI was developed as a way to do away with all these cables and use just one to deliver audio, video, metadata, and control signals across standard local and wide area IP networks. It aimed to do everything at a minimum of 4K, with low latency and bidirectional transport, and to reduce the amount of physical hardware required to plug in all those signals.
The goal was also to make it as easy as possible, with no external sync, no degree in IP networking required, and no head scratching.
That was the aim in 2015, at a time when broadcasters were still primarily reliant on SDI to move audio and video signals. Making its debut at IBC that same year, NewTek’s Network Device Interface (NDI) began as a way to enable NewTek’s customers (now Vizrt’s) to increase the number of inputs into the company’s software-based TriCaster media-suite without having to invest in a load more physical BNC connectors and cabling.
Designed to send audio, video, and metadata signals over standard IP networks in real-time, it used existing LAN network connections to enable different multimedia systems to communicate with each other over a standard ethernet cable, and encode, transmit, and receive frame-accurate video and audio along with associated metadata.
It also had no need to rely on Precision Time Protocol (PTP) for synchronization, which made it quick to deploy and presented a relatively simple way to use IP infrastructure. On the receiver side a user just needed to look for the source and tune it in; there was no control layer, and users could make use of NDI without even knowing the IP address of the sender.
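That "just look for the source and tune it in" experience rests on zero-configuration service discovery: NDI sources are generally understood to announce themselves on the local network via mDNS/DNS-SD, which is why a receiver never needs to be told an IP address. The Python sketch below is a minimal, hedged illustration of that idea using the third-party zeroconf package; the `_ndi._tcp.local.` service type and the fields printed are assumptions to be checked against the NDI SDK documentation, not a definitive recipe.

```python
# A hedged sketch: browse the local network for NDI senders via mDNS/DNS-SD.
# Assumes NDI sources advertise the "_ndi._tcp.local." service type;
# requires the third-party 'zeroconf' package (pip install zeroconf).
from zeroconf import Zeroconf, ServiceBrowser, ServiceListener

NDI_SERVICE_TYPE = "_ndi._tcp.local."  # assumed NDI discovery service type


class NDIListener(ServiceListener):
    def add_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        # Resolve the advertised service to an address and port.
        info = zc.get_service_info(type_, name)
        if info:
            print(f"Found NDI source: {name} at {info.parsed_addresses()}:{info.port}")

    def remove_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        print(f"NDI source went away: {name}")

    def update_service(self, zc: Zeroconf, type_: str, name: str) -> None:
        pass  # metadata updates are ignored in this sketch


if __name__ == "__main__":
    zc = Zeroconf()
    browser = ServiceBrowser(zc, NDI_SERVICE_TYPE, NDIListener())
    try:
        input("Browsing for NDI sources; press Enter to stop...\n")
    finally:
        zc.close()
```

In a real deployment the NDI SDK (or NDI Discovery Server on networks where mDNS is blocked) handles this for the user; the point of the sketch is simply that discovery rides on the network's existing zero-configuration machinery rather than on a separate control layer.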
Growing The Ecosystem
NDI made its software development kit (SDK) freely available to encourage hardware and software vendors to grow a wider interoperable ecosystem. The basic SDK remains free for vendors, but deploying the full feature set of NDI requires use of the ‘Advanced SDK’, for which vendors pay license fees. A simple way for devices to find each other on a standard IP network, the ability to control devices like PTZ cameras, and the promise of interoperability between different vendors’ devices (providing the vendor has implemented NDI, of course) meant it didn’t take long for developers to get on board. The result has been steady growth in a fairly diverse ecosystem of NDI-compliant hardware and software devices.
Software-based video switcher vMix had signed up by NAB 2016, and later that year Magewell announced support for its PCIe capture cards. In June 2017, a free third-party NDI plugin for OBS Studio was released on Mac, PC and Linux computers, and tools for Adobe and VLC followed, as did integration with online meeting packages like Zoom and Teams. That same year, NDI 3.0 added support for NDI HX (more on that later) and multicast, encouraging PTZ camera companies like PTZOptics to get on board. Sony, Panasonic, and Canon all now offer NDI-enabled PTZ cameras.
NDI says there are now more than 600,000 NDI-enabled devices and its website features an ecosystem of equipment ranging from audio, communications and control systems to switchers and varifocal cameras.
Not A Standard
As most will be aware, NDI wasn’t the first IP video specification to appeal to broadcasters. SMPTE ST 2022 pre-dates NDI by eight years. Launched in 2007, ST 2022 was developed specifically as an IP transport standard, and by 2012 ST 2022-6 had added a method for carrying uncompressed video. That work eventually fed into the ST 2110 suite of standards, which was (and still is) a very different proposition to NDI.
One of SMPTE ST 2110’s fundamentals is that it separates video, audio, and data into separate streams, and it uses PTP for the synchronization and alignment of everything in the broadcast chain. It adheres to specific standards for each of these things: ST 2110-20/21 for transporting uncompressed video and shaping its traffic, ST 2110-30/31 for uncompressed audio, and ST 2110-40 for the mapping of ancillary data packets into Real-Time Transport Protocol (RTP) packets. It uses ST 2110-10 for system timing, and it relies on the companion NMOS specifications for discovery and management of signals.
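A concrete detail that distinguishes this approach: because every essence stream is locked to the same PTP-derived reference, the RTP timestamp on each packet is computed from shared wall-clock time rather than from a free-running local clock, which is what lets a receiver re-align the separate video and audio streams. The short Python sketch below illustrates that calculation under simplified assumptions (the common 90 kHz video and 48 kHz audio media clocks and a media clock offset of zero); it is a conceptual illustration, not an implementation of the standard.

```python
# A simplified sketch of how ST 2110 derives RTP timestamps from PTP time.
# Assumes the usual 90 kHz video / 48 kHz audio media clocks and a media
# clock offset of zero.

VIDEO_CLOCK_HZ = 90_000   # RTP media clock rate for ST 2110-20 video
AUDIO_CLOCK_HZ = 48_000   # RTP media clock rate for 48 kHz ST 2110-30 audio


def rtp_timestamp(ptp_time_seconds: float, media_clock_hz: int) -> int:
    """Map a PTP-derived wall-clock time onto a 32-bit RTP timestamp."""
    return int(ptp_time_seconds * media_clock_hz) % 2**32


# Because sender and receiver share the same PTP reference, a video packet
# and an audio packet captured at the same instant carry timestamps that
# refer to the same moment, enabling essence re-alignment downstream.
ptp_now = 1_700_000_000.0  # illustrative PTP time in seconds since the epoch
print(rtp_timestamp(ptp_now, VIDEO_CLOCK_HZ))
print(rtp_timestamp(ptp_now, AUDIO_CLOCK_HZ))
```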
SMPTE ST 2110 is focused on uncompressed video over IP, so it remains the approach of choice for broadcasters demanding the highest quality, and its format-agnostic design demands standards-based interoperability between different equipment manufacturers.
On the flip side, ST 2110 is complex and it requires a high-bandwidth network infrastructure to operate. And in 2020, when world events forced broadcasters into rethinking their traditional workflows, NDI's simplicity, capacity for straightforward remote operation and lower network bandwidth requirements (thanks to compression) brought it onto broadcasters’ radar.
Growth In Broadcast
Although early adoption was driven by individual users experimenting with NDI, its suitability for remote workflows during the pandemic hugely accelerated interest from broadcasters as they scrambled to get live content to air in hastily constructed distributed production environments. Because NDI had been designed to work across a local area network, it proved a good fit for cloud-enabled workflows, and while broadcasters are still not adopting NDI as core infrastructure for widespread tier one programming, it is reportedly in regular use for live contribution feeds – especially in news.
One of NDI’s strengths in distributed environments is its range of compression options. Compression is all about trade-offs, and the broadcast industry often seems to operate between extremes of video streaming bandwidth. Where uncompressed IP video like ST 2110 requires a huge amount of bandwidth, compressed video using codecs like H.264 and HEVC (often carried over transport protocols such as SRT) can be employed over more congested paths, like the public internet. NDI works with both of these compression codecs as well as a proprietary codec of its own to accommodate varying bandwidth capabilities.
Codecs For Every Eventuality
NDI Full Bandwidth (released in 2016) uses NDI’s proprietary SpeedHQ codec, which is based on MPEG-2 techniques and delivers visually lossless video transmission. It uses around 130 Mbps of bandwidth at 1080p60, and its inevitable trade-off is the need for a high-speed network.
More suited to remote and distributed broadcast workflows, NDI HX2 (released 2019) and HX3 (released 2022) follow on from NDI HX (2017) and use the AVC (H.264) and High Efficiency Video Coding (HEVC/H.265) codecs to preserve image quality while using minimal network bandwidth. HX3 delivers better performance than HX2, taking up around 62 Mbps (H.264) or 50 Mbps (HEVC) of bandwidth at 1080p60.
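To put those figures in context, the back-of-envelope Python sketch below compares the active-picture bandwidth of an uncompressed 10-bit 4:2:2 1080p60 signal with the bitrates quoted above. The uncompressed figure ignores blanking and transport overheads, so the resulting ratios are indicative rather than exact.

```python
# Rough comparison of uncompressed 1080p60 10-bit 4:2:2 active-picture
# bandwidth against the NDI bitrates quoted above. RTP/UDP/IP overheads
# and blanking are ignored, so treat the ratios as indicative only.

WIDTH, HEIGHT, FPS = 1920, 1080, 60
BITS_PER_PIXEL = 20  # 10-bit 4:2:2: 10 bits luma + 10 bits shared chroma per pixel

uncompressed_mbps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL / 1_000_000

quoted_bitrates_mbps = {
    "NDI Full Bandwidth (SpeedHQ)": 130,
    "NDI HX3 (H.264)": 62,
    "NDI HX3 (HEVC)": 50,
}

print(f"Uncompressed 1080p60 (active picture): ~{uncompressed_mbps:.0f} Mbps")
for name, mbps in quoted_bitrates_mbps.items():
    print(f"{name}: {mbps} Mbps (~{uncompressed_mbps / mbps:.0f}:1 vs uncompressed)")
```

Run as written, this shows roughly 2,500 Mbps for the uncompressed picture, around 19:1 for Full Bandwidth and 40:1 to 50:1 for HX3, which is why the HX family fits over ordinary gigabit networks and internet links where uncompressed ST 2110 cannot.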
HEVC works well for NDI. It was developed to enhance the capabilities of previous coding standards like H.264, particularly for higher resolutions such as 4K and 8K video content. It can achieve significant reductions in bitrate while maintaining the perceived quality of the original video, and its highest levels support resolutions up to 8K.
This illustrates another side effect of the paradigm shift demanded by the pandemic. The accelerated adoption of remote production methodologies ultimately required compression, and although broadcasters may initially have turned to NDI, they now have the option to use a range of different compression technologies in conjunction with ST 2110, and as things stand standards still prevail in core infrastructure.
Interoperability & Backwards Compatibility
One of the key concepts of NDI has been simple interoperability – the promise that all NDI-enabled devices will connect and work together. But as the broadcast industry knows all too well, backwards compatibility becomes harder to sustain as the years roll by; it is one of the universal truths of broadcast.
While NDI has been careful to ensure that HX2 and HX3 are interoperable with each other, and that new versions of NDI can be backwards compatible with previous versions like NDI HX, the decision about which versions to support rests with the wider ecosystem. Individual vendors are free to choose whether to implement backwards compatibility, and that choice always carries the possibility of leaving some older gear stranded. This, once again, is a reminder that NDI is not a standard. Standards take much more development time and effort, but with them interoperability is built in rather than left to vendor discretion.
Tools and Future Development
In less than ten years NDI has gone from being a method of increasing input capacity for a specific product, to a technology that is used across multiple content channels, from bedroom streaming to international news broadcasting.
In the next two articles we’ll look at how NDI’s expansive tool kit aims to drive further adoption with a range of monitoring, routing, camera, and test applications. We will also look at NDI Bridge, an ongoing development that allows devices on different networks to exchange data and video streams as if they were on the same local network.
We’ll also look at how NDI Audio aims to bring the same agnostic and straightforward approach to audio networks, transporting multiple audio channels in a single NDI stream.