Now that we have resolved the challenge of professional media over managed networks – aka JT-NM, aka SMPTE ST2110 + NMOS – what about all the field production, i.e. news, UGC, etc., that is beginning to use the wild and unruly open Internet to send content back to the home office? Sending files back offers several options: more than a few cloud services and accelerator solutions, plus VPNs and extended networks.
Live production direct to stream over unreliable networks – aka the Internet – has been happening for a while, and there are a few products in the market. In addition, bonded cellular technologies that use proprietary encoding techniques across multiple carriers have been around for a bit, with hopes that 5G will vastly improve their performance.
But there are no standards for live streams or essence over an open Internet connection, whether wired or wireless.
The streamers have used a variety of formats – HTML5, ON2, MPEG-4/H.264 and H.265. However, these are more typically used to deliver content to a website or social platform. On the professional side, some solutions are loosely based on one standard or another – SMPTE 2022-1 & -7, MPEG-2/TS, or IETF RFC 3550 & 3551. But interoperability has not been a focus until now. And it's certainly possible that this new challenge was brought on by field production, news gathering (ENG/SNG) and UGC (User Generated Content), as they appear more and more in production.
Introducing the next competition in formats and standards: RIST (Reliable Internet Stream Transport) vs. NDI (Network Device Interface). The first thing to notice is that this is a four-letter acronym competing with a three-letter acronym. But that's not all of it! Where RIST is focused on transport over the Internet, NDI wants to compete with the whole JT-NM concept of in-facility IP – and extend it to remote and field production, all within the NDI ecosystem. So three might be larger than four in this instance.
Network Device Interface – NDI – was developed by NewTek and is free for anyone interested in adopting it under a license agreement. It is included in all NewTek products, and because of their significant installed base, a number of vendors have been embedding NDI encoders and decoders in their products. The first question you might ask is: what about SMPTE ST2110? And that's a great question. NDI is a compressed format, while ST2110 is largely a set of uncompressed formats. Where ST2110 maintains a separate essence stream for each component, NDI multiplexes and highly compresses audio, video and metadata into a single 100Mb/s stream. Maybe there will be NDI/ST2110 and ST2110/NDI products around the corner at NAB 2020. But I digress. At 100Mb/s, NDI can travel on pretty much any network, including the open Internet, and there are a number of transport products using NDI as the input and output. At the output end, if there is no NDI device then an adapter is required, and if the NDI stream needs to end up as a file then an encoder is required.
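To put that 100Mb/s figure in context, a rough back-of-envelope calculation shows the kind of compression NDI applies versus the uncompressed video ST2110 would carry. The sketch below assumes 1080p59.94 with 4:2:2 10-bit sampling, a common payload; the exact numbers will vary with format:

```python
# Rough bandwidth comparison: uncompressed 1080p video (as ST 2110-20
# would carry it) vs. a nominal 100 Mb/s multiplexed NDI stream.
# Assumes 1920x1080 at 59.94 fps, 4:2:2 sampling, 10 bits per component
# (10 bits luma + 10 bits chroma average = 20 bits per pixel).

width, height, fps = 1920, 1080, 59.94
bits_per_pixel = 20

uncompressed_bps = width * height * fps * bits_per_pixel
ndi_bps = 100e6  # NDI's nominal stream rate for audio + video + metadata

print(f"Uncompressed: {uncompressed_bps / 1e9:.2f} Gb/s")
print(f"NDI:          {ndi_bps / 1e6:.0f} Mb/s")
print(f"Compression:  ~{uncompressed_bps / ndi_bps:.0f}:1")
```

Roughly a 25:1 reduction for this format, which is why NDI fits on ordinary gigabit networks where uncompressed ST2110 needs an engineered media fabric.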
Reliable Internet Stream Transport – RIST – is pretty much what it says. The published document VSF Technical Recommendation TR-06 states: “This Technical Recommendation contains a protocol specification for reliable streaming over the Internet, so end users can mix and match solutions from different vendors.” It also states there will be multiple RIST profiles. The RIST protocols are largely based on existing SMPTE and IETF published standards and recommendations. RIST doesn't specify which codec to use – it's not a format, it's a transport protocol. The connection between sender and receiver needs to be established prior to transmission. That opens a series of questions about where RIST is embedded, or whether it runs on its own server/VM and is injected into the chain.
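Because RIST builds on RTP (IETF RFC 3550) over UDP, with lost packets recovered via retransmission requests, every media packet it carries starts with the standard 12-byte RTP fixed header. As a hedged illustration of that building block – this is the generic RFC 3550 header layout, not the RIST specification itself, and the function name and values are for demonstration only:

```python
import struct

def rtp_header(seq: int, timestamp: int, ssrc: int,
               payload_type: int = 33) -> bytes:
    """Build the 12-byte RTP fixed header defined in RFC 3550.
    Payload type 33 is MPEG-2 Transport Stream, the payload
    typically carried in contribution workflows."""
    byte0 = 0x80                 # version 2, no padding, no extension, CC=0
    byte1 = payload_type & 0x7F  # marker bit clear
    return struct.pack("!BBHII", byte0, byte1, seq & 0xFFFF,
                       timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)

# Example: first packet of a stream with a 90 kHz media clock.
hdr = rtp_header(seq=1, timestamp=90000, ssrc=0x1234ABCD)
print(len(hdr), hdr.hex())
```

The sequence number in that header is what lets a receiver detect a gap and ask the sender to retransmit the missing packet – the mechanism at the heart of RIST's reliability over lossy Internet paths.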
Using either protocol requires a programmer to develop something based on the NDI SDK or on the published RIST Simple Profile, currently the only one available. So now the question becomes: which of the numerous products available for Internet delivery is the best one, and are they reliable? Or is the real problem simply that there's no standard?
One of the more confusing elements of the IP transition is a few inconsistencies. First, the JT-NM Roadmap only covers “in facility” transport. That covers a lot of ground, sort of. Mobile production providers (trucks/OB vans) are technically their own facility, so they can use ST2110 and NMOS. Then there's the issue of getting live content back to the studio, and ST2110 isn't for that either. So we have an intermediate transport challenge. Even two ST2110 facilities cannot use ST2110 for the connection between them: beyond the timing and sync issues between locations, the essence doesn't transport outside of the managed core network. Using one of the managed mesh network providers (i.e. LTN, L3 and The Switch) keeps the IP connection, but it doesn't honor ST2110 or NMOS; historically, J2K has been the format of choice for this transport connection. Meanwhile, field production, i.e. news and smaller single- or multi-camera productions, is not well suited to ST2110 and NMOS. This may be the place for other specifications like RIST and NDI, but it still leaves the question of interoperability with ST2110 when the stream lands back at the facility.
One of the topics consistently left out of all these discussions is who is programming or developing the application based on RIST, NDI or even NMOS. These are not off-the-shelf, plug-and-play applications. So where in the media supply chain does each of these fit? Are they independent APIs or embedded code in an existing application? Do they need their own server or VM? Are they embedded by a vendor and then manually configured by the end user?
As 5G gains momentum, the use of the open Internet for backhaul, contribution and live broadcast will increase. We certainly need an interoperable standard or specification that everyone can adopt with some confidence it will be around for a while.