Field Report: The day we used the public internet for backhaul on a budget.

Public internet-based remote TV production adds new critical monitoring points to avoid digital cliffs.

Over the first 75% of live TV and remote TV production history, broadcast gear was connected by coax and audio cables. Backhaul was via 2 GHz BAS microwave or satellite. Before electronic menus, engineers carried shirt-pocket screwdrivers, and interfacing analog remote broadcast production gear was the original plug-and-play. The connection simplicity gave engineers the time to focus on optimizing technical quality.

IP connections at remote production locations have become the new engineering priority because each one represents a digital cliff that could become a lost source. The time once spent monitoring and tweaking source elements and settings for the best possible picture and sound has become the other remote IP video latency: on-location engineers are now distracted by keeping all IP sources and destinations on-line and talking with each other.

The Digital Century began with an awesome paradigm shift for TV viewing, broadcasting, and live remote production, starting with the transition from analog SDTV to DTV and HDTV. Since Y2K, TV production tool access and convenience, data speeds and image quality have exploded while necessary equipment costs have virtually collapsed.

Except for the differences between professional HDTV broadcast lenses and built-in smartphone lenses, today’s common imaging, storage and display pixels and bits are at a level high above anything ever seen at a pre-DTV NAB Show. In the meantime, average TV viewers are watching perfect pictures on HD and UHD flat-screens of all sizes.

What is RTMP?
When H.264/AVC video transport and bonded high-speed wireless internet data connectivity became suitable for broadcast video use in about 2007, the new combination launched Dejero, LiveU, and TVU Networks into the professional bonded cellular streaming video business, before Apple released HTTP Live Streaming in 2009.

Digital progress and a huge push from the 2020 pandemic suddenly made streaming video over the public internet the new foundation of business meetings and local TV reporting. Nearly all streaming video requires a server for viewer control of the individual stream. Video over IP isn’t as simple as an Apple FaceTime call, but it can come close with the right software.

Real-time Transport Protocol (RTP) was first published in 1996 to carry audio and video for video telephones and video teleconferencing systems. Real Time Streaming Protocol (RTSP), published in 1998, was designed to control streaming media server sessions between endpoints, giving the user control of play, pause and record. Most RTSP servers use RTP in conjunction with Real-time Control Protocol (RTCP) for media stream delivery.
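
To make that division of labor concrete, the short Python sketch below pulls frames from an RTSP source using OpenCV, whose FFmpeg backend handles the RTSP session control and the underlying RTP/RTCP media delivery. The camera address and stream path are placeholders, not anything from our production.

    import cv2  # OpenCV's FFmpeg backend speaks RTSP for session control, RTP/RTCP for the media

    # Hypothetical RTSP source; substitute the address of your own camera or encoder.
    RTSP_URL = "rtsp://192.168.1.50:554/stream1"

    cap = cv2.VideoCapture(RTSP_URL)
    if not cap.isOpened():
        raise RuntimeError("Could not open the RTSP session")

    while True:
        ok, frame = cap.read()              # one decoded video frame per read
        if not ok:
            break                           # stream dropped or ended
        cv2.imshow("RTSP monitor", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()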

Real-time Messaging Protocol (RTMP) began in the early 2000s as a TCP-based protocol developed by Macromedia for low-latency streaming of audio, video and data between a Flash player and a server. Adobe later released an incomplete version of the RTMP specification for public use, and the protocol has become well-accepted. Unlike RTP, RTMP maintains a persistent TCP connection designed for end-to-end, real-time transfer of streaming media. RTMP, or RTMP over a Transport Layer Security connection (RTMPS), is what most bonded cellular and common internet platforms, including YouTube and Vimeo, typically use.
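
On the contribution side, pushing a feed to an RTMP or RTMPS ingest point is typically a one-line encoder job. The Python sketch below simply wraps an FFmpeg command; the ingest URL and stream key are placeholders for whatever your platform or distributor issues.

    import subprocess

    # Placeholder ingest point and key; YouTube, Vimeo and most distributors issue their own.
    INGEST_URL = "rtmps://live.example.com/app"
    STREAM_KEY = "your-stream-key"

    # FFmpeg reads the source at its native rate (-re), encodes H.264/AAC and wraps it
    # in FLV, the container RTMP expects.
    subprocess.run([
        "ffmpeg", "-re", "-i", "program_feed.mp4",
        "-c:v", "libx264", "-preset", "veryfast", "-b:v", "4500k",
        "-c:a", "aac", "-b:a", "160k",
        "-f", "flv", f"{INGEST_URL}/{STREAM_KEY}",
    ], check=True)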

This YouTube stream capture was a bit over-compressed. The 1080i LiveU Solo and satellite feeds were studio quality.

Streaming without a Server?
There are many excellent streaming encoders and decoders available for premium prices that work well. There are also several PC- and Mac-based methods to backhaul a signal from one local ISP modem to another, anywhere across the internet.

I recently engineered the LakeRace 2021 broadcast, six hours of live offshore powerboat racing coverage carried on nearly 100 TV stations and 300 cable stations, distributed via satellite and the internet. We wanted to send a solid RTMP stream from our studio control room over the public internet to our IP distributor. For that, we rented a LiveU Solo from New Jersey’s Bergen A/V, which set up the connection between the Solo and the stream distributor’s RTMP server before shipping it to us. We plugged the Solo directly into our ISP's modem/router. It performed perfectly out of the box, connected to the RTMP server with the touch of one button, and remained rock solid.

We also wanted to backhaul a remote field camera source across the internet to our temporary studio to mix with other field cameras connected by fiber or Ubiquiti Wi-Fi. Backhauling a field camera from a private modem connection across the internet without a significant delay turned out to be impossible with our gear on hand. The Solo only streams to an RTMP server, and we didn’t have another Solo or time to build and test our own RTMP server.

The NewTek Spark Plus 4K appeared to be the best internet backhaul option within our budget, and it was recommended to us by a couple of people. We bought a Spark Plus 4K and it tested perfectly on our LAN. The NDI 4 Tools and GUIs were powerful, but not powerful enough.

Moving Spark Plus video across the public internet with NDI 4 Tools, even within a single local ISP, turned into mission impossible. IT experts from two local ISPs worked together all day at our temporary studio to find a work-around or overcome the obstacles, but nobody could make it happen. We tried FortiClient to build a VPN. We tried other programs to manage and modify firewalls and ports in cable modems and PCs. We tried RTMP Tunneled (RTMPT) over HTTP. Nothing worked.
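
Much of that long day came down to one question: is a given TCP port actually reachable from outside the modem? A small Python check like the sketch below, run from a host outside the LAN against the public address, at least confirms whether a forwarding rule (RTMP's default port 1935, for example) made it through the firewall. The address shown is a documentation placeholder.

    import socket

    PUBLIC_IP = "203.0.113.10"   # placeholder public address of the modem/router under test
    PORT = 1935                  # default RTMP port; change to whatever was forwarded

    def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
        """Return True if a TCP connection to host:port succeeds within the timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(f"{PUBLIC_IP}:{PORT} reachable:", port_open(PUBLIC_IP, PORT))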

As we were hacking and hoping, NewTek announced NDI 5 Tools, which is said to have the ability to easily stream across the public internet. When our ISP IT experts heard that, they gave up. NDI 5 Tools is supposed to be released later this month. We’re on the list to test it, find out what works and write a story for The Broadcast Bridge about what we learn.

Potential Candidates
There are several methods to move high-quality video point-to-point across the internet. Most, such as Vimeo, YouTube, Facebook and Wowza, offer easy access but introduce too much delay or were beyond our budget. Having one of six on-course cameras delayed by more than about one second during 100 MPH+ racing action could confuse viewers paying attention.

One viable candidate was the Teradek Cube. It was a bit above our budget, but it would have worked. The Cube uses the Core cloud server to create an RTMP stream that can be displayed and output to HDMI with VLC Media Player. The Core cloud server has about a 300 ms delay.
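
On the display side, handing such a stream to VLC is about as simple as it gets. The sketch below launches it from Python with a placeholder playback URL standing in for whatever the cloud service publishes.

    import subprocess

    # Placeholder playback URL; substitute the address the cloud service publishes.
    STREAM_URL = "rtmp://core.example.com/live/program"

    # Launch VLC full screen; the machine's HDMI output can then feed a monitor or converter.
    subprocess.run(["vlc", "--fullscreen", STREAM_URL])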

The no-server candidate that was highly recommended was the Blackmagic Design ATEM Mini Pro live production switcher and ATEM Streaming Bridge. Unfortunately, we only had an ATEM Mini on hand, which wouldn’t work with the ATEM Streaming Bridge. An ATEM Mini Pro would have provided a solution.

The ATEM Streaming Bridge will only accept a stream from a Blackmagic Design direct streaming product. An ATEM Mini Pro, ATEM Mini Pro ISO, ATEM Mini Extreme, ATEM Mini Extreme ISO or Web Presenter HD must be on site at the point of origination to create the stream. At the receive site, the Streaming Bridge can operate while connected directly to the internet via Ethernet. Once the Streaming Bridge is set up, it does not need a PC or software to function. Simply connect its HDMI output to a monitor or other device.

The ATEM Streaming Bridge setup saves the settings and network address details at the streaming site to a .xml file that can be emailed to the receive site.

The ATEM Streaming Bridge connects to the computer with a USB-C cable. Settings are adjusted with the ATEM Setup application, the same utility used to set up ATEM switchers. The ATEM Streaming Bridge uses port-forwarding through the internet firewall to allow the ATEM Mini Pro to connect to it.

The ATEM Streaming Bridge setup can automatically find your public IP address and open port 1935 in your router. The Streaming Bridge settings can then be saved as a .xml file, emailed to the ATEM Mini Pro operator and loaded directly into the ATEM software controlling the ATEM Mini Pro. The .xml file contains all the data the ATEM Mini Pro needs to send its stream across the internet to the ATEM Streaming Bridge, which feeds it to its local HDMI output. Clearly, the Blackmagic ATEM system would have been the easiest to set up and use.
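
The exact schema of that settings file is Blackmagic's own, but the idea is straightforward: it carries the receive-site address, port and stream key the origination switcher should stream to. The Python sketch below parses a purely hypothetical file of that kind to illustrate the concept; the element names are invented, not Blackmagic's.

    import xml.etree.ElementTree as ET

    # Purely hypothetical settings file; the real Blackmagic schema differs.
    SETTINGS_XML = """
    <streaming_settings>
        <server>203.0.113.10</server>
        <port>1935</port>
        <key>program1</key>
    </streaming_settings>
    """

    root = ET.fromstring(SETTINGS_XML)
    server = root.findtext("server")
    port = root.findtext("port")
    key = root.findtext("key")

    print(f"Origination switcher would stream to rtmp://{server}:{port}/{key}")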

Works Fine Until Showtime
As the internet field camera backhaul plan fizzled, we were running out of set-up time and the show must go on. Fortunately, we had a Plan B: use Ubiquiti Wi-Fi microwaves to double-hop the condo-located Matrox Video encoder signal to the studio-located Matrox Video decoder. It worked, and no viewer ever knew that our exciting new field-camera internet backhaul plan failed to launch. That’s the magic of live TV production. The announcers kept the secret and acted like the show was going exactly as planned.

Don't depend on stable Wi-Fi at normally quiet venues that don’t usually host major one-time TV events. The morning of the event, tens of thousands of race fans with cell phones, TV reporters, photographers and other private and government organizations will show up and turn on every wireless device they brought. One serious Wi-Fi user that always sets up early on race-day morning is the local Emergency Management Communications truck. It streams Wi-Fi video from multiple temporary security camera locations to and from the truck, using substantial chunks of the 5.2 GHz Wi-Fi band. If something spectacular happens, local cellular data can get overloaded when everyone in the crowd tries to stream their video at once.

The morning of the event, the Wi-Fi band is loaded with new signals searching for and using the best channels, signals that have nothing to do with our broadcast but happen to be physically in line with our directional Wi-Fi antennas at the studio. The Ubiquiti airView spectrum analyzer made the amazing growth in Wi-Fi congestion easy to see.

I wouldn’t be surprised if Wi-Fi noise or unusual RF interference creating stray bits is the network gremlin that nearly always infects some data on our studio LAN, but only on the days of live powerboat racing broadcasts with huge crowds. Intermittent digital anomalies large and small always somehow get into our LAN, randomly affecting network audio and/or video data during every race. We never experience these anomalies during set-up and rehearsal days or inside the station's permanent TV studio. I’m thinking maybe it’s not live TV karma.

Wi-Fi issues, along with accessible high-speed wired connections at better camera locations, are driving our interest in wired internet backhaul. Wire and fiber are stable, reliable and private. Numerous less-expensive internet streaming systems were also recommended during our research, but there wasn’t time to load, learn and try anything more before the broadcast. Those systems include Open Broadcaster Software (OBS), vMix and VCam. We will field-review some of these amazing low-cost solutions and the new NDI 5 Tools as soon as possible after its release.
