Acquisition Global Viewpoint – December 2018

IP Production in Fly-Over Country

The FCC’s new goal of 25 Mbps ubiquity in rural areas will open a new world of live field production and distribution opportunities to TV facilities and producers using IP camera connections and live internet streaming.

Private IP video transport is becoming increasingly popular for camera video and audio connections at live video productions. Inside a private IP network, artifacts in wired or wireless video transport are either invisible or impossible to ignore. We avoid the latter by maintaining complete control of the network.

On the other hand, multi-gigabit internet fiber is commonly found in major-league sports venues and most big cities. Permanent, reliable, on-site, high-bandwidth access makes it simple to transport individual camera feeds back to a central studio for live production. The central studio control room crew can handle everything from operating PTZ cameras to directing, switching, live graphics, replays and audio mixing, through distribution by streaming and uplink.

The upside to central production is that everyone on the studio crew goes home every night, which makes it easier on crews and budgets. The downside is that adequate bandwidth can be expensive, and it's far from universally available, yet. The alternative of hours of bonded cellular streaming can be more expensive than a profitable project can bear.

There are many live events capable of monetization with live-TV production that don't have millions of fans and aren't near big cities. Many occur in places where the local bandwidth can be marginally adequate for HD video during showtime. I happen to engineer and direct live TV coverage of offshore powerboat racing. What we've learned is that shorelines are the end of the line for wired and wireless internet connections. More than once, we've streamed to the internet through a TV uplink truck feeding a TV studio that could turn our satellite feed around to our LiveStream and YouTube streams through their provider, because our local ISP couldn't do it reliably.

Fly-Over World IP

Most rural venues depend on service from local ISPs such as cable TV and telcos. Expecting multi-gigabit service at a less populated location might prove a bit hopeful. A one-time, IP-based production can ultimately depend on the work of a local installer who may never have been asked to configure the bandwidth for an 80% upload, 20% download split before. The primary bandwidth bottleneck for internet streaming is typically the copper distance and traffic to the nearest fiber junction.
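
To put those numbers in perspective, here is a back-of-the-envelope sketch of how many camera streams a rural uplink can realistically carry. The provisioned bandwidth, split, safety margin and per-stream bitrate below are hypothetical assumptions, not figures from any particular site:

```python
# Back-of-the-envelope uplink budget for IP camera backhaul.
# All figures are illustrative assumptions, not measured values.

PROVISIONED_MBPS = 50    # hypothetical total bandwidth from the local ISP
UPLOAD_SHARE = 0.80      # the 80% upload / 20% download split requested
SAFETY_MARGIN = 0.70     # only count on ~70% of the link during showtime
STREAM_MBPS = 10         # per-camera encoder bitrate used in this example

usable_upload = PROVISIONED_MBPS * UPLOAD_SHARE * SAFETY_MARGIN
max_streams = int(usable_upload // STREAM_MBPS)

print(f"Usable upload: {usable_upload:.1f} Mbps")
print(f"10 Mbps camera streams that fit: {max_streams}")
```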

Inside our usually rural, on-site control room, everything is on our own private local network and under our control. That was the idea, but a recent world championship race demonstrated it wasn't always the case. During set-up and test, my iPhone made itself the first problem.

In less time than it took to reboot the production switcher, my iPhone ditched its previous IP address and grabbed the switcher's unused static address on our private network. Let the troubleshooting games begin.

That’s Odd

The symptoms of some network issues can only be described as weird, and they don't always appear in the A/V. The switcher, for example, didn't have to connect to other IP devices when it rebooted. It seemed to work okay at first, but strange new quirks began to appear and multiply. Other devices trying to shake hands with the video mixer also began acting unusually. It was a cascade of intermittent and new production issues.

Most video decoding issues are obvious on a display screen. More subtle issues such as IP address conflicts can wreak all kinds of quirky micro-havocs across operating systems and running programs. A micro-havoc would be an event such as suddenly missing files, odd program behavior, an intermittent network connection, or a combination of all three. The more live TV productions I do, the odder device and software behavior problems seem to become.

Most computer problems are resolved with a reboot, but some computers get even more confused by a reboot. That symptom can often be traced to an IP address conflict. We thought we had resolved our IP issues by reserving the static IP addresses of all our gear in our modem's settings. The problem was that at this particular production we used the local ISP's modem, and we didn't set it to reserve the static addresses of our production gear. I won't get into how long it took to figure this out, but we fixed it a couple of days before the event.

Lesson learned: Set all TV gear to static IP addresses and reserve all of those addresses in the modem or router before it is connected to the private network.
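
That kind of pre-show sanity check can also be scripted. The sketch below is a minimal example in Python, assuming a hypothetical address plan and DHCP pool; it simply flags gear whose static address collides with another device's or falls inside the pool a visiting phone could draw from:

```python
# Minimal pre-show address-plan check: every piece of gear gets a unique
# static address, and none of them sit inside the router's DHCP pool.
# The addresses and pool range below are hypothetical examples.
import ipaddress

DHCP_POOL_START = ipaddress.ip_address("192.168.1.100")
DHCP_POOL_END = ipaddress.ip_address("192.168.1.199")

gear = {
    "production_switcher": "192.168.1.20",
    "replay_server": "192.168.1.21",
    "graphics_pc": "192.168.1.22",
    "audio_console": "192.168.1.120",  # deliberately wrong: inside the pool
}

seen = {}
for name, addr in gear.items():
    ip = ipaddress.ip_address(addr)
    if addr in seen:
        print(f"CONFLICT: {name} and {seen[addr]} share {addr}")
    seen[addr] = name
    if DHCP_POOL_START <= ip <= DHCP_POOL_END:
        print(f"WARNING: {name} ({addr}) sits inside the DHCP pool; "
              f"a phone could grab it while the device is rebooting")
```

Run against the real address plan before the router goes online, a check like this would have flagged our ISP-modem problem days earlier.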

Some other odd surprises also popped up during rehearsals. One was frame loss from remote cameras feeding 10 Mbps streams connected by 5 GHz Wi-Fi. Our private network operated through a GigE switch, but a temporary 10/100 switch installed on a shared Wi-Fi network antenna turned out to be the culprit. The 10/100 switch also seemed to contribute to another new issue, best described as a network hiccup whenever a remote camera and encoder was turned on or off. The glitch made strange things happen across the network. Replacing the 10/100 switch with a GigE switch fixed the issues.
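
A rough headroom calculation shows why a 10/100 switch is a poor fit for that kind of traffic. The stream count, burst factor and overhead below are illustrative assumptions, not measurements from our network:

```python
# Rough check of whether a 100 Mbps switch port has headroom for the
# camera backhaul. All numbers are assumptions for illustration only.

PORT_MBPS = 100        # 10/100 switch port
CAMERA_STREAMS = 6     # hypothetical number of 10 Mbps camera feeds
NOMINAL_MBPS = 10
BURST_FACTOR = 1.5     # VBR encoders can peak well above their nominal rate
OVERHEAD = 1.05        # rough allowance for IP/UDP/RTP framing

peak_load = CAMERA_STREAMS * NOMINAL_MBPS * BURST_FACTOR * OVERHEAD
print(f"Estimated peak load: {peak_load:.0f} Mbps on a {PORT_MBPS} Mbps port")
if peak_load > 0.8 * PORT_MBPS:
    print("Little or no headroom left: expect frame loss; use a GigE switch")
```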

Wi-Fi camera backhaul is another challenge that changes with the territory. Near the ocean, there is no Wi-Fi on the water and congestion in that direction is low. On a river or lake, there are often businesses, houses and condos on the other side using Wi-Fi channels for their internet-of-things devices. A spectrum analyzer can only tell you how close you are to the cliff at that moment. It's amazing to watch how many new devices appear online over the weekend of a major event.
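
Short of a spectrum analyzer, even a quick snapshot of visible access points per channel is worth taking before and during the event. The sketch below assumes a Linux laptop running NetworkManager and uses nmcli; it only counts the beacons it can see, so treat it as a crude congestion proxy, not a spectrum analysis:

```python
# Count visible Wi-Fi networks per channel as a rough congestion snapshot.
# Assumes a Linux machine with NetworkManager (nmcli) available.
import subprocess
from collections import Counter

out = subprocess.run(
    ["nmcli", "-t", "-f", "CHAN,SIGNAL,SSID", "dev", "wifi", "list"],
    capture_output=True, text=True, check=True,
).stdout

per_channel = Counter()
for line in out.splitlines():
    if not line:
        continue
    chan, _signal, _ssid = line.split(":", 2)
    if chan.isdigit():
        per_channel[int(chan)] += 1

for chan, count in sorted(per_channel.items()):
    print(f"channel {chan:>3}: {count} visible networks")
```

Running it a few times over an event weekend makes the crowding trend obvious.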

Risky Sharing

At one-time events attracting tens of thousands of people, Wi-Fi channels used for camera backhaul and communications can become as congested and unreliable as local cellphone service. Cable and telco nodes can get overloaded. Cell carriers can be asked to bring in supplemental cellphone capacity, because on the day of an unusually large event cellphones can't always be counted on. There are no such solutions for Wi-Fi congestion.

On event day, I've seen cellphone text messages take more than five minutes to arrive from a person only 100 feet away. All wireless systems, including the crew communication system, should be licensed and private. Avoid depending on public networks, RF bands or cell phones. Productions don't work without adequate communications.

When 25 Mbps internet becomes ubiquitous across the most isolated parts of the boondocks, there will be no reason not to go fully IP. Fortunately, the government is preparing to help get us there.

FCC Chairman Ajit Pai.

New FCC Order?

One significant aspect of a 125-page order the FCC will consider in December is an increase in the minimum rural broadband speed to 25 Mbps. The same order includes additional Universal Service Fund (USF) dollars for rural carriers using the Alternative Connect America Cost Model (A-CAM), as well as for traditional carriers. Carriers accepting FCC offers to receive funding based on A-CAM must meet specific rural broadband buildout schedules that don't apply to traditional carriers.

This past July, the FCC's Wireline Competition Bureau (WCB) established a uniform framework for testing the speed and latency performance (Network Performance Testing) of recipients of Connect America Fund (CAF) support who serve fixed locations. The new testing framework should improve QoS and QoE for urban and rural users alike.
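
This is not the FCC's prescribed test methodology, but the basic measurement is easy to picture. The sketch below simply samples round-trip latency from a fixed location, using TCP connect time to an assumed public DNS endpoint so it runs without special privileges:

```python
# Crude latency sampler: measures TCP connect time to a public endpoint
# as a stand-in for round-trip latency. Endpoint and sample count are
# arbitrary assumptions for illustration.
import socket
import statistics
import time

HOST, PORT = "8.8.8.8", 53   # assumed reachable public DNS server
SAMPLES = 10

rtts = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=2):
        pass
    rtts.append((time.perf_counter() - start) * 1000)
    time.sleep(0.5)

print(f"median connect time: {statistics.median(rtts):.1f} ms "
      f"(min {min(rtts):.1f}, max {max(rtts):.1f})")
```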

Regarding the December proposal, FCC Chairman Ajit Pai recently said the FCC wants to increase the “target speeds for subsidized deployments” from 10/1 Mbps to 25/3 Mbps. That’s an ambitious and promising view of the future, but even subsidized buildouts take time.
