Creative Technology - HPA Review: Remote Operation
For decades, a television studio’s production team has been no further from the action than a cable can comfortably be run.
For some time now, though, systems such as JVC's Connected Cam have made it possible to put the gallery a short-haul plane flight from the cameras, albeit at the cost of a little compression. Meanwhile, the sort of productions that send trucks across time zones in search of no-compromise results tend to demand low-latency, uncompressed pictures, and while that creates challenges of its own, distributed production is growing in popularity there, too.
When the subject of distributed production came up at the Hollywood Professional Association's 2022 Tech Retreat, some of the biggest names in outside broadcast were there to discuss ways to make it work even at the most demanding levels of broadcast production. The panel included Scott Rothenberg, senior vice president of program and planning at NEP, Phil Garvin, founder and CEO of Mobile TV Group, and Tony Cole of NFL Media in a session run by Mark Chiolis in his role as co-chair of the Tech Retreat's TR-X committee.
"NEP has led distributed production, especially in Europe," Chiolis says. "I think their first one was in Holland; they put a couple of centralised facilities there for all the local productions. They'd send a truck out, get the cameras set up, and all the production would come back to a centralised facility. They've done the same thing in Australia and a few other places." The attraction of doing this, we learn, is that travel time and costs collapse to a fraction of what they once were.
That's possible, Chiolis goes on, simply because of the time economy of having staff work at a central location, with recent world events also a motivating factor. "Instead of having to travel in and travel out, people can be doing projects on what would have been travel days. Potentially they could do two shows a day if it makes sense and the timing is right. It's making better use of people and saving money, but also, during Covid it was a huge thing. People didn't want to travel as much; it was much harder to travel and took more time to do it, and once you got on site there were issues of spacing people out and wearing masks in the mobile units. Sometimes they were bringing on an additional mobile unit just to have space for people to spread out."
Mobile TV Group refers to its approach to distributed production as Cloud Control, an arrangement by which a team can effectively remote-control an otherwise standard production truck from any location with a reliable internet link. Crucially, the final image doesn't go over the internet: the only material that's networked is the multiviewer feeds, talkback and other control services. "You're still sending the truck out to the event," Chiolis confirms. "Instead of sending the camera feeds back, you send multiviewers. You've got full communications, all your video, your control, graphics, EVS, it's like you're sitting in the truck, but you're in a centralised facility."
Chiolis is keen to emphasise that the final production image is not compromised by this approach, with the final material handled as it always would be. "A lot of these configurations are not sending any picture back other than a multiviewer feed," he says. "The cameras are uncompressed just like they would be because the trucks are at the venue. You're just not sending all the people, you're viewing the images with a sub-100-millisecond latency using private networks to keep the latency down. So, it's not much different than operating from a mobile unit."
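That sub-100-millisecond figure is easiest to understand as a round-trip budget: the multiviewer has to be encoded, carried over the private network, decoded and displayed, and the operator's control action has to travel back to the truck. The sketch below totals a set of hypothetical per-stage delays against that budget; all of the individual figures are illustrative assumptions, not measurements from Mobile TV Group's systems.

```python
# Illustrative latency budget for a remote-controlled truck workflow.
# Per-stage figures are assumptions for the sketch, not measured values.

BUDGET_MS = 100  # the sub-100 ms target the article quotes

# Hypothetical delays (milliseconds) for each stage of the loop
stages = {
    "multiviewer encode": 20,
    "private-network transit": 25,   # venue to control room, one way
    "decode and display": 25,
    "control/talkback return": 25,   # operator action back to the truck
}

def total_latency(stages):
    """Sum the per-stage delays into one round-trip figure."""
    return sum(stages.values())

if __name__ == "__main__":
    total = total_latency(stages)
    print(f"round trip: {total} ms, headroom: {BUDGET_MS - total} ms")
```

The point of laying it out this way is that the private network's transit time is only one term: shaving milliseconds off encode and decode matters just as much, which is why dedicated lines and low-latency codecs tend to travel together in these setups.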
Precisely because the remote control facility handles no final production material, the hardware install is comparatively inexpensive. "They can be built relatively cheaply because you've got no hardware other than a control panel, comms gear, consumer flat-panel monitor, and an IP connection. You can build those in a conference room or a large office, and again you can utilise your best people for additional time because you're not travelling them."
While much of the usual crew complement can remain back at base, people concerned with primary quality control, and perhaps a couple of others, must still travel. "The vision engineers go with the truck," Chiolis says. "The reason you want to travel those is if for some reason you lose your connection you still have people there on site who are doing the broadcast. If you lose comms with the director, most of the time you have someone there who can cut the show. So, instead of travelling between 10 and 20 people with the truck you're travelling three."
New facilities are being built with this approach very much in mind, particularly in US sports broadcasting, where stadia often have elaborate installations and home and away teams might produce separate coverage. Tony Cole's recent appointment at NFL Media is, Chiolis says, connected very much to the new NFL Los Angeles facility adjacent to the SoFi Stadium in Inglewood. Chiolis describes it as "a brand new NFL media building, creating the opportunity to move from the traditional workflow to an IP one. Tony was very excited about that. He was talking about how the building was done, how it's interconnected to all the teams and how they share all their resources. For the recent Super Bowl here they did the half-time show out of their control room instead of using a truck."
With fibre-based internet connections for home users now often approaching a gigabit of bandwidth, the few hundred megabits required to make this sort of distributed production possible might not seem like much of a concern. Although the broadcast television industry has rather different expectations about reliability than the average PlayStation enthusiast, Chiolis describes a situation in which venues are increasingly equipped with sufficiently performant network infrastructure as a matter of course.
"Most of the venues that you're going to find for large sports or entertainment events will have those connections," he says, "and now you're even seeing connections with 5G providers be able to do that on a wireless link. I've seen some testing with wireless, but for the most part people are using cables, with a dedicated line to make sure there are no surprises. They're reasonably priced now at those sorts of bitrates. Phil had talked about the Mobile TV Group Cloud Control projects where three hundred megabits covers your multiviewers and control and they were maintaining sub hundred-millisecond latency."
Often, the distributed production workflow will not be the only load on the network, with streaming services increasingly part of the equation. None of this, Chiolis says, need displace any of the more traditional approaches. "The other thing people have talked about is that they're potentially streaming, and also doing an in-camera recording for post after the fact, or they're doing a full bandwidth line cut in the flypack or mobile unit and they'll distribute it later at a higher bandwidth."
News of this technology, and the huge savings potentially associated with it, might suggest a gloomy future for purveyors of the kind of satellite links which have long been used to facilitate broadcasts just like these. For now, though, that three-hundred-megabit stream shows no immediate signs of growing to replace a traditional uplink. Quite the opposite; Chiolis suggests that an abundance of caution sometimes pushes users toward a belt-and-braces approach.
"For satellite links, I think it depends where the end product is going," he explains. "Satellite time is expensive, but so is fibre time if you're buying that. In the mobile industry in general and in the live production industry, for big projects and even regular projects you may still see a satellite there with a backup fiber link. Depending upon what kind of money and revenue is involved it may be worth it to have those if they're already in place at the venue and all you have to do is fire them up."
As with so many innovations in film and TV, network-based distributed production will benefit from improvements in general computing that aren't specifically targeted at the sector. Progressively faster networks will only make the whole approach work better, and the significant cost savings of crew time and travel expenses are likely to tempt more and more production companies. Whether missing out on travel might be a downside for some people, particularly those at the beginning of their careers, is a matter of opinion. "Some of them like it and some of them don't," Chiolis reflects. "As the crews get older, they like sleeping in their own bed, but at least in my opinion it's nice to get out once in a while. Still, I don't want to be on the road 250 days a year."