The REMI control room and crew. Instead of racing to pack up gear when the show ended, we shut it down, turned off the lights and went home.
Remote Integration Model (REMI) production is more than remote cameras. It’s a new way of thinking and working. This tale of trying to implement a REMI production model within tight financial constraints highlights some of the operational challenges involved.
Much has been written and said about REMI production. As with much of remote TV production, how easily it can be produced and engineered typically depends on the budget. With enough money, nearly any TV production can be accomplished with a 53’ production trailer outfitted with multiple pro cameras, long lenses, and engineers and operators who do major productions all the time. The challenge is creating content that looks like it was produced that way for a tiny fraction of a major production budget.
The station I’ve been doing live TV broadcast work with for the past 15 years decided this year to move to REMI because it makes business sense. It’s fun to travel and work at interesting and exotic temporary venues, but crew travel and hardware shipping costs add up fast and can take a big bite out of profits. With REMI, most of the production crew can stay home.
Rolling Out REMI
In theory, all a REMI production needs are a permanent studio control room, gear to stream individual cameras via solid internet connections back to the permanent studio, a field producer, and good cameras with experienced operators. The variables are internet connections, image quality and camera operator experience. Live TV is not the time for a camera operator to exclaim “What’s that?” when asked to white balance or change a shutter speed.
All the necessary studio gear neatly fit into a well-ventilated former TV transmitter rack.
The station chose a recent local Political Candidate Forum to take REMI for a local test drive before committing to REMI for our usual out-of-town powerboat racing live TV production.
In preparation for REMI, the station converted a former cell phone equipment building at the base of a cell tower into a permanent TV production control room. The REMI studio control building is across a parking lot from the station’s Master Control room.
The first step was to install a desktop surface for control panels, mixers, keyboards and mice on two of its walls, and to hang the necessary monitors above them. The second was to build a rack to hold all the computers, routers, production switcher, routing switcher, H.264 decoders for the field cameras, and other necessary electronics. A USB hub for each device with a USB input was installed near its wall-mounted monitor. The third was to send an H.264 program stream across the parking lot to the TV station master control room feeding the transmitter STL.
The TV station building has GbE fiber internet and a GbE LAN, which provided ample bandwidth for three incoming streams and one video stream out. We hired the local cable company to provide private service to the meeting room where the Forum would be held.
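A quick back-of-the-envelope budget shows why GbE is more than enough headroom. The per-stream bitrates below are illustrative assumptions (the article only gives the 35 Mbps program stream figure later), not measured values:

```python
# Rough bandwidth budget for the REMI GbE LAN.
# Per-camera bitrate is an assumption for illustration.
LINK_CAPACITY_MBPS = 1000   # GbE fiber internet / LAN
CAMERA_STREAM_MBPS = 35     # assumed per-camera H.264 contribution stream
NUM_CAMERAS = 3             # three incoming field cameras
PROGRAM_OUT_MBPS = 35       # program stream back out

total_in = CAMERA_STREAM_MBPS * NUM_CAMERAS
utilization = (total_in + PROGRAM_OUT_MBPS) / LINK_CAPACITY_MBPS

print(f"Inbound: {total_in} Mbps, outbound: {PROGRAM_OUT_MBPS} Mbps")
print(f"Link utilization: {utilization:.0%}")  # 14% -- plenty of headroom
```

Even with generous contribution bitrates, four streams barely dent a gigabit link; the instability problems described below were never about raw capacity.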
Earlier this year, we tried streaming a show out of the same meeting room in the restaurant/entertainment complex using their in-house internet service and LAN. Its speed checked at about 100 Mbps down/10 Mbps up, but at showtime it became unstable and unusable. Fortunately, we had a cellular data backup to stream the show to the internet and the station from our control room table in the meeting room.
Across the Parking Lot
The cross-parking-lot program stream proved to be trouble. Streaming at 35 Mbps on our simple GbE LAN, our NDI Spark feed dropped or froze a couple of times each second, and the audio was unusable. We had recently retired our Matrox Video encoders and decoders and fortunately didn’t toss or sell them. We recommissioned a Monarch EDGE encoder to create the stream and put it on the LAN, then decoded it with Open Broadcaster Software (OBS) in the TV master control room across the parking lot. The Monarch EDGE-to-OBS stream was rock solid.
The Forum required seven live microphones for the candidates and moderators, but we didn’t have the capability to send seven channels of audio back to the studio. Instead, the field producer operated a separate on-site audio mixer and fed its mix to the line input of a camera. The production used three cameras, two locked down and one with an operator. A total of five production people were on site: the camera operator, the audio mixer/field producer, two talent, and a timekeeper with cue cards to show candidates a countdown to their time limit. The cameras were connected to local encoders connected to the cable modem/router.
The graphics operator simplified the production by turning the graphics key signal on and off at the CG.
Three people were at the fixed REMI studio: the video switcher/director, the audio mixer/producer, and a graphics operator. A typical offshore powerboat race production would also require a statistics person, an instant replay and social media operator, and camera operators. In our case, REMI could allow 10 crew members to skip the trip to the race venue and sleep in their own beds. The T&E for 10 people for 5 days is approximately $20K USD.
With REMI, we only need to send an EIC/producer/camera operator. Talent can work in front of the green screen in our Master Control room. Two camera operators and local drone services will be hired locally. Everyone but the EIC can eat and sleep at home. After everyone is paid, the company stands to save about $14K USD on crew expenses compared to doing the entire production on site, old-style.
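The savings math sketches out simply. The $20K T&E and roughly $14K savings figures come from the article; the remaining REMI cost of about $6K (one traveling EIC plus local hires) is inferred from the difference, not itemized in the original:

```python
# Crew-cost comparison using the article's ballpark figures.
ON_SITE_TE = 20_000       # T&E for 10 traveling crew, 5 days (from the article)
REMI_COSTS = 6_000        # inferred: traveling EIC plus locally hired operators

savings = ON_SITE_TE - REMI_COSTS
print(f"Estimated REMI savings per production: ${savings:,}")  # ~$14K
```

Multiply that by several away productions per year and the business case for a permanent REMI control room writes itself.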
Okay, What’s the Catch?
The REMI experience is controlled by all kinds of digital connections, which can be a bit unnerving for production crews used to zero-latency copper connections. It’s not unlike using satellites for live TV: there can be annoying signal delays. Not enough to ruin a production, but enough that I couldn’t count to five with a sidetone delayed by about a half-second in my ear. It turns out the comm solution has more adjustments and settings than we had time to experiment with. We got it working “good enough” and moved on.
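That half-second doesn’t come from one place; it accumulates stage by stage through the comm chain. The per-stage values below are assumptions for illustration only, to show how quickly small buffers add up:

```python
# How a half-second of sidetone delay can accumulate.
# All stage latencies are assumed values, not measurements.
stages_ms = {
    "mic capture + encoder buffer": 60,
    "audio encode": 120,
    "network transit": 40,
    "receiver jitter buffer": 150,
    "decode + audio output": 130,
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:30s} {ms:4d} ms")
print(f"{'total sidetone delay':30s} {total:4d} ms")  # 500 ms
```

Trimming any one stage helps, but only aggressive jitter-buffer and encoder settings get a chain like this back under the threshold where talkback feels natural.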
Delay confuses crews. We couldn’t afford a high-end broadcast comm solution, so we went with a less-expensive LAN/WAN comm system found on the web. The specs looked good, but we found a few issues in addition to the delay.
The secret of great live REMI production is fresh pizza for the crew.
Our new communications system is based on an iMac, iPhones, and Bluetooth. The iMac communicates with the iPhones over Bluetooth, and with the other machines on the network via Bluetooth USB dongles.
Not all iPhones are the same. Some worked well, others had issues. One problem was that only two Bluetooth phones could be on the system in one room, such as our control room. Setting up the remote phones wasn’t a problem, but in the REMI room, we had three operators and only two Bluetooth connections. The graphics operator volunteered to abandon her headset for a cellphone call. It wasn’t what we planned, but it was a viable workaround.
What makes REMI work is the camera/technical director (TD) connection. If the camera and operator are working properly and following the director’s instructions, the TD will use that camera. If not, the camera operator may as well have stayed home. At some point in the future, I’m sure we’ll be integrating PTZ cameras without operators for live TV production, however boring it might be without a field operator online to tell us what’s happening that we can’t see on camera.