The studio balcony provided an up-close view of World Championship powerboat racing and wild sound for announcers and a worldwide audience.
Most live remote outside broadcasts are thoroughly planned by producers and directors who are often too busy to consider potential equipment problems. Technology is an engineering responsibility. Engineers must be ready for any circumstances that threaten to take the show off-script or off-air, from dead wireless mic batteries to unexpected foul weather. In live TV, anything can happen and probably will, usually at the worst possible time.
To paraphrase a 1786 Robert Burns poem and later John Steinbeck book title, ‘The best laid plans of mice and men often go awry.’ I've heard this phrase echoed in headsets, control rooms, hallways and TV conference rooms because it happens more than it should in TV despite extensive preparations for success. Some call it live broadcasting karma and it is a TV engineer's job to avoid it.
Producers and directors make a plan, stick to it and maintain control. An engineer's job is to monitor and adjust electronic systems in response to every dynamic situation, from camera shading to catastrophic equipment failure, maintaining a seamless, technically correct flow of program content at all times.
In the case of live OB sports production in public venues, variables like bad weather, equipment and broadband service reliability, scheduling, and even parking can make a live outside broadcast go awry if not identified, assessed and addressed before showtime.
They Won’t Know Unless We Tell Them
Most surveys confirm TV viewers want to lean back and be entertained. Live outside TV broadcasting is an on-stage magic show, complete with digital smoke, mirrors, and occasional deceptions. News anchors in coats and ties sometimes wear shorts and sandals under the desk, don't they? It's also a rock concert that requires improvisation and faking it when necessary. The undercurrent is a real-life version of 'Beat the Clock.'
You'll never get a second chance for an on-time, on-air premiere. The key to successful magic shows, concerts or live video production is to be fully prepared on time and not reveal what the audience can’t see. It’s a TV show, not a 'making of' documentary.
On-air discretion is everything. Mum’s the word about behind-the-scenes live TV and magic tricks. The live TV directors I learned from always reminded talent and crews that no matter what happens on the air, carry on as if we planned it that way. Never insinuate there is an issue behind the scenes; most viewers won’t notice. News story not ready for air yet? Viewers can't see the rundown. Tease it again or ignore it. Only the station knows. It's a magic show that broadcasters make look simple.
I recently engineered and directed a two-day World Championship Offshore Powerboat racing broadcast that was infested by more than the usual number of live TV gremlins. There are always a couple of gremlins. This show had 12. You may have found yourself similarly challenged with live, OB TV circumstances. Your reaction could change the show and your career, because in showbiz you’re never any better than your last show.
Live TV is a magic show, but TV engineers aren’t magicians. Engineers must prepare for anything and everything to go wrong. The trick is to persuade the production team and managers when it is time to abandon the original plan and move on with an alternative that saves the show and that viewers won’t notice.
To do so successfully requires a ready Plan B and the diplomacy of a skilled politician to sell it to the crew and producers. Nobody likes the disappointment of changing production plans in the middle of a live TV show. Some people must first be sold on the idea.
One powerboat racing gremlin popped in about 6 p.m. the night before the broadcast was scheduled to start at 11 a.m. That’s when race organizers called to tell us the race would now begin at 10 a.m. instead of 11. We called the production crew members and informed them of the time change, but it was pointless to call the broadcasters who expected an 11 a.m. start. Logs were printed, spots were sold, and other programs were scheduled for broadcast at 10 a.m. Instead, we explained the situation to our broadcasters and faked it by recording the first hour of racing and replaying it, without fanfare, as the last hour of the moved-up broadcast. Most viewers probably didn’t notice.
Not Carved In Stone
While directing the live powerboat race TV show and operating the production switcher, unforeseen circumstances suddenly required me to set up and operate, with my other hand, a camera aimed out a nearby window and down the racecourse. During the show, the wind twice blew over our microwave receiving antenna on the studio roof, shattering its plastic case the second time. Once re-erected, the link still worked but the video quality was unusable. That microwave carried three IP cameras, including a primary racecourse camera on a scaffold on the beach.
The studio window camera covered for the offline scaffold camera, whose angle was about one mile down the racecourse. The other IP cameras were beauty shots from a PTZ on the boom lift and a link to a roll-around camera for interviews in the pits. It wasn’t the plan, but nobody outside the studio knew we were on Plan B. Thank goodness for a 22x lens, a good fluid head, and SD cards for "looks-live" interviews.
Part of the engineering plan is planning to fail. What happens if the production switcher fails? That thought is always in the back of my mind, and we’ve been fortunate not to need my plan yet. The plan is to switch the entire show on our Ensemble Designs Bright Eye router with an iPad controller. It wouldn’t look slick, but not as ugly as the time, years ago, when I switched a live 5-minute TV news insert with a patch cord because the production switcher failed at 6 a.m. The TV transmitter didn’t like the glitches of patch-cord switching, but an engineer must do everything that can be done to keep the show on the air with what’s on hand. The show went on, we aired all the spots, and some viewers probably thought there was something wrong with their TV.
Default IP Addresses
Another gremlin appeared when the local cable company installed a cable modem for our internet service. We were relying entirely on streaming video to feed the show to our network affiliates, YouTube, and Vimeo. The first thing the new cable modem did was change our private router’s subnet from 192.168.1.xxx to 10.1.10.xxx. What?
We had locked all our gear to static IP addresses, but the sudden change to the 10.1.10.xxx router address overwrote the static addresses in some of our production gear. That problem lasted a couple of hours while the installation tech called his boss for help resetting the cable modem’s address range back to 192.168.1.xxx. Of course, installers aren’t allowed to get into customers’ gear, so we verified and made all the necessary local static IP corrections ourselves.
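A quick pre-show sanity sweep can catch this kind of silent renumbering before air. The sketch below is a minimal Python example, assuming a Linux-style `ping` command; the device names and addresses are hypothetical placeholders for whatever static IPs your own studio LAN actually uses.

```python
#!/usr/bin/env python3
"""Reachability sweep for production gear after a subnet change.

The device names and addresses below are hypothetical examples;
substitute the static IPs actually assigned in your studio LAN.
Uses Linux ping syntax ('-W' timeout in seconds).
"""
import subprocess

# Hypothetical static assignments on the private production network
GEAR = {
    "production_switcher": "192.168.1.20",
    "ptz_camera_1": "192.168.1.31",
    "streaming_encoder": "192.168.1.40",
}

def is_reachable(ip: str, timeout_s: int = 1) -> bool:
    """Return True if the host answers a single ping."""
    try:
        result = subprocess.run(
            ["ping", "-c", "1", "-W", str(timeout_s), ip],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
    except FileNotFoundError:  # ping binary not available
        return False
    return result.returncode == 0

if __name__ == "__main__":
    for name, ip in GEAR.items():
        status = "OK" if is_reachable(ip) else "UNREACHABLE, check static IP"
        print(f"{name:20s} {ip:15s} {status}")
```

Run once after any modem or router swap; anything flagged unreachable probably lost its static assignment in the renumbering.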
The cable service speed test showed the bandwidth we paid for, but it turned out to have an occasional glitch. We backed up the cable broadband service with a pair of Verizon portable modem/routers that provided rock-solid broadband service throughout the broadcast.
We use 5GHz Wi-Fi for camera links, but we don’t use Wi-Fi connections in our temporary studio. All our studio production gear is hardwired. We’ve learned that a random smartphone entering the studio with its Wi-Fi on can assign itself an apparently unused IP address and eventually conflict with a network studio device assigned that same static address when it becomes needed online.
We rented a LiveU Solo to stream to our primary broadcast affiliate that happened to be a LiveU station. The Solo seemed to be a great match, but it turned out to be just the opposite. During a phone call to LiveU tech support, I learned that the TV station’s LiveU Enterprise system was incompatible with a LiveU Solo. I had assumed otherwise.
Fortunately, the TV station had a computer with VLC Media Player available to decode our RTMP stream for broadcast. It worked, but VLC certainly wasn’t in the original plan. That’s also when we discovered our cable modem service was not as stable as it should be, because every couple of hours the video stream to the station was dropped and Solo disconnected from VLC. Who knows, but I'm blaming it on the occasional cable modem glitch. We made it work because everybody was flexible, thank you very much. What’s the one word that makes challenging live OB productions look good on TV? Flexibility.
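A drop every couple of hours is exactly the failure a simple watchdog can flag before the station calls. Below is a minimal sketch, with a hypothetical decoder address; note that a TCP probe of the RTMP port only proves the far end is reachable, not that the video stream itself is healthy.

```python
#!/usr/bin/env python3
"""Minimal link watchdog sketch for an RTMP feed.

DECODER_HOST is a hypothetical placeholder. A TCP probe confirms only
that the port answers, not that the stream content is good.
"""
import socket

DECODER_HOST = "203.0.113.10"  # hypothetical station decoder address
RTMP_PORT = 1935               # standard RTMP port

def port_open(host: str, port: int, timeout_s: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # In production, run this on a timer and alert on repeated failures.
    if not port_open(DECODER_HOST, RTMP_PORT):
        print("Link down: fail over to backup modem or restart the encoder")
```

Run on a short timer during the show, this kind of check buys a head start on failing over to the backup modems before viewers notice.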
There is more to this story than space available here. Please stand by for Part 2, about drones, boom lifts, field cameras and lens filters.