Sitting at home watching the live Olympics 400m Women’s hurdles final on NBC’s 4K HDR channel, home audiences were captivated by the sweat and effort on screen and the immersive sound of the runners’ feet hitting the track. Viewers thousands of miles away could be excused for thinking they had the best seat in the Japan National Stadium. The live 4K HDR broadcast of NBC’s primetime show throughout the Games was an extrasensory experience unlike any previous Olympics telecast.
Mature, cloud-based services are now prevalent across the industry, helping to process and distribute content faster and more accurately than ever before. The long sought-after promise of producing content in the cloud, with lower cost and fewer physical barriers, has prompted broadcasters and production companies to experiment with new ways of making it a common reality.
There was a time when the mere mention of bringing artificial intelligence (AI) and machine learning into the media industry brought visions of robots replacing humans. Today that is certainly not the case, although we might be getting close: I saw a robotic camera operator move the cameras for a national television news show from his converted kitchen table. On air, viewers never noticed a difference from the programs they always watch.
If there’s one thing the production community has learned during the pandemic, it’s that ensuring the safety of the crew on site or in the studio should always be first and foremost in people’s minds. The second takeaway is that sending fewer people on site and implementing more remote support is the new normal that the industry is, somewhat begrudgingly, coming to terms with.
This is the second instalment of our extended article exploring the use of microservices.
Computer systems are driving broadcast innovation forward, and the introduction of microservices is having a major impact on the way we think about software. Microservices not only deliver improved productivity through more efficient workflow solutions for broadcasters, but also help vendors work more effectively to further improve the broadcaster experience.
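To make the idea concrete, a microservice is simply a small, single-purpose program exposed over the network. The sketch below, using only the Python standard library, stands up one such service; the service name and JSON payload are illustrative assumptions, not from any specific broadcast product.

```python
# Minimal sketch of a single-purpose "microservice": one process, one job.
# The endpoint name and payload fields here are hypothetical examples.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


class TranscodeStatusHandler(BaseHTTPRequestHandler):
    """Answers exactly one question: what is the transcode service doing?"""

    def do_GET(self):
        body = json.dumps({"service": "transcode-status", "state": "idle"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request console logging for this sketch.
        pass


def start_service(port=0):
    """Start the service on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), TranscodeStatusHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server


if __name__ == "__main__":
    from urllib.request import urlopen

    srv = start_service()
    url = f"http://127.0.0.1:{srv.server_address[1]}/"
    print(json.loads(urlopen(url).read()))
    srv.shutdown()
```

Because each service owns one narrow responsibility, a vendor can update or scale the transcode-status service without touching, say, ingest or playout; that isolation is where the workflow-efficiency claim comes from.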
After years of trial and error designed to reduce operating cost and (more recently) keep crews safely distanced, remote production has found its niche in live production and will remain the de facto method for producing events over a distributed network infrastructure. However, a big hurdle left to overcome for successful deployment of such networked workflows is latency. In live production, video latency refers to the amount of time it takes for a single frame of video to travel from the camera to a processing location (on premises or in the cloud) and back to the display, wherever that display might be.
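That camera-to-processing-to-display path can be reasoned about as a sum of per-stage delays. The sketch below adds up one such budget; the stage names and millisecond figures are illustrative assumptions, not measurements from any real broadcast chain.

```python
# Hedged sketch: a glass-to-glass latency budget for a remote-production path.
# Every figure below is an assumed, illustrative value in milliseconds.
FRAME_MS = 1000 / 50  # duration of one frame at 50 fps

stages_ms = {
    "camera capture": 10,
    "encode": 35,
    "uplink to cloud": 40,
    "cloud processing": 20,
    "downlink": 40,
    "decode": 35,
    "display": 10,
}

total_ms = sum(stages_ms.values())
frames = total_ms / FRAME_MS
print(f"glass-to-glass latency: {total_ms:.0f} ms (~{frames:.1f} frames at 50 fps)")
# → glass-to-glass latency: 190 ms (~9.5 frames at 50 fps)
```

Expressing the total in frames rather than milliseconds is often more useful in practice, since it shows directly how far a remote operator's view lags behind the action they are cutting.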
With the pandemic’s alarming numbers now decreasing, news anchors have carefully begun reporting from the studio again, albeit in separate parts of the building and socially distanced. However, the IP-enabled technology and remote workflows developed by equipment vendors across the industry during the worst of it have endured and will for some time. These new tools allow reporters, producers and technicians to work from home by streamlining the process of producing a newscast.