SVT implemented a remote IP production workflow for this year's FIS Alpine World Ski Championships
It was clear from NAB that a number of trends are shaping the future of live sports production. But there are also apparent contradictions. With 4K still years away from dominating broadcast, UHD gear is already widespread. The move to IP seems all-conquering, yet SDI is proving remarkably resilient. And AI promises cost savings, but does that come at the cost of human-curated quality? We put these conundrums to Mark Hilton, Vice President, Live Production at Grass Valley, and you can read his responses below.
BroadcastBridge: What are the key technology developments (generic not product specific) seriously impacting live production today?
Mark Hilton, Grass Valley: With consumers now expecting richer, more captivating content, broadcasters are being more ambitious than ever with live production, giving audiences an up-close experience that is as good as – if not better than – being there. There is increased consumer demand for stunning picture quality, and in a live production environment this means having camera equipment that is UHD HDR enabled, or whose CMOS imagers can switch pixel size between native 4K and native HD. Next-generation production formats combine resolution, dynamic range and color gamut (e.g., 1080p with HDR and wide color gamut, WCG), and this doesn’t always require 4K cameras.
As broadcasters and production companies drive towards multi-platform and higher resolution content offerings, SDI infrastructures offer reduced flexibility to adapt and switch up production models and workflows. This is spurring the growing trend towards open IP infrastructures that can provide the scale and flexibility to support new service models.
IP is now a field-proven, reliable alternative, delivering the scalability and flexibility needed to adapt and evolve services to meet consumers’ ever-changing demands. It also enables broadcasters and production companies to look at new ways of supporting live events.
This year’s FIS Alpine World Ski Championships is a prime example: Swedish broadcaster SVT implemented an end-to-end remote IP production workflow that met all its requirements with no compromise on the first-class image quality that rights holders required. SVT was able to ensure that viewers saw no discernible difference, and media transport was completely reliable even over large distances; audio and video feeds were perfectly synchronized and latency was kept to a minimum.
This project provides a demonstrable IP model that delivers on cost and time saving – all while delivering the stunning images that today’s audiences demand.
How can producers ensure that quality does not suffer as more automation/AI is introduced, or is there a necessary trade-off between cost of production and quality?
AI is an emerging technology with the potential to significantly impact live production today. AI makes delivering multi-platform, multi-format content much easier and quicker, and enables broadcasters to better understand the content in the live video streams delivered to their facilities. For example, AI can retrieve intricate speech and facial recognition data to identify and aggregate important information that feeds seamlessly into the coverage of unscripted productions like sports matches and breaking news stories.
AI technologies provide key functionality that can be particularly important if a broadcaster is looking to limit personnel and free up staff to focus on high impact creative tasks. Eventually, the only human-centric tasks that we will see in broadcast environments will be the creative ones that ultimately result in rich, captivating storytelling.
Ingest is a key area where AI and machine learning can be most impactful. In traditional production workflows, the only metadata attached to clips at ingest is the file name. Using object, facial and voice recognition tools can help create rich metadata tagging, immediately making that information available to users across the organization. This allows media organizations to maximize the value of their content and leverage existing assets to tell richer stories.
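The ingest workflow described above can be sketched in a few lines. This is a hypothetical illustration only: the `recognize_faces` and `recognize_speech` functions are stand-ins for real recognition services, and all names and tags are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    filename: str          # traditionally the only metadata at ingest
    tags: set = field(default_factory=set)

def recognize_faces(clip):
    # Stub for a real facial-recognition service (hypothetical output).
    return {"player_23"} if "match" in clip.filename else set()

def recognize_speech(clip):
    # Stub for a real speech-recognition/keyword service (hypothetical output).
    return {"goal", "penalty"} if "match" in clip.filename else set()

def ingest(filename):
    # Attach rich metadata at ingest, not just the file name.
    clip = Clip(filename)
    clip.tags |= recognize_faces(clip)
    clip.tags |= recognize_speech(clip)
    return clip

library = [ingest("match_2019_semifinal.mxf"), ingest("studio_promo.mxf")]

# Users across the organization can now search by content, not file name.
hits = [c.filename for c in library if "goal" in c.tags]
```

The point of the sketch is the shape of the workflow: tagging happens once, at ingest, and every downstream user inherits the searchable metadata.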
Cutting clips for voice-overs is another area where AI can deliver real benefits in live production settings. Usually, an editor has to search for relevant images to accompany scripts being read out, a time-consuming and inefficient process that AI has the potential to speed up considerably.
AI will also enable a broader scope of content creation. Today it is not economically feasible to produce and distribute certain live events, such as high school games or lower-tier college sports; AI could bring those within reach. There would still be a quality/cost trade-off to be made, and maintaining the creative elements (which would be difficult to replicate via AI) will continue to be important for lower-profile live event coverage.
Is it easier to manage a UHD 4K workflow in SDI using 12G single cable or Quad 3G than using SMPTE IP standards and kit at this time?
Thanks to industry standards such as SMPTE ST 2110 and the efforts of the Alliance for IP Media Solutions (AIMS), the industry has experienced a significant step change in the drive towards IP migration. IP-based solutions are now widely available across the production chain, from cameras to multiviewers, routers to switchers. The open-standards approach is also driving down pricing; IP is no longer limited to – or only relevant for – large multichannel installations or greenfield sites.
With end-to-end IP deployments now seen in live remote event production by major OB operators such as Mobile TV Group (MTVG), we are reaching a tipping point where customers are no longer just looking to replace SDI with IP to do the same thing with newer technology. As well as supporting stunning image quality and immersive viewing, customers are looking to leverage IP to work in new and innovative ways. Fast, agile infrastructures are central to building successful media businesses, and IP empowers broadcasters and content producers to work smarter, creating new service models that meet the needs of a digital-native audience.
This wider move to IP still leaves scope for SDI in the medium term. We see 12G SDI as most relevant in flypacks, small mobile units, smaller houses of worship and corporate applications, where the advantages of an IP infrastructure are less obvious. An SDI infrastructure typically requires less setup and configuration than today’s IP networks. That matters where the customer does not want or need scale or agility and instead values simplicity and a true 'plug and play' approach; for many of those customers, 12G SDI will remain the preferred option.
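The single-cable vs. quad-cable question in the heading comes down to simple arithmetic. A back-of-envelope check (the line rates are the standard nominal SDI rates; the arithmetic here is a sketch, not from the interview) shows why a 2160p60 signal needs either one 12G-SDI link or four 3G-SDI links:

```python
# Uncompressed UHD payload: 3840x2160 at 60 fps, 10-bit 4:2:2
# (10 bits luma + 10 bits shared chroma per pixel = 20 bits/pixel).
width, height = 3840, 2160
fps = 60
bits_per_pixel = 20

active_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"active video: {active_gbps:.2f} Gb/s")   # ~9.95 Gb/s

SDI_12G = 11.88   # nominal 12G-SDI line rate, Gb/s (SMPTE ST 2082)
SDI_3G = 2.97     # nominal 3G-SDI line rate, Gb/s (SMPTE ST 424)

print(active_gbps <= SDI_12G)       # True: fits one 12G-SDI cable
print(active_gbps <= 4 * SDI_3G)    # True: fits quad 3G-SDI
```

Quad 3G splits the picture across four cables that must stay together through the plant, which is one reason single-link 12G is often preferred where SDI remains in use.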
What are the main impediments today to remote producing live sport using software-defined or virtualized equipment?
Software-defined systems require high-level system design and capacity planning, which is more complex than for conventional production systems. To ensure smooth and seamless delivery, a virtualized production system requires detailed knowledge of work patterns, peak loads, concurrency of events and a multitude of other factors. It then becomes a business decision how much overhead capacity to design into the system.
However, as standards continue to develop, solutions that enable seamless live production are rolling out, and as bandwidth efficiency improves further we are starting to see forward momentum. IP models go hand-in-hand with remote production, and we are already seeing this in action. IP-based infrastructures can dramatically reduce the amount of hardware needed at a live sporting event, and with latency kept low it is possible to send just a camera crew to the venue instead of putting a large OB set-up in place.
Given the currently prevalent remote production models, the primary trade-off over longer distances is bandwidth cost vs. latency. For high-bandwidth applications it makes more sense to localize all of the equipment (except the cameras) in the central studio. For applications with constrained bandwidth, keeping the control interfaces in a remote location provides an attractive alternative model, but it increases the latency experienced by the operators. The greater the distance between venue and control room, the harder it becomes to balance latency against bandwidth costs. Grass Valley's DirectIP solution aims to address these constraints.
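The distance-vs-latency relationship can be made concrete with a simple model. This is an illustrative sketch with assumed numbers (the ~200 km/ms figure for light in optical fiber is a standard rule of thumb; the 5 ms processing allowance is invented for the example), not a characterization of any specific product:

```python
# Light travels roughly 200 km per millisecond in optical fiber,
# a common rule of thumb (about 2/3 the speed of light in vacuum).
C_FIBER_KM_PER_MS = 200.0

def control_loop_ms(distance_km, processing_ms=5.0):
    """Round-trip delay a remote operator experiences: video travels
    venue -> control room, commands travel control room -> venue,
    plus an assumed fixed processing allowance."""
    propagation = 2 * distance_km / C_FIBER_KM_PER_MS
    return propagation + processing_ms

for km in (100, 1000, 3000):
    print(f"{km:>5} km: ~{control_loop_ms(km):.1f} ms control loop")
```

At a few hundred kilometres the control loop stays comfortably tight; across a continent the propagation delay alone becomes noticeable to a vision mixer or camera shader, which is exactly the tension between localizing equipment and remoting the control surfaces.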