The Changing Face of Broadcast – From Virtualization to the Cloud

Like many other industries, the broadcast space is affected by changes in the market and in business models, and by the resulting pressure on budgets, which drives the search for efficiencies. Ultimately, broadcasters are in a continuous battle to find better ways to monetize their content while keeping their audiences engaged and entertained for longer. One of the technology trends enabling the transformation of broadcast is the move to IP. Broadcasters are seeing the benefits of using a consistent, standardized infrastructure and connectivity to transport video everywhere, from remote locations to central facilities and on to distribution. These benefits include cost savings, agility and the ability to scale. More importantly, though, IP is a critical building block that will drive other changes in the industry, including changes to infrastructure itself. Virtualization will play a key role here, particularly in live production, where it stands to transform workflows.

Virtualization in broadcast

While a lot has been written about the move to IP, what exactly is virtualization in the broadcast context? In general terms, virtualization (a term that has been around in the IT industry for decades) is the creation of something, such as a computer or platform, in virtual rather than physical form. It involves using software to provide functionality that appears to come from a dedicated device.

In the broadcast context, virtualization can mean one of two things — having equipment that can perform multiple, often varied functions (what Nevion calls media function virtualization), and enabling equipment to be shared more easily, for example between studios or locations (something Nevion calls infrastructure virtualization).

Media function virtualization means that the functionality required, such as media transport and processing, is performed by software rather than hardware. Initially, this software will run on specialized platforms, but in time media function virtualization will involve software apps running on generic IT hardware platforms (essentially high-powered computers). The key point, though, is that the same hardware can run different functionality, and that this functionality can be modified on demand and remotely with the help of a virtualization orchestrator.
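To make the idea concrete, here is a minimal Python sketch of that principle, assuming an entirely hypothetical orchestrator and node model (the class names, function names and media functions are illustrative, not any vendor's actual interface): the same generic compute node is loaded with one media function and later repurposed remotely.

```python
# Hypothetical sketch of media function virtualization: the same generic
# compute node can be loaded with different media functions on demand.
# All names are illustrative, not a real orchestrator API.

class ComputeNode:
    """A generic IT server that can host any software media function."""
    def __init__(self, name: str):
        self.name = name
        self.function = None  # e.g. "encoder" or "multiplexer"; None when idle

class Orchestrator:
    """Hypothetical central controller that (re)assigns functions remotely."""
    def __init__(self, nodes):
        self.nodes = {node.name: node for node in nodes}

    def deploy(self, node_name: str, media_function: str) -> None:
        node = self.nodes[node_name]
        print(f"{node.name}: {node.function} -> {media_function}")
        node.function = media_function  # a software swap, no hardware change

orch = Orchestrator([ComputeNode("server-1"), ComputeNode("server-2")])
orch.deploy("server-1", "contribution encoder")  # morning: remote production feed
orch.deploy("server-1", "multiviewer")           # evening: same box, new function
```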

Media function virtualization brings savings through fewer hardware replacements, lower space usage, and lower training, maintenance and management costs.

Infrastructure virtualization is a way to enable equipment to be shared more easily for live and file-based production, for example between studios or locations. Equipment sharing is obviously a concept that exists already to some extent in the baseband world. For example, it is possible to route workflows through a piece of signal processing equipment that is shared. However, infrastructure virtualization takes this a step further, by detaching the physical equipment from the production workflow and greatly automating the process. The advent of IP makes this much easier.

Infrastructure virtualization is achieved by using a software management layer (or virtualization orchestrator) to elevate workflows to a level that does not require users to have any understanding of the underlying connectivity of the network and equipment. The orchestrator is intelligent enough to change the network based on information provided by the network itself, so the network topology is effectively software defined. Virtualization provides a better overview of the resources accessible across the infrastructure, as well as easier access to those resources, without the need for additional physical setup.
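As a purely illustrative sketch, assuming a made-up resource pool and matching logic rather than any specific orchestrator's interface, the following Python shows the kind of abstraction involved: production staff request a function in workflow terms, and the management layer finds and reserves a suitable shared device without exposing any network detail.

```python
# Illustrative sketch of infrastructure virtualization: the user requests a
# function in production terms; the orchestration layer picks and reserves
# shared equipment from a pool. Device names and logic are hypothetical.

pool = [
    {"id": "proc-01", "type": "up/down converter", "site": "datacenter", "free": True},
    {"id": "proc-02", "type": "up/down converter", "site": "datacenter", "free": True},
    {"id": "enc-01",  "type": "encoder",           "site": "datacenter", "free": True},
]

def reserve(function_type: str) -> str:
    """Find a free device of the requested type and mark it as in use."""
    for device in pool:
        if device["type"] == function_type and device["free"]:
            device["free"] = False
            return device["id"]
    raise RuntimeError(f"no free {function_type} in the shared pool")

# Studio A and Studio B each request processing without knowing which box
# they get, where it sits, or how it is connected; the orchestrator decides.
studio_a = reserve("up/down converter")
studio_b = reserve("up/down converter")
print(f"Studio A routed via {studio_a}, Studio B routed via {studio_b}")
```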

Olivier Suard is marketing director at Nevion.

The advent of cloud computing

In time, the virtualization of live production using the cloud will revolutionize the way broadcasters work, even though, for many in the industry, it still seems inconceivable that cloud technology could be applied to live production.

The traditional view is that equipment should be owned by the broadcaster, just as it always has been, and located on the site where it is needed, such as a broadcast campus. However, with the advent of cloud computing and virtualization, these assumptions no longer necessarily hold.

The broadcaster’s pooled signal processing and transport equipment (e.g. embedders, encoders, multiplexers) could be owned and managed by a service provider. This is particularly conceivable as functionality moves to software running on generic hardware: pooled equipment would become computers, and service providers are experts in running datacenters.

If the processing and transport equipment is pooled and managed by a service provider, why keep it on-site? After all, the service provider could share some of that functionality with other locations of the broadcaster or even other broadcasters, effectively making the cost of use much lower.

The reality of modern, dedicated networks is that comparatively low latencies can be achieved over considerable distances. This makes it entirely possible to locate a live-signal processing datacenter quite some distance away from the broadcaster's facilities.
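A rough back-of-the-envelope check supports this, considering propagation delay only (switching and processing add more): signals in optical fibre travel at roughly two-thirds the speed of light, about 200,000 km/s, so even hundreds of kilometres add only a few milliseconds.

```python
# Back-of-the-envelope propagation delay in optical fibre.
# Signal speed in fibre is roughly 2/3 of c, i.e. ~200,000 km/s (~5 us/km);
# real links add switching and processing delay on top of this figure.

SPEED_IN_FIBRE_KM_PER_S = 200_000  # approximate

def one_way_delay_ms(distance_km: float) -> float:
    return distance_km / SPEED_IN_FIBRE_KM_PER_S * 1000

for distance in (50, 200, 500):
    print(f"{distance} km: ~{one_way_delay_ms(distance):.1f} ms one-way")
# 50 km: ~0.2 ms, 200 km: ~1.0 ms, 500 km: ~2.5 ms -- small compared with a
# video frame (20-40 ms), which is why a remote datacenter can be feasible.
```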

The cloud will also make it possible for broadcasters to swap capital expenditure on equipment for operational expenditure on pay-per-use processing services. As well as offering flexibility, this allows them to align costs with the expected revenue from content creation.

Furthermore, this concept can also enable broadcasters to offer their equipment as a service to other broadcasters. In this scenario, broadcasters would themselves become service providers, turning their equipment from a cost into a source of revenue.

Conclusion

The broadcast industry is changing at pace and technology is both a driver of that change and an enabler of it. Going forward there will be continued emphasis on adopting IP in more places in the broadcast ecosystem and fully realizing its benefits. This will lay the foundation for other dramatic changes — like virtualization and cloud migration — that will effectively transform the landscape.
