Applied Technology: Integration Strategies for QoS Monitoring in the Cloud

Legacy operations like production and playout are gradually moving to the cloud as more systems connect to a facility’s IT backbone. These advances have opened the door to also moving operations-based tasks like QC and QoS monitoring to the cloud.

Systems integrators are evolving along with the broadcast industry. More facilities are transitioning to IP networks and software-defined workflows, and successful integrators are now as IT-savvy as they are knowledgeable about traditional hardware and A/V wiring.

A successful systems integration strategy has always been about making the right connections, and creating a harmonious workflow that enhances how people do their work. Though the components and connection points have changed, the traditional approach lives on: The integrator learns about the broadcaster’s goals, and evaluates system needs before determining the appropriate solution.

Thorough systems integration crosses a wide spectrum of systems and technologies. This story will focus on strategies for integrating an efficient and effective cloud-based monitoring system.

Robust Backbone

The strength of the network is naturally the first stop. The bandwidth and processing power of your IT backbone will play a significant role in how your advanced monitoring solution is distributed. For example, compliance recording (locally or in the cloud) and live streaming to a remote multiviewer at good resolution can be adjusted to fit the available bandwidth. The challenges are compounded, however, when the backhaul connection runs over a more limited network such as 3G.

The cloud offers a multitude of benefits, and its simple IT/IP interface makes for an efficient and quick storage solution. Image: securelink

The integrator must first understand the scale of the monitoring strategy, and the desired applications that will comprise the monitoring operation. This enables the integrator to better evaluate the requirements to support the end-to-end architecture.

On a base level, a needs survey will determine the bandwidth and processing power needs at the network backbone. That requires identifying the number of monitoring sites and their geographical locations, the number of program streams to monitor, the depth of monitoring across each stream, storage capacity and location, and streaming functions.

Using Qligent’s Vision cloud monitoring platform as an example, basic data aggregation for confidence monitoring can generally operate at modest bitrates in the range of 64 kb/s. Bitrate requirements grow quickly as video streaming applications such as remote multiviewing are added; in those cases, a minimum of 256 kb/s at full frame rate is a reliable base recommendation for the operator viewing experience.
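
As a rough illustration of how those figures add up, here is a minimal back-of-the-envelope sketch of per-site backhaul bandwidth. The per-stream rates come from the numbers above; the stream counts in the example are hypothetical placeholders rather than a recommendation.

```python
# Rough backhaul bandwidth estimate for one monitoring site.
# Per-stream rates follow the figures above (64 kb/s for confidence
# data aggregation, 256 kb/s minimum for full-frame-rate multiviewing);
# the stream counts in the example are hypothetical.

DATA_AGGREGATION_KBPS = 64   # confidence-monitoring data per stream
MULTIVIEW_KBPS = 256         # minimum per stream sent to a remote multiviewer


def site_backhaul_kbps(monitored_streams: int, multiview_streams: int) -> int:
    """Approximate upstream bandwidth one site needs, in kb/s."""
    return (monitored_streams * DATA_AGGREGATION_KBPS
            + multiview_streams * MULTIVIEW_KBPS)


# Example: 20 monitored streams, 4 of them also streamed to a multiviewer.
total = site_backhaul_kbps(monitored_streams=20, multiview_streams=4)
print(f"~{total} kb/s (~{total / 1000:.2f} Mb/s) of backhaul for this site")
```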

Most enterprise-level organizations will find the needs of the system lightweight overall, as multichannel monitoring can be centralized to a bare minimum of sites in the cloud. Once the IT core is built to support the initial deployment, the integrator can efficiently deploy the supporting hardware that brings the physical monitoring locations onto the network.

In many cases the expense of proprietary hardware is eliminated in favor of locally available, off-the-shelf servers. Beyond swapping legacy wiring for network cabling – which on its own reduces integration time, labor and cost of materials – this is the part of the job integrators have always specialized in: positioning the equipment, populating the racks and making the necessary connections.

These connections, however, extend beyond network cabling to actual network configuration needs – notably IP addressing between servers and various data aggregation points in the field.

QoS, while easy to define, becomes more difficult to monitor and test as the number of links and the quantity of connected devices increase. One effective way to cope is to rely on vendors who specialize in this type of technology to help design your solution.

Software Configuration

Because the system lives in the cloud, its software element is ultimately what will drive the operation. Once the servers are connected, the operating software can be configured and its parameters defined from any remote location.

The initial software configuration should focus on the number of program streams to be monitored, and the depth of monitoring required at each of the edge-point device locations. The number of locations and streams per location will vary greatly based on the number of TV delivery platforms connected to the cloud. For example, a terrestrial-only operation with 5 to 10 channels will be significantly less intense than a deployment that crosses cable and/or satellite systems with hundreds of channels.

The depth-of-monitoring configuration will reveal detailed insight into several critical user-preferred parameters. The physical and transport layers associated with standards-based Quality of Service (QoS) must be defined. The video and audio layers associated with the Quality of Experience (QoE) from a viewer perspective must be identified and selected. The data layer that comprises embedded services such as captions, subtitles and EPG also requires setting.
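
To make the idea concrete, a depth-of-monitoring profile could be captured as a simple structured configuration like the hypothetical sketch below. The layer groupings mirror the QoS, QoE and data layers described above, but the field names and metrics are illustrative assumptions, not the schema of Vision or any other product.

```python
# Hypothetical depth-of-monitoring profile for one program stream.
# The structure and metric names are illustrative only; they are not
# the configuration schema of any specific monitoring product.
monitoring_profile = {
    "qos": {                       # physical and transport layers
        "transport_stream": ["continuity_errors", "pcr_jitter", "sync_loss"],
        "rf": ["signal_level", "mer", "ber"],
    },
    "qoe": {                       # viewer-facing video and audio layers
        "video": ["black_frames", "freeze_frames", "blockiness"],
        "audio": ["silence", "loudness"],
    },
    "data": {                      # embedded services
        "captions": True,
        "subtitles": True,
        "epg": True,
    },
}

# An edge-point device could simply be handed the list of streams it owns,
# each paired with a profile of this shape.
edge_point_config = {
    "site": "transmitter-01",      # hypothetical site name
    "streams": {f"program-{n}": monitoring_profile for n in range(1, 6)},
}
```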

A majority of the processing, analysis and recording can be accomplished locally at the edge-point device. This will provide robustness against limited network backhaul, outages and similar problems. The integrator’s job is to balance real-time functions with cloud virtualizations and cost, based on network infrastructure capability. This information is networked to the cloud, where the content is received for visualization and correlated analysis on the customized monitoring dashboard.

In a cloud system, the integrator can configure these parameters locally or remotely over the network. In the case of systems like Vision, the supplier can alternatively handle the remote configuration and connectivity in cooperation with the integrator as they complete their onsite system buildout. In addition to these base requirements, parameters across performance alerting, threshold limits and user permissions are all configured in the process.
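
A hedged sketch of what those additional parameters might look like follows, again with purely illustrative names and values rather than recommended operational settings.

```python
# Hypothetical alerting thresholds and user permissions. The values are
# placeholders chosen to illustrate the kind of parameters an integrator
# sets, not recommended operational limits.
alerting = {
    "video_freeze_seconds": 5,       # alert after 5 s of frozen video
    "audio_silence_seconds": 10,     # alert after 10 s of silence
    "loudness_tolerance_lu": 2.0,    # allowed deviation from target loudness
    "notify": ["noc@example.com"],   # hypothetical notification target
}

user_permissions = {
    "operator": ["view_dashboard", "acknowledge_alerts"],
    "engineer": ["view_dashboard", "acknowledge_alerts", "edit_thresholds"],
}
```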

A final point of necessary consideration is network storage capacity. The integrator will need to scale storage capacity to suit system specifications. The amount of storage hinges on how long the broadcaster wishes to retain compliance recordings and transcoded video.

The storage requirement also generally depends on whether the deployment is on-site, offsite (offloaded to a managed services partner who handles the monitoring), or located on a public cloud like Microsoft Azure. Naturally, future scale must be considered based on the number of program streams that might be added down the road.
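
A quick, assumption-laden calculation can help size that retention. In the sketch below, the recording bitrate and retention window are hypothetical examples; the broadcaster's actual proxy bitrate and retention policy would be substituted.

```python
# Back-of-the-envelope storage sizing for compliance recordings.
# The proxy bitrate and retention window are hypothetical examples.

def storage_terabytes(streams: int, kbps_per_stream: int, retention_days: int) -> float:
    """Approximate storage needed to keep every stream for the full window."""
    total_bits = streams * kbps_per_stream * 1000 * 86_400 * retention_days
    return total_bits / 8 / 1e12    # bits -> decimal terabytes


# Example: 50 streams recorded as 1.5 Mb/s proxies and retained for 90 days.
tb = storage_terabytes(streams=50, kbps_per_stream=1500, retention_days=90)
print(f"~{tb:.0f} TB of compliance storage before overhead and growth")
```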

One key to looking forward is to keep your options open. New technology and opportunities may develop. Don't let old thinking prevent new profits. Image: Virtualshelley

Emerging Business Opportunities

In addition to cloud deployment on more traditional TV delivery systems, integration strategies become even more efficient on emerging OTT and IPTV platforms. In those systems, monitoring points can be deployed virtually in cooperation with CDNs and/or ISPs. In these cases, the integrator can rent space on a server and fold the incremental costs into the software-as-a-service (SaaS) fees.

As broadcasters change workflows and the way their facilities operate, it is abundantly clear that moving critical systems such as monitoring to the cloud is evolving the role of the systems integrator. However, that evolution will continue to require a core expertise of architecting efficient systems that make the right connections across the broadcast workflow, and simplify processes for end users.


Ted Korte is Qligent COO.
