Agile Monitoring Supports Growth Of OTT

To maintain a high quality of experience for their customers, content providers need a way to monitor hundreds—sometimes thousands—of channels without compromising real-time error detection. In most cases, the immense scale of their service offerings makes continual visual monitoring of all streams physically impossible and error prone. Software-defined monitoring systems answer this need: their flexibility, scalability and agility enable virtually unlimited multiviewer scaling along with fully automated monitoring and alarming.

Traditional broadcast delivery is a linear flow, with content pushed downstream to TVs and set-top boxes on the consumer side. The channel may arrive over the air (OTA), via cable, direct-to-home (DTH) satellite or an IP network. The delivery format is singular, and resolutions are SD, HD or UHD. Typically, users view the content on their TV sets at home.

This mode of program delivery, however tried and true, has been rapidly changing. Viewers increasingly watch content when they want, where they want and on the device they want. Content therefore needs to be streamed on demand, driven by user requirements, and at resolutions that match a wide array of playback devices. Using an over-the-top (OTT) model, content is pulled by consumers as they want it. To be successful going forward, linear broadcasters need to establish this direct connection to the viewer. OTT platforms work like any other app or service on the internet: all the viewer needs is a device that supports OTT and an internet connection.

In the U.S. alone, 86% of smartphone users now watch video content on their phones, according to industry research. Effectively monitoring a steadily increasing number of channels at a centralized NOC is critical to the success of OTT services, as these platforms must process and deliver multiple video formats and bit rates to different display screen sizes—in a matter of milliseconds. Careful monitoring that can adapt to fast-changing infrastructure conditions helps ensure a good viewer experience. Traditional multiviewer-based monitoring could never handle the number of channels and monitoring points managed by a single OTT infrastructure today.

Fortunately, as operational workflows across the media industry have evolved to look more like true IP workflows in IT environments, it has become evident that errors across the OTT delivery chain can be detected by new software solutions that minimize the need for human intervention.

OTT Monitoring Brings Benefits And Challenges

As opposed to traditional video distribution methods, which operate under a dedicated and controlled network, OTT video uses an unmanaged IP network – the internet. The latter, while much more flexible in its implementation, can introduce all types of errors that the master control operator—or the monitoring software itself—must be aware of and detect.

There are other challenges with monitoring the increasingly large number of live feeds delivered by an OTT service that must be addressed:

  • These complex distributed processing infrastructures include many moving parts, some of which may be third-party, so they have to be looked at with an overall system view to ensure interoperability.
  • Using Adaptive Bit Rate (ABR) encoding, each feed now becomes 5-8 feeds, so the monitoring platform has to be able to scale (monitor more channels) quickly and accurately analyze each individual feed.
  • The system has to monitor not only transport, audio and video, but also closed captions/subtitles, SCTE triggers and other metadata services.
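The scaling effect of ABR in the list above can be sketched in a few lines. The ladder below is an invented example for illustration; real rendition ladders vary per service and per encoder configuration.

```python
# Hypothetical sketch: how ABR multiplies the monitoring workload.
# Each linear channel is encoded into several renditions (an "ABR ladder"),
# and each rendition is a separate stream the platform must analyze.

ABR_LADDER = [  # assumed example ladder: (label, bitrate in kbps)
    ("1080p", 6000), ("720p", 3500), ("540p", 2000),
    ("360p", 1000), ("240p", 500),
]

def monitoring_points(channel_count: int, ladder=ABR_LADDER) -> int:
    """Total individual streams to probe for a given channel lineup."""
    return channel_count * len(ladder)

print(monitoring_points(200))  # 200 channels * 5 renditions -> 1000 streams
```

This is why a 200-channel lineup can translate into a thousand or more individual streams at the probe.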

Monitoring Solutions Meet New Demands

Several OTT monitoring solutions have emerged in the market that offer two distinct modes of monitoring, varying in visualization ability and resource consumption. There’s a Full Monitoring Mode, in which the input source is fully decoded in real time for probe analysis, alarming for errors and live video display on the multiviewer mosaic output; and a Light Mode, in which the input source is probe analyzed and alarmed, but only partially decoded and therefore not displayed. One benefit of Light Mode is that it requires significantly fewer resources to operate.
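One way to picture the trade-off between these two modes is a simple resource budget. A minimal sketch follows; the mode names match the article, but the relative CPU costs and the helper are invented for illustration.

```python
from enum import Enum

class Mode(Enum):
    FULL = "full"    # fully decoded: probe + alarm + multiviewer display
    LIGHT = "light"  # partially decoded: probe + alarm only, no display

# Assumed relative CPU cost per stream, purely illustrative.
COST = {Mode.FULL: 1.0, Mode.LIGHT: 0.2}

def total_cost(assignments: dict) -> float:
    """Sum the CPU budget for a {channel: Mode} assignment."""
    return sum(COST[mode] for mode in assignments.values())

plan = {"ch1": Mode.FULL, "ch2": Mode.LIGHT, "ch3": Mode.LIGHT}
print(total_cost(plan))  # one full channel plus two light channels
```

Under these assumed costs, five light-mode channels fit in the budget of a single full-mode channel, which is the arithmetic behind the density gains described above.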

Providing a comprehensive set of monitoring, detection and alarming tools, these systems can now automatically analyze channels from ingest to delivery, probe channels for detailed information and alert operators to changing configurations as they happen.

Some specific monitoring requirements include real-time streaming validation, disaster recovery, support for the common streaming formats and flexible multiviewing to accommodate a myriad of display configurations. A suitable system should also combine traditional MPEG-TS support with the unique monitoring requirements of OTT (including the ability to decrypt and decode the content in a pre-determined “secure zone”).
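As one hedged example of a real-time streaming validation step, the sketch below parses an HLS master playlist and lists the renditions it advertises, so a probe can confirm every rendition is reachable. The sample manifest and the simplified attribute parsing are illustrative only (real `#EXT-X-STREAM-INF` lines can carry quoted, comma-containing attributes), not any vendor’s implementation.

```python
# Illustrative sketch: enumerate renditions from an HLS master playlist.
# SAMPLE_MASTER is an invented manifest for demonstration purposes.

SAMPLE_MASTER = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080
1080p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3500000,RESOLUTION=1280x720
720p.m3u8
"""

def parse_renditions(master: str):
    """Return (bandwidth, uri) pairs declared in a master playlist."""
    renditions, pending_bw = [], None
    for line in master.strip().splitlines():
        if line.startswith("#EXT-X-STREAM-INF:"):
            # Naive attribute split; enough for this unquoted sample.
            attrs = dict(kv.split("=", 1)
                         for kv in line.split(":", 1)[1].split(",")
                         if "=" in kv)
            pending_bw = int(attrs.get("BANDWIDTH", 0))
        elif pending_bw is not None and not line.startswith("#"):
            renditions.append((pending_bw, line))
            pending_bw = None
    return renditions

print(parse_renditions(SAMPLE_MASTER))
```

A probe built on this idea would fetch each listed URI, verify segments arrive on time and raise an alarm when a rendition goes missing.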

Other helpful features include high-density probing and multiviewing; the capacity to monitor thousands of audio and video signals; and APIs for easy integration with network management systems. Better still, if the system is 100 percent software running on off-the-shelf hardware, the user saves on capital investment, and platform upgrades can be performed remotely and in a timely fashion without disrupting operations. An IP-centric, software-only product is also a natural fit for the cloud.

Flexible Monitoring For A Fast-Changing World

More advanced systems now include Adaptive Monitoring that allows operators to monitor feeds in any of three operating modes—full, light, and extra-light monitoring. The software can automatically adapt between the monitoring modes to ensure optimal monitoring of all streams at all times (including full visualization of errors).

Whether performed on-premises or in the cloud, Adaptive Monitoring ensures that when a problem is detected, the channel is automatically switched to full monitoring mode. This allows the master control operator to stay on top of problems while monitoring an increasing number of channels. The dynamic nature of this model makes it an ideal solution for efficient high-density probing and monitoring of OTT channels because it gives operators greater agility in how they probe and monitor.

Adaptive Monitoring switches modes based on thresholds set by the operator within the system’s software, or on API commands from external devices monitoring the overall ecosystem. This brings operators significant efficiency gains that in turn yield cost savings and more extensive monitoring capabilities. It also allows operators to mix and match different monitoring modes. For example, instead of dedicating 100 percent of CPU power to full monitoring of a given channel, they can opt for light or extra-light monitoring and use a fraction of the resources. With this freedom to implement different monitoring modes within a single deployment, operators make the most of their available server resources.
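The threshold-driven escalation described above might be sketched as follows. The mode names follow the article, but the specific threshold values, the de-escalation rule and the class design are assumptions made for illustration.

```python
# Hypothetical sketch of Adaptive Monitoring: escalate a channel to full
# mode when its error count crosses an operator-set threshold, and relax
# it back once the stream has stayed clean for a while.

ERROR_THRESHOLD = 3      # operator-set; assumed value
CLEAN_STREAK_RESET = 10  # consecutive clean checks before de-escalating

class ChannelMonitor:
    def __init__(self, channel_id: str):
        self.channel_id = channel_id
        self.mode = "extra-light"
        self.errors = 0
        self.clean_streak = 0

    def report(self, error: bool) -> str:
        """Record one probe result; return the resulting monitoring mode."""
        if error:
            self.errors += 1
            self.clean_streak = 0
            if self.errors >= ERROR_THRESHOLD:
                self.mode = "full"  # surface the stream on the multiviewer
        else:
            self.clean_streak += 1
            if self.clean_streak >= CLEAN_STREAK_RESET:
                self.mode, self.errors = "extra-light", 0
        return self.mode
```

An external management system could drive the same transitions through an API call instead of the internal threshold, which is the mixed model the article describes.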

A Path Forward

The goal for any network operations team is to detect problems and fix them before they become apparent to their viewers. Software-based monitoring systems are relatively easy to deploy (using existing CDN solutions) and, coupled with the ability to provide bi-directional services, are helping to meet the increased demands caused by the rapid growth of the OTT market.

As IP-based delivery of multi-channel TV to end customers gains in popularity, OTT services like Disney+, Apple TV+ and Amazon Prime Video will continue to use monitoring systems at their respective NOCs to help keep subscribers happy. These systems not only enable efficient monitoring across all channels, but also provide operations with a path forward for ongoing growth—both in terms of technology infrastructure and revenue.

Part of a series supported by

Broadcast Bridge Survey
