Advanced File-based QC and Monitoring Key to Monetizing OTT Content, Superior QoE on Every Screen

OTT video consumption is growing rapidly. Recent research from Ericsson found that average time spent watching OTT content increased from 3.6 hours per week in 2014 to 12.1 hours per week in 2017. Broadcasters have an opportunity with OTT to boost their revenue, as the global OTT market is expected to grow to $158.4 billion by 2025. However, OTT workflows are different and can be more complex than traditional broadcast, and ensuring a high quality of experience is critical. Today’s viewers expect video and audio to be flawless on every screen, including TVs, PCs, smartphones, and tablets. This article will examine the complexities of the OTT world and explain why a strong file-based QC and monitoring solution is essential for delivering OTT content. In addition, it will discuss the key capabilities to look for in a QC and monitoring solution to ensure broadcasters make a smooth transition to the OTT environment.

Anupama Anantharaman is VP of product marketing at Interra Systems.

Complexities of OTT

OTT delivery is quite different from traditional broadcast. In order to cater to a wide range of devices with multiple screen sizes, OTT content needs to be transcoded into multiple delivery formats, including HLS, DASH, MSS, and HDS. Moreover, OTT content must be encrypted with various DRM protection schemes, such as Microsoft PlayReady, Google Widevine, and Apple FairPlay. With so many variations to maintain, broadcasters are dealing with a massive amount of content. (See Figure 1.)

Figure 1. A typical OTT workflow.
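As a simple, vendor-neutral illustration of one such check, a QC step might parse an HLS master playlist and confirm that every rung of the expected ABR bitrate ladder has actually been published. The playlist content and bitrates below are hypothetical examples, not tied to any particular tool:

```python
import re

# A hypothetical HLS master playlist as produced during content preparation.
MASTER_PLAYLIST = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
"""

def list_variants(master: str):
    """Return (bandwidth, resolution) for each variant in a master playlist."""
    pattern = re.compile(r"#EXT-X-STREAM-INF:BANDWIDTH=(\d+),RESOLUTION=(\S+)")
    return [(int(bw), res) for bw, res in pattern.findall(master)]

def check_ladder(master: str, expected_bandwidths):
    """Basic QC check: report any rung of the expected ABR ladder
    that is missing from the published master playlist."""
    found = {bw for bw, _ in list_variants(master)}
    return sorted(set(expected_bandwidths) - found)

missing = check_ladder(MASTER_PLAYLIST, [800_000, 2_500_000, 5_000_000])
print("missing rungs:", missing)  # missing rungs: []
```

A production QC tool would go much further (codec parameters, DRM signaling, segment integrity), but even this kind of lightweight manifest check catches a whole class of publishing errors early.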

Having a large volume of content increases the chance for errors. Some of the common issues that can impact streaming quality of experience include poor video quality caused by over-compression during content preparation, profile alignment issues causing glitches in playback when the player switches bitrates, encryption-related issues, and server-related problems such as HTTP failures caused by client or server errors. Issues can also occur during the delivery phase, causing long join times or frequent stalling and switching during playback.
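To make the server-related class of issues concrete, a monitoring system typically buckets the HTTP statuses seen in delivery logs into client-side and server-side failures before alerting. The sketch below uses invented log data to show the idea:

```python
from collections import Counter

def classify_status(code: int) -> str:
    """Bucket an HTTP status code from a delivery log into a QoS category."""
    if 200 <= code < 300:
        return "ok"
    if 400 <= code < 500:
        return "client_error"   # e.g. 404 on a missing segment
    if 500 <= code < 600:
        return "server_error"   # e.g. 503 from an overloaded origin
    return "other"              # redirects, informational, etc.

# Hypothetical status codes pulled from CDN delivery logs.
log_codes = [200, 200, 404, 200, 503, 200, 404]
summary = Counter(classify_status(c) for c in log_codes)
print(dict(summary))  # {'ok': 4, 'client_error': 2, 'server_error': 1}
```

Separating 4xx from 5xx failures matters operationally: a spike in 404s usually points at a packaging or publishing fault, while a spike in 5xx points at origin or CDN health.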

Since OTT content is delivered over the internet, which is unmanaged, the quality and bandwidth of delivery change with network congestion. Broadcasters have minimal or no control over last-mile delivery for OTT content, so quality is never guaranteed. In addition, content protection becomes more important since data flows over a public network. Finally, with multiple stakeholders involved in the delivery chain (CDNs, ISPs, etc.) as well as evolving technology, it can be hard to identify and resolve QoS issues.

Establishing an Effective File-based QC and Monitoring Workflow for Monetization

With the increase in OTT consumption, VOD services are on the rise, and viewers now stream VOD content more than live programming.

Viewers expect the best quality of experience (QoE), especially for paid subscriptions: at least the same quality as traditional broadcast, if not better. With cutthroat competition and increasingly sophisticated viewing habits, OTT providers that ignore QoE will find it difficult to retain subscribers. Monitoring QoE keeps viewers happy while also providing meaningful insights about what is being watched, peak viewing times, average delivered bitrates, and so on, which helps broadcasters discover new monetization opportunities and optimize video delivery.

Thus, it has become imperative to check that all file-based content is prepared and delivered accurately. Reliable, efficient file-based QC is as critical as monitoring in supporting both VOD and live streams.

OTT delivery is a continuation of file-based content preparation workflows, and the good news is that the latest OTT QC and monitoring solutions are well suited to address both content preparation and distribution workflows. Deploying an OTT monitoring solution that works in tandem with a file-based QC tool allows broadcasters to quickly and correctly address any issues, all the way from ingest to delivery.

Service providers need to check and compare content quality at ingest and post-transcode to ensure that only the best quality flows downstream. Similarly, verifying the content for ABR compliance during content preparation ensures that ABR-ready content is published on the servers.
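One concrete part of an ABR-compliance check is profile alignment: every variant must share the same segment boundaries so a player can switch bitrates without glitches. The sketch below uses invented per-variant segment durations (such as would be parsed from each media playlist's #EXTINF values) to show the check:

```python
# Hypothetical per-variant segment durations in seconds, as might be
# parsed from each variant's media playlist during post-transcode QC.
variants = {
    "low":  [6.0, 6.0, 6.0, 4.0],
    "mid":  [6.0, 6.0, 6.0, 4.0],
    "high": [6.0, 6.0, 6.0, 4.0],
}

def profiles_aligned(variant_durations, tolerance=0.001):
    """Return True if all variants share the same segment boundaries,
    which is required for glitch-free bitrate switching."""
    reference = next(iter(variant_durations.values()))
    for durations in variant_durations.values():
        if len(durations) != len(reference):
            return False
        if any(abs(a - b) > tolerance for a, b in zip(durations, reference)):
            return False
    return True

print(profiles_aligned(variants))  # True
```

A real compliance tool would also verify that IDR frames line up with segment boundaries and that audio and video tracks stay in sync across the ladder, but misaligned segment durations alone are a common cause of playback glitches at switch points.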

Service providers also need to perform monitoring at multiple points during content preparation, as well as during content delivery, by using active and passive monitoring techniques in a complementary manner.

Active monitoring refers to monitoring the user experience by emulating synthetic clients. It is a proactive approach that helps detect and resolve faults early, even before users are impacted. Passive monitoring refers to analyzing every request originating from real clients for a statistical measurement of the user experience. Being a reactive approach, it gives comprehensive information about streams only while viewers are watching and only for the assets being watched, and it requires physical access to the servers and end devices being monitored. Effective QC during content preparation, combined with monitoring at relevant points in the workflow, can help control and resolve errors that impact content quality.
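The active-monitoring idea can be sketched as a synthetic client that downloads a segment and compares sustained throughput against the variant's declared bitrate; falling below it is a stall risk for real viewers. The probe below is a minimal, hypothetical sketch: the `fetch` callable stands in for a real HTTP client so the probe can be pointed at a CDN edge or, as here, a stub:

```python
import time

def probe_segment(fetch, url, declared_bps):
    """Active-monitoring probe: download one segment like a synthetic
    client and flag it if sustained throughput falls below the
    declared bitrate. `fetch` is any callable returning the segment
    bytes (an HTTP GET in production, a stub in this sketch)."""
    start = time.monotonic()
    data = fetch(url)
    elapsed = max(time.monotonic() - start, 1e-6)  # avoid divide-by-zero
    throughput_bps = len(data) * 8 / elapsed
    return {
        "url": url,
        "bytes": len(data),
        "throughput_bps": throughput_bps,
        "stall_risk": throughput_bps < declared_bps,
    }

# Stub fetch standing in for a real HTTP GET against a CDN edge.
result = probe_segment(lambda url: b"\x00" * 500_000, "seg_001.ts", 2_500_000)
print(result["stall_risk"])
```

Running such probes on a schedule from several network locations gives early warning of delivery problems even when no viewer happens to be watching the affected asset, which is exactly the gap passive monitoring leaves open.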

More QoS-focused checks can be performed at different points in a CDN to ensure that no new issues are introduced during content replication. Monitoring different locations within a CDN can also help identify issues that are specific to certain geographies. In addition, monitoring playback experience on end devices can help broadcasters assess the real quality of experience, analyze results, and improve video delivery. (See Figure 2.)

Figure 2. Active/passive QC and monitoring points in the OTT workflow.

The Value of Putting QC and Monitoring on the Cloud

Launching OTT services requires an investment in content and infrastructure. As broadcasters look to save costs, adopting a cloud-based OTT workflow has emerged as an effective way to minimize expenses. By relying on a software-based QC and monitoring solution running in the public cloud, broadcasters can scale their services based on evolving needs. Moreover, cloud services are based on an opex business model, whereby broadcasters only pay for what they use. Services can be launched more quickly, and additional resources are not required to maintain and manage IT infrastructure; the cloud provider handles day-to-day infrastructure-related tasks.

By moving from hardware infrastructure to software-based solutions, including QC and monitoring, broadcasters can dynamically adjust the scope of their technology platforms to match the size of their OTT offering, without worrying about how many licenses are required.

Conclusion

To be able to effectively monetize OTT services, broadcasters need a file-based QC and monitoring approach that offers active and passive techniques. While many QC and monitoring solutions provide alarms related to quality issues, having end-to-end visibility into the OTT workflow (from ingest to delivery) is critical for isolating faults. By collecting data from multiple points, stitching it all together and building a complete end-to-end picture with meaningful insights, file-based QC and monitoring solutions can help broadcasters be proactive in addressing OTT quality issues. Choosing a software-based, cloud-ready solution will enable them to focus on building content and devoting resources toward tasks that will drive toward new revenue. 
