The World Of OTT (Infrastructure Pt4) - Evolving CDNs To Improve OTT

To attract and retain an audience, OTT services must provide an excellent customer experience by delivering content at the highest possible quality. Service must be smooth, uninterrupted and in the resolution the customer expects. This is one of the key ways OTT will continue to attract more viewers and fulfil its potential of becoming the mainstream method of media consumption.

CDNs – Content Delivery Networks – are at the heart of this drive for better quality. CDNs were born as the internet grew up, originally designed to manage the movement of files between connected servers. A CDN created a managed pathway for the most efficient delivery of content over the internet. In essence, the CDN was the “high occupancy vehicle lane” of the internet – allowing traffic to travel a faster route because that route was actively managed.

As video streaming emerged, the HTTP technology that was built for fast file exchange was adapted to accommodate streaming video. For VOD, this was fairly straightforward as it built on similar “file transfer” concepts. For Live video, the challenges have been more fundamental as HTTP was not originally conceived for frequently updated, low latency video delivery.

That said, latency and performance have reached very acceptable standards, largely matching the benchmarks set by satellite, cable and IPTV delivery methods. But that is only a small part of what the CDN is doing to assure an excellent OTT customer experience. This article looks at the CDN’s complete set of responsibilities and how CDNs are evolving.

CDN Functions

The CDN performs seven core functions.

The first function is to manage ingress and egress of the video content, moving it from the Origin to the final ISP Network that transports it onwards to the IP-connected device. The CDN will contain at least one layer of Cache server – the Edge Cache – to perform this function. Often the CDN will contain multiple layers of Intermediate Cache servers that “buffer” content between the Origin and the Edge Caches.
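
As a minimal sketch of that request path, the Python below models a two-tier lookup in which an Edge Cache falls back to an Intermediate Cache and, only on a miss there, to the Origin. The class and method names are hypothetical, not taken from any real CDN software.

```python
# Illustrative request path through CDN cache layers: Edge -> Intermediate -> Origin.
# All names are hypothetical; real CDN software is far more elaborate.

class Origin:
    def get(self, key: str) -> bytes:
        # Stand-in for the authoritative copy of the content.
        return f"content-for-{key}".encode()

class CacheTier:
    def __init__(self, name: str, upstream):
        self.name = name
        self.upstream = upstream           # next tier towards the Origin
        self.store: dict[str, bytes] = {}

    def get(self, key: str) -> bytes:
        if key in self.store:              # cache hit: serve locally
            return self.store[key]
        data = self.upstream.get(key)      # cache miss: pull from upstream
        self.store[key] = data             # fill the cache on the way back down
        return data

origin = Origin()
intermediate = CacheTier("intermediate", origin)
edge = CacheTier("edge", intermediate)

edge.get("movie/segment_001.ts")           # first request is filled from the Origin
edge.get("movie/segment_001.ts")           # repeat request is served from the Edge
```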

Figure 1: A CDN contains layers of Caching servers, performing 7 core functions.

Related to this, the second function is to initiate and manage streaming sessions. Whether live or VOD, each unicast stream must be monitored and managed. Every stream can be characterized by dozens of stream parameters such as IP address, device type and bitrate.
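
To illustrate what “characterized by dozens of stream parameters” might look like in practice, here is a hypothetical session record in Python; the fields shown are assumptions for the sketch, not a standard schema.

```python
# Hypothetical record for one unicast streaming session.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StreamingSession:
    session_id: str
    client_ip: str
    device_type: str          # e.g. "smart-tv", "mobile", "browser"
    content_id: str
    is_live: bool
    bitrate_kbps: int         # currently delivered bitrate
    started_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

session = StreamingSession(
    session_id="a1b2c3",
    client_ip="203.0.113.42",
    device_type="smart-tv",
    content_id="movie/1234",
    is_live=False,
    bitrate_kbps=5000,
)
```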

The third function is to store content. Storage requirements vary from long-term storage of a VOD asset, like a film, to shorter-term storage of a catch-up TV asset, like yesterday’s soap opera episode. They also include very short-term retention of live content that can be rewound, which is often held in memory rather than on disk. This multi-faceted storage function is why Cache storage is high-performance and of (relatively) small capacity.

Related to storage, the fourth function is content management. Managing what stays in storage and what is deleted is a continuous process. Generally, Caches use a simple First In First Out model, deleting the oldest content first. However, because a CDN is a pull system, any item that is re-requested is delivered and then immediately moved back to the top of the list. Managing content to be both storage-efficient and bandwidth-efficient (i.e., minimizing requests to the Origin) is a continuous process.
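
The behaviour described above, where a re-requested item moves back to the top of the list, is effectively Least Recently Used (LRU) eviction. A minimal sketch in Python, assuming eviction by item count rather than the byte budgets real Caches work to:

```python
# Minimal LRU eviction sketch using an ordered dict.
# Real caches evict by bytes, popularity and TTL, not a simple item count.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key: str):
        if key not in self.items:
            return None                    # miss: caller fetches upstream
        self.items.move_to_end(key)        # re-request moves item to the "top"
        return self.items[key]

    def put(self, key: str, value: bytes):
        self.items[key] = value
        self.items.move_to_end(key)
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the least recently used item
```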

Load balancing is the fifth function. Normally this is performed across all the servers responsible for delivering the specific content. If, for example, ten Edge Caches are collectively delivering a live event to 100,000 people at an average bitrate of 5 Mbps, the 500 Gbps of Cache egress should be balanced across those servers. If each Edge Cache can stream 100 Gbps, then ideally they would each be 50% loaded. Naturally this rarely happens perfectly due to stream request location and network topology, but the Caches are continuously working to balance as evenly as possible. At a more granular level, a Cache server can also load-balance internally, typically across multiple CPUs. The Cache software should routinely balance load across CPUs and servers for optimal performance.
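
A back-of-the-envelope check of that example, using the illustrative figures above:

```python
# Illustrative load-balancing arithmetic from the example above.
viewers = 100_000
avg_bitrate_mbps = 5
edge_caches = 10
per_cache_capacity_gbps = 100

total_egress_gbps = viewers * avg_bitrate_mbps / 1_000        # 500 Gbps
ideal_load_per_cache = total_egress_gbps / edge_caches         # 50 Gbps each
utilisation = ideal_load_per_cache / per_cache_capacity_gbps   # 0.5 -> 50% loaded

print(f"{total_egress_gbps:.0f} Gbps total, "
      f"{ideal_load_per_cache:.0f} Gbps per cache, "
      f"{utilisation:.0%} utilisation")
```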

The sixth function is platform monitoring. All hardware, software and network components should be monitored to detect outages and degradation so that stream requests can be redirected appropriately. CDNs can be configured so that servers either seamlessly fail over the streams they are carrying (by sharing matching IP addresses) or force a stream restart (by using unique IP addresses per server).
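
As a highly simplified sketch of the monitoring idea, the snippet below polls each Cache and routes new stream requests only to servers that report healthy; the /health endpoint and timeout are assumptions for illustration, not part of any specific CDN product.

```python
# Simplified health-check loop; real CDNs use dedicated monitoring stacks,
# and failover behaviour depends on whether servers share IP addresses.
import urllib.request

def is_healthy(cache_url: str, timeout_s: float = 2.0) -> bool:
    try:
        # Hypothetical health endpoint exposed by each Cache server.
        with urllib.request.urlopen(f"{cache_url}/health", timeout=timeout_s) as resp:
            return resp.status == 200
    except OSError:
        return False

def routable_caches(cache_urls: list[str]) -> list[str]:
    # New stream requests are only directed to Caches that pass the check.
    return [url for url in cache_urls if is_healthy(url)]
```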

And finally, the seventh function is scale-out management. Expanding streaming and storage capacity involves integrating new capacity into the CDN, typically while it is in service. Bringing new Caches online, whether added to an existing cluster or to the CDN as a whole, needs to be seamless for the OTT service being delivered.
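
One technique commonly used to make this kind of scale-out seamless (an illustrative choice here, not something the article prescribes) is consistent hashing, so that adding a Cache to a cluster remaps only a small fraction of content keys:

```python
# Consistent-hash sketch: adding a cache node moves only a fraction of keys.
# Shown as one possible approach; not a description of any specific CDN.
import bisect
import hashlib

class HashRing:
    def __init__(self, nodes=(), vnodes: int = 100):
        self.vnodes = vnodes
        self.ring: list[tuple[int, str]] = []
        for node in nodes:
            self.add(node)

    def _hash(self, value: str) -> int:
        return int(hashlib.md5(value.encode()).hexdigest(), 16)

    def add(self, node: str):
        # Each node gets many virtual points on the ring for smoother balance.
        for i in range(self.vnodes):
            bisect.insort(self.ring, (self._hash(f"{node}#{i}"), node))

    def node_for(self, key: str) -> str:
        idx = bisect.bisect(self.ring, (self._hash(key), "")) % len(self.ring)
        return self.ring[idx][1]

ring = HashRing(["edge-1", "edge-2", "edge-3"])
before = ring.node_for("movie/segment_001.ts")
ring.add("edge-4")                              # bring a new Cache online in-service
after = ring.node_for("movie/segment_001.ts")   # most keys keep their original node
```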

CDN Architectures

There are two basic CDN models – public and private. A public CDN is generally multi-tenanted, where server and network capacity are shared. A private CDN is generally dedicated to one OTT operator, where server capacity and network port capacity are not shared.

Over the last ten years OTT Operators have largely used public CDNs. In this mode, many of the largest Operators have adopted the multi-CDN model where they balance their total traffic over multiple CDN service providers. This provides network resilience, pricing competition and often reach to specific markets where one CDN may have better presence than another.

The largest OTT Operators, most of which are VOD-centric, have already adopted the private CDN model. Today, the next wave of large OTT Operators, offering a combination of Live and VOD content, are embracing the private CDN model to improve their service latency and control.

But private CDNs are selected not only for improved performance and service control, but also to manage costs more effectively. Public CDNs are generally priced on a “Per GB of Output” basis, which is variable and increases in line with audience size, video bitrate and consumption time. Private CDNs, on the other hand, are generally priced on a “Per Gbps of Throughput” basis, which provides cost certainty. To illustrate, if an OTT Operator delivers to 10,000 people at a 5 Mbps average bitrate, they will egress 50 Gbps from the CDN. If they do this for 1 hour, they will stream about 22,500 GB. If they do this for 2 hours, they will stream about 45,000 GB. The output doubles, while the throughput stays constant. When an OTT Operator has regular or increasing consumption, the private CDN can offer significant cost benefits versus the public CDN.
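
The arithmetic behind that comparison, using the figures above (per-GB and per-Gbps prices vary by provider and contract, so none are assumed here):

```python
# Output (GB delivered) grows with viewing time; throughput (Gbps) does not.
viewers = 10_000
avg_bitrate_mbps = 5

throughput_gbps = viewers * avg_bitrate_mbps / 1_000   # 50 Gbps, constant

def gigabytes_delivered(hours: float) -> float:
    gigabits = throughput_gbps * hours * 3_600          # Gbps x seconds
    return gigabits / 8                                  # bits -> bytes

print(gigabytes_delivered(1))   # 22,500 GB after one hour
print(gigabytes_delivered(2))   # 45,000 GB after two hours; throughput unchanged
```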

CDN Performance

One of the challenges for OTT Operators is that CDNs typically only offer service-availability commitments, when OTT Operators want QoS (Quality of Service) and QoE (Quality of Experience) commitments. Given that CDNs are normally multi-tenant environments, form only part of the OTT ecosystem, and are under enormous pressure to perform in the video use case, it is natural that CDN service providers take this position. So how do CDNs look at performance?

First, capacity. Storage capacity is relatively simple and is typically measured in Terabytes (TB) of usable storage deployed across the relevant Cache servers. Streaming capacity is measured in Gbps (Gigabits per second); the largest OTT Operators regularly stream in Tbps to their peak audiences. We should refer to this capacity as “throughput”. It is also normal today for OTT Operators to talk about Petabytes of content delivered in a day, week, month or year. That figure indicates audience size and consumption levels, which are clearly what the OTT Operator cares about. But the CDN looks at its performance in terms of throughput, because this is the measure of streaming capacity.
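
To relate the two measures, sustained throughput over time converts directly into volume delivered. For example, a hypothetical 1 Tbps sustained for a full day:

```python
# Converting sustained throughput (capacity) into volume delivered (consumption).
tbps = 1.0                                   # sustained throughput
seconds_per_day = 24 * 3_600

terabits = tbps * seconds_per_day            # 86,400 Tb in a day
petabytes = terabits / 8 / 1_000             # bits -> bytes, then TB -> PB
print(f"{petabytes:.1f} PB delivered per day at {tbps} Tbps")  # ~10.8 PB
```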

Second, quality. QoS typically refers to metrics like average bitrate and cache failover, while QoE includes metrics like rebuffering ratio, start-up time and viewing time at peak bitrate. As a rule, OTT Operators focus on QoE metrics because they reflect the viewer’s actual experience. The OTT Operator deploys software in every client to capture QoE metrics and uses this information to switch traffic between CDNs (if a multi-CDN setup is in place). Typically, the CDN is focused on QoS metrics, but leading CDNs are moving to QoE to meet their customers’ expectations (see the next part of this series). This is shifting the balance from reactive client-side performance management to proactive CDN-side performance management.
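
As a concrete illustration of one QoE metric, a rebuffering ratio can be computed client-side from playback timings; the inputs below are assumptions for the sketch, not a standardized measurement.

```python
# Hypothetical client-side QoE calculation: rebuffering ratio =
# time spent stalled / total session time (playing + stalled).
def rebuffering_ratio(playing_seconds: float, stalled_seconds: float) -> float:
    total = playing_seconds + stalled_seconds
    return stalled_seconds / total if total > 0 else 0.0

# A 40-minute session with 12 seconds of stalling:
print(f"{rebuffering_ratio(2_400, 12):.2%}")   # ~0.50%
```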

In the next part of this series, we look at how Live and VOD workloads apply different pressure to CDNs, how Edge Computing works in OTT video, and the emergence of the Intelligent CDN.
