To attract and retain an audience, OTT services must provide an excellent customer experience by delivering content at the highest possible quality. The service must be smooth, uninterrupted and in the resolution the customer expects. This is one of the key ways OTT will continue to attract more viewers and fulfil its potential as the mainstream method of media consumption.
CDNs – Content Delivery Networks – are at the heart of this drive for better quality. CDNs were born as the internet grew up, originally designed to manage the movement of files between connected servers. A CDN created a managed pathway for the most efficient delivery of content over the internet. In essence the CDN was the “high occupancy vehicle lane” of the internet – allowing traffic to travel a faster route because it was managed to allow it.
As video streaming emerged, the HTTP technology that was built for fast file exchange was adapted to accommodate streaming video. For VOD, this was fairly straightforward as it built on similar “file transfer” concepts. For Live video, the challenges have been more fundamental as HTTP was not originally conceived for frequently updated, low latency video delivery.
That said, latency and performance have reached very acceptable standards, largely matching the benchmarks set by satellite, cable and IPTV delivery methods. But that is only a small part of what the CDN is doing to assure an excellent OTT customer experience. This article looks at the CDN’s complete responsibility, and how CDNs are evolving.
The CDN performs seven core functions.
The first function is to manage ingress and egress of the video content, moving it from the Origin to the final ISP Network that transports it onwards to the IP-connected device. The CDN will contain at least one layer of Cache server – the Edge Cache – to perform this function. Often the CDN will contain multiple layers of Intermediate Cache servers that “buffer” content between the Origin and the Edge Caches.
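This tiered ingress/egress behaviour can be sketched as a chain of pull caches between the viewer and the Origin. This is an illustrative model only, not a production CDN design; all names here are hypothetical.

```python
# Illustrative sketch: a request hits the Edge Cache first; on a miss it is
# pulled from the Intermediate Cache, which in turn pulls from the Origin.
# Each tier keeps a copy on the way back so later requests stay local.

class Cache:
    def __init__(self, name, upstream):
        self.name = name
        self.upstream = upstream  # next tier: another Cache, or the Origin dict
        self.store = {}

    def get(self, key):
        if key in self.store:          # cache hit: serve locally
            return self.store[key]
        # cache miss: pull from the next tier and keep a copy
        if isinstance(self.upstream, Cache):
            value = self.upstream.get(key)
        else:
            value = self.upstream[key]
        self.store[key] = value
        return value

origin = {"segment_001.ts": b"...video bytes..."}
intermediate = Cache("intermediate", origin)
edge = Cache("edge", intermediate)

edge.get("segment_001.ts")   # first request travels all the way to the Origin
edge.get("segment_001.ts")   # second request is served from the Edge Cache
```

The second request never leaves the Edge Cache, which is the point of the tiering: the Origin is shielded from the vast majority of viewer requests.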
Related to this, the second function is to initiate and manage streaming sessions. Whether live or VOD, each unicast stream must be monitored and managed. Every stream can be characterized by dozens of stream parameters such as IP address, device type and bitrate.
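As a hedged illustration of what "dozens of stream parameters" might look like, the per-session record below shows a handful of representative fields. The field names are hypothetical, not taken from any particular CDN's schema.

```python
from dataclasses import dataclass

# Hypothetical sketch of the per-session record a CDN might keep.
# A real session record runs to dozens of parameters; a few appear here.

@dataclass
class StreamSession:
    session_id: str
    client_ip: str
    device_type: str      # e.g. "smart_tv", "mobile", "browser"
    bitrate_kbps: int     # current delivered bitrate
    is_live: bool         # live event vs VOD asset

session = StreamSession("abc123", "203.0.113.7", "smart_tv", 5000, True)
```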
The third function is to store content. Storage varies from long-term storage of a VOD asset, like a film, to shorter-term storage of a catch-up TV asset, like yesterday’s soap opera episode. Storage also includes the short-term storage of live content that can be rewound, which is often held in memory rather than in storage. This multi-faceted storage function is why Cache storage is high-performance and (relatively) small capacity.
Related to storage, the fourth function is content management. Managing what stays in storage and what is deleted is a continuous process. Generally, Caches use a Least Recently Used (LRU) model, in which the content that has gone longest without being requested is deleted first. As a pull system, if any item on the list is re-requested it is delivered and immediately moved to the top of the list, protecting it from deletion. Managing content to be both storage-efficient and bandwidth-efficient (i.e., minimizing requests to the Origin) is a continuous balancing act.
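The eviction behaviour described above, where the least recently requested asset is deleted first and a re-request refreshes an asset's position, can be sketched in a few lines. This is a minimal teaching sketch, not how any specific CDN implements its cache.

```python
from collections import OrderedDict

# Minimal LRU eviction sketch: the asset that has gone longest without a
# request is evicted first, and a re-request moves an asset back to the
# "fresh" end of the list.

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None                      # miss: would trigger an Origin pull
        self.items.move_to_end(key)          # re-request refreshes the entry
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)   # evict the least recently used asset

cache = LRUCache(capacity=2)
cache.put("film_a", "...")
cache.put("soap_ep_1", "...")
cache.get("film_a")              # film_a is now the most recently used
cache.put("soap_ep_2", "...")    # evicts soap_ep_1, not film_a
```

Note how the re-request of `film_a` saves it from eviction even though it was stored first; that is the difference between LRU and a pure first-in-first-out queue.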
Load balancing is the fifth function. Normally this is performed across all the servers responsible for delivering the specific content. If, for example, ten Edge Caches are collectively delivering a live event to 100,000 people at an average bitrate of 5 Mbps, the 500 Gbps of Cache egress should be balanced across the servers. If each Edge Cache can stream 100 Gbps, then ideally each would be 50% loaded. In practice this rarely happens perfectly, due to the location of stream requests and the network topology, but the Caches continuously work to balance load as evenly as possible. At a more granular level, a Cache server can also load-balance internally, typically across multiple CPUs. The Cache software should routinely balance load across CPUs and servers for optimal performance.
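The load-balancing arithmetic above can be made explicit. The figures below are the article's worked example, not measurements from a real deployment.

```python
# Worked version of the load-balancing example: 100,000 viewers at 5 Mbps
# spread across ten Edge Caches, each capable of streaming 100 Gbps.

viewers = 100_000
bitrate_mbps = 5                 # average bitrate per stream
edge_caches = 10
cache_capacity_gbps = 100        # streaming capacity per Edge Cache

total_egress_gbps = viewers * bitrate_mbps / 1_000   # Mbps -> Gbps
per_cache_gbps = total_egress_gbps / edge_caches     # ideal even split
load_pct = 100 * per_cache_gbps / cache_capacity_gbps

print(f"{total_egress_gbps:.0f} Gbps total, {per_cache_gbps:.0f} Gbps "
      f"per cache ({load_pct:.0f}% loaded)")
```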
The sixth function is platform monitoring. All hardware, software and network should be monitored to detect outages and degradation and then redirect stream requests appropriately. CDNs can be configured for servers to either seamlessly failover the streams they are carrying (by having matching IP addresses) or with a stream restart (by having unique IP addresses per server).
And finally, the seventh function is scale-out management. Expanding streaming and storage capacity involves the integration of new capacity to the CDN, typically while the CDN is in-service. Bringing new Caches online by adding them to the cluster or the overall CDN needs to be a seamless process to the overall OTT service that is being delivered.
There are two basic CDN models – public and private. A public CDN is generally multi-tenanted, where server and network capacity are shared. A private CDN is generally dedicated to one OTT operator, where server capacity and network port capacity are not shared.
Over the last ten years OTT Operators have largely used public CDNs. In this model, many of the largest Operators have adopted a multi-CDN approach, balancing their total traffic over multiple CDN service providers. This provides network resilience, pricing competition and often reach into specific markets where one CDN may have better presence than another.
The largest OTT Operators, most of which are VOD-centric, have already adopted the private CDN model. Today, the next wave of large OTT Operators, offering a combination of Live and VOD content, are embracing the private CDN model to improve their service latency and control.
But private CDNs are selected not only for improved performance and service control, but also to manage costs more effectively. Public CDNs are generally priced on a “Per GB of Output” basis, which is variable and increases in line with audience size, video bitrate and consumption time. Private CDNs, on the other hand, are generally priced on a “Per Gbps of Throughput” basis, which provides cost certainty. To illustrate, if an OTT Operator delivers to 10,000 people at a 5 Mbps average bitrate, they will egress 50 Gbps from the CDN. If they do this for 1 hour, they will stream 22,500 GB; if they do this for 2 hours, they will stream 45,000 GB. The output doubles, while the throughput is constant. When an OTT Operator has regular or increasing consumption, the private CDN can offer significant cost benefits versus the public CDN.
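The output-versus-throughput arithmetic is worth working through, since it drives the pricing difference. The calculation below uses the article's example figures; any per-GB or per-Gbps prices an operator would plug in are commercial variables, not shown here.

```python
# Sketch of the output-vs-throughput arithmetic: throughput (Gbps) is set
# by concurrent audience and bitrate; output (GB) grows with viewing time.

viewers = 10_000
bitrate_mbps = 5
throughput_gbps = viewers * bitrate_mbps / 1_000     # constant while viewers watch

def gb_delivered(hours):
    # Gbps * seconds -> gigabits over the session, then /8 for gigabytes
    return throughput_gbps * 3600 * hours / 8

print(gb_delivered(1))   # GB after one hour
print(gb_delivered(2))   # doubles with time, while throughput stays constant
```

A per-GB price scales with the first number; a per-Gbps price scales only with the constant `throughput_gbps`, which is why sustained consumption favours the private model.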
One of the challenges for OTT Operators is that CDNs typically only offer service availability commitments, when OTT Operators want QoS (Quality of Service) and QoE (Quality of Experience) commitments. Given that CDNs are normally multi-tenant environments, form only one part of the OTT ecosystem, and are under enormous pressure to perform in the video use case, it is natural that CDN service providers take this position. So how do CDNs look at performance?
First, capacity. Storage capacity is relatively simple and is typically measured in Terabytes (TB) of usable storage deployed across the relevant Cache servers. Streaming capacity is measured in Gbps (Gigabits per second); the largest OTT Operators regularly stream in Tbps to their peak audiences. This streaming capacity is referred to as “throughput”. It is also normal today for OTT Operators to talk about Petabytes of content delivered in a day, week, month or year. This indicates audience size and consumption levels, which are clearly what the OTT Operator cares about. But the CDN looks at its own performance in terms of throughput, because this is the measure of streaming capacity.
Second, quality. QoS typically refers to metrics like average bitrate and cache failover, while QoE includes metrics like rebuffering ratio, start-up time and viewing time at peak bitrate. As a rule, OTT Operators focus on QoE metrics because they reflect the viewer’s actual experience. The OTT Operator deploys software in every client to capture QoE metrics and uses this information to switch traffic between CDNs (if a multi-CDN is in place). Typically, the CDN is focused on QoS metrics, but leading CDNs are moving to QoE to meet their customers’ expectations (see the next part of this series). This is changing the balance from reactive client-side performance management to proactive CDN-side performance management.
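One of the QoE metrics mentioned above, rebuffering ratio, has a simple definition worth stating: the share of the total session time the viewer spent stalled rather than watching. The function and figures below are an illustrative sketch, not a standardised formula from any particular analytics vendor.

```python
# Rebuffering ratio sketch: stall time as a fraction of total session time.

def rebuffering_ratio(play_seconds, stall_seconds):
    total = play_seconds + stall_seconds
    return stall_seconds / total if total else 0.0

# A 60-minute session that included 18 seconds of buffering:
ratio = rebuffering_ratio(play_seconds=3600 - 18, stall_seconds=18)
print(f"{100 * ratio:.1f}%")   # 0.5%
```

Client-side players report exactly this kind of measurement back to the OTT Operator, which is what makes CDN switching decisions in a multi-CDN setup possible.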
In the next part of this series, we look at how Live and VOD workloads apply different pressure to CDNs, how Edge Computing works in OTT video, and the emergence of the Intelligent CDN.