How Open Caching Aims To Support Broadcast Grade Streaming

Open Caching, a specification created by the Streaming Video Technology Alliance (SVTA), promises Content Providers a standardized CDN (Content Delivery Network) model that delivers a better end-customer QoE (Quality of Experience), and a possible way for ISPs (Internet Service Providers) to gain new operational efficiencies and earn new revenues from the delivery of OTT services.

These promises are some of the most important issues to resolve for D2C streaming video services to reach full “broadcast-scale”, according to some industry observers and participants. So how does Open Caching help to resolve them?

The grand vision for Open Caching is of an open and global CDN marketplace with CDN capacity provided directly inside ISP networks. Put another way, it is a way for ISPs and their CDN partners to offer Content Providers the same benefits of Edge Cache proximity to consumers that Netflix, DAZN and other major streamers have already achieved for their own D2C streaming services through their own private CDNs.

The hype around Open Caching has grown as demand increases for broadcast-grade and broadcast-scale streaming. So, what is all the hype about? This article inspects the latest information about Open Caching and, with inputs from various industry leaders, looks at how it is expected to become a more important part of the Content Provider CDN ecosystem in the future.

The Drivers Of Open Caching

The Open Caching initiative is founded on achieving three key objectives for video streaming:

1. Multi-CDN And Interoperability.  Content Providers have audiences that transcend individual CDNs and ISPs. Therefore, Content Providers naturally consider how to achieve the required combination of audience reach, service price, and service performance from their CDN suppliers. To achieve this, they have progressively migrated to a multi-CDN model. While multi-CDN brings certain benefits, like targeted reach for specific locations, price-competition between providers, and opportunities to improve service resiliency, it also brings extra complexity to a Content Provider because it involves managing multiple suppliers across a non-standardized set of interfaces. As streaming grows, the importance of standardization between CDNs also grows. Standardization can simplify operations for the Content Provider, in terms of supplier management and daily operational activities such as managing content across multiple caching environments.

2. Closeness To Consumers.  Streaming video generally benefits from being delivered closer to the consumer. A CDN platform deployed inside an ISP network creates the closest possible content delivery point to a consumer – imagine a location in a major city in any country. For people in or near that city this is typically the best place for a CDN Edge to serve a stream. In the future, as streaming quantities and peak throughputs grow, this location may need to be even more distributed – such as in smaller cities, where telephone exchanges host important broadband access network systems. This “deeper point” inside the ISP network is generally not where any CDN infrastructure is deployed today – we simply have not needed to do it yet, but things are set to change as D2C services grow.

For Content Providers, this closer-to-the-consumer positioning typically results in best possible quality and latency (i.e., best Quality of Experience or QoE), which are critical factors for the success of their services, particularly given the growing amount of live and linear content that is being streamed. For ISPs, a more deeply distributed Edge layer yields important benefits – it decongests their core network by shifting often many Tbps of throughput to deeper network locations, which means they can deliver higher quantities of streaming video while avoiding the expense of dramatically expanding their core network to handle it. This, in turn, leaves the core network capacity more available for all the other digital services that are growing at the same time as video streaming, such as IoT (Internet of Things), video conferencing, and security services.

It is important to note that “closeness” does not necessarily equal “better throughput”. Better throughput is really a function of “available capacity”. If capacity were unlimited, a stream could traverse an IP network to reach a consumer at its target bitrate without any problems. Capacity bottlenecks are what cause streaming performance issues. With that said, being closer to the consumer avoids the most important capacity bottleneck: the ingress point into the ISP (i.e., the peering, transit, or interconnect points where today’s traditional CDN Edge interfaces to an ISP network). Beyond that point, delivery relies on the internal capacity of the ISP’s core network.

3. ISP Revenue Generation From OTT.  Some ISPs offer pay-TV services, aggregating many content providers into a set of media packages for consumers. But many ISPs do not offer pay-TV services. Existing pay-TV services are under pressure from D2C streaming services that encourage “pay-TV cord-cutting”. At the same time, the level of video traffic delivered over ISP networks is increasing because of D2C video streaming, but the ISPs are not participating in the revenue generated from those services. In fact, in some countries, regulators are currently focused on creating a new industry balance by taxing the largest bandwidth users, like Netflix, to return some revenues from their above-average usage (i.e., the “fair share” argument). But simply put, as “closer to the consumer” becomes more important for QoE and network decongestion reasons, demand for and support for CDN capacity inside ISP networks grows.

ISPs are therefore very well-positioned to support the Media industry to provide high-bandwidth, high-quality, low-latency video services to consumers. It’s a win-win-win for the consumer, ISP and D2C Streamer. And as well as enabling this best possible QoE, ISPs can potentially replace their pay-TV profits with D2C streaming profits. For ISPs who do not have pay-TV services today, the D2C streaming trend offers them an opportunity to generate new revenues from their network infrastructure.

The Technical Specification Headlines

Open Caching is a technical specification – not a standard – that enables CDNs to interoperate with other CDNs within an Open Caching network. This provides the necessary building block for Content Providers to have simpler and more efficient standardized interfaces across multiple CDNs, and for ISPs to build out capacity that can be uniformly and easily used by the Content Providers. For Content Providers, Open Caching should be thought of as an overlay across multiple CDN domains (private CDN, public CDN, ISP-CDN). It could provide Content Providers with a way to manage their content delivery and their content in a consistent way, regardless of which CDN infrastructure is physically being used.

While Open Caching is positioned as a performance and operational benefit to Content Providers, the main financial beneficiaries are anticipated to be ISPs and their CDN partners who would carry the bulk of the content delivery responsibility. The commercial aspects of Open Caching are still in the early phases of being market tested, and will form the basis of a future article.

Some major industry players are behind the Open Caching initiative, including ISPs (e.g., BT, Orange, Telefonica, Verizon, Viasat), CDNs (e.g., Edgio, Fastly, Qwilt, Stackpath), and technology vendors (e.g., ATEME, Broadpeak, Vecima, Velocix). It is championed by what is currently a small set of content providers, including Disney, Globo, Hulu, and Paramount. Notably, major industry players not involved in the Open Caching initiative include Akamai and Microsoft.

The Open Caching (OC) technical specification covers three principal components:

1.  Open Caching Node (OCN) – this is a server and its software that are deployed at the CDN Edge, whether that is inside the ISP network or peering with an ISP. OC specs can be used either to build an entire OC system, or to implement OC-compliant software on an existing CDN platform. In the full vision for the OC specification, in which ISP-owned Caches are deployed inside ISP networks (i.e., the ISP’s OC-CDN), the OCN is the final Cache in a multi-layered CDN architecture.

2.  Open Caching Controller (OCC) – this software component allows content providers, CDNs, and ISPs to interact with the OC platform. It is the management component for delegations, configurations, content purging and pre-positioning, logging, and security. It must be deployed on a Cache upstream of the OCN, not at the Origin, as shown in Figure 1 below.

3.  Request Router (RR) – this is the routing logic that directs viewers towards the appropriate OCN to deliver the content. There are three methods available in the OC specification – DNS, HTTP and Manifest. The RR logic exists at multiple layers of the CDN architecture, from the CDN Selector to the ISP’s OC-CDN.

Figure 1 – Open Caching architecture (source, Streaming Video Alliance).

With these three components, an API-based Open Caching solution can be constructed for a Content Provider, to which they can connect their Origin and CDN Selector systems. At that point, an Open Caching CDN will form part of a multi-CDN ecosystem. Unlike a public CDN service, but like a private CDN or a proprietary ISP CDN, the ISP’s OC-CDN introduces “on-net” capacity and its associated QoE benefits.
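To make the Request Router component more concrete, here is a minimal sketch of its HTTP redirect method (one of the three routing methods named in the OC specification: DNS, HTTP and Manifest). The hostnames and the region-based selection rule are illustrative assumptions, not part of the OC specification.

```python
# Hypothetical sketch of HTTP-redirect request routing. The node names
# and the region lookup are assumptions for illustration only.

OCN_NODES = {
    # hypothetical OCNs keyed by the region they serve
    "eu-west": "ocn1.isp-example.net",
    "eu-east": "ocn2.isp-example.net",
}

def route_request(client_region, path):
    """Return an HTTP 302 response steering the client to an OCN.

    Falls back to an upstream cache when no OCN covers the region.
    """
    host = OCN_NODES.get(client_region, "upstream.cdn-example.net")
    return {
        "status": 302,
        "headers": {"Location": "https://" + host + path},
    }

resp = route_request("eu-west", "/live/channel1/manifest.mpd")
print(resp["headers"]["Location"])
# https://ocn1.isp-example.net/live/channel1/manifest.mpd
```

In a real deployment the selection rule would draw on the OC-CDN's published footprint and load data rather than a static table, but the client-facing mechanism is the same: a 302 redirect to the chosen Edge.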

But a point of clarification – being on-net (i.e., closer to the consumer) does not automatically mean perfect performance. There are still technical elements to consider in how a CDN service is architected (e.g., private dedicated capacity or multi-tenanted) and in how a CDN performs based on its software design (e.g., start-up times, session management, origin connectivity methodology). As an initiative, Open Caching is focused on these points, but so are other CDN suppliers that deploy CDN capacity inside ISP networks using their proprietary platforms.

Open Caching History & Vision

Open Caching began in 2014 and was the foundation stone of the Streaming Video Technology Alliance. The initiative has a clear ambition to improve the world of video streaming, and it has come a long way in the last few years. As broadcast-grade and broadcast-scale streaming become hotter topics for D2C services, Open Caching could play a major part in delivering those requirements.

As Jason Thibeault, Executive Director of the SVTA states: “The primary strengths of Open Caching are its openness and the fact it is API-based. This allows network operators, CDNs, and Content Providers to easily implement it through software upgrades to existing CDN infrastructure, rather than hardware changes. In addition, Content Providers can benefit from the single control-plane that allows for management of all caches across different delivery networks, while network operators can benefit from delivering popular content from an Edge deployed in their network that reduces backhaul network utilization.”

In the long run, the vision for Open Caching is that all CDN platforms will be OC-compliant, enabling a Content Provider to connect to all CDNs in a standardized way and to manage their unified CDN and their content through a single control plane.

But almost a decade has passed since Open Caching was launched as a working group of the Streaming Video Technology Alliance, and deployment is still relatively sparse and interest within the industry appears relatively low. So, now that D2C streaming is firmly fixed in the latest strategies, plans and activities of Broadcasters all over the world, are we at the coming-of-age moment for Open Caching?

Open Caching Deployment

Open Caching has three primary components – Open Caching Node, Open Caching Controller, and Request Router. These components have been defined and are partially specified by the SVTA Open Caching Working Group. To deploy an Open Caching (OC) solution in its fully envisaged form requires the deployment of these OC-compliant software components in all layers of the CDN architecture. For many Content Providers that are using public CDNs today, this means that those CDNs must first implement OC-compliant software (or the Content Provider must build/buy a new OC-compliant CDN). Then the ISP CDNs that are OC-compliant (we refer to them as “ISP OC-CDNs” in this article) can be delegated to, enabling the onward delivery – and critically, the management – of the video streams.

This multi-CDN delegation approach is being worked out at a technical level.

How it works commercially is another matter. Some of the major public CDN providers are not part of the SVTA Open Caching Working Group – most notably Akamai and Microsoft. This is understandable given that once a stream is delegated to an OC-CDN, all consumer-level egress from that point onwards is managed by the ISP’s OC-CDN. The business model of “Pay per GB” or even “Pay per Gbps”, combined with a cache hit rate of 95%+, could mean somewhere between a 90-95% revenue reduction for the Public CDN if it hands off to an OC-CDN provided by an ISP and their CDN partner. This is a significant commercial problem to overcome and there is not yet an obvious solution.
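The revenue arithmetic above can be sketched out. Assuming a pure pay-per-GB model in which the upstream Public CDN bills only for the cache-miss fill traffic it still serves after delegation (a simplifying assumption for illustration), the hit rate of the ISP OC-CDN translates directly into the upstream revenue reduction:

```python
def upstream_revenue_share(hit_rate):
    """Fraction of per-GB delivery revenue left to the upstream CDN
    once egress is delegated to an OC-CDN, assuming the upstream bills
    only for cache-miss fill traffic (a simplifying assumption)."""
    return 1.0 - hit_rate

for hit_rate in (0.90, 0.95, 0.99):
    share = upstream_revenue_share(hit_rate)
    # a 90-99% hit rate leaves the upstream CDN with 1-10% of revenue
    print("hit rate {:.0%} -> upstream keeps {:.0%}".format(hit_rate, share))
```

This is why the article's 90-95% revenue-reduction figure follows directly from a 95%+ hit rate: almost all billable bytes move to the delegated Edge.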

Jason Thibeault comments: “Adoption will not be driven by a single event. As network operators continue to implement open caching deployments, more companies will adopt it. But, ultimately, the pressure to adopt will come from the Content Providers or their platform operators that are looking for a better way to holistically manage all their caches across all of their delivery networks. This will probably play a key role in moving CDNs towards being Open Caching compliant.”

Without a solid commercial footing, Open Caching specs, as defined, are not able to be validated in full production mode because there isn’t yet a full multi-layered CDN ecosystem in which to validate them (however, to support development-stage interoperability testing, the SVTA provides an API testbed to its member companies). Instead, pioneering solutions are being built today by leading OC proponents, such as Verizon, Qwilt and Velocix, for specific Content Providers with their own private OC-compliant CDNs, like Disney. Recent headlines from large ISPs including J:COM, British Telecom and Telecom Argentina highlight the deployment of the Qwilt solution that is based on Open Caching specifications. Other sources explain that recently deployed ISP OC-CDNs are now being promoted more widely by those ISPs to the Content Provider market.

But these ISP OC-CDN deployments are just a start. Now the upstream CDNs – either owned by the Content Provider or available on commercial CDN services – need to implement the OC Controller component to complete the Open Caching ecosystem.

Performance Testing

Early field tests for an Open Caching platform, reported by Verizon at NAB 2022, showed that QoE performance KPIs all improved over the base case – including start-up time, sustained bitrate, and peak bitrate.

These results align well with reports from leading D2C Streamers that are using Private CDNs deployed inside ISP networks. The improved results stem mostly from the placement of Edge Caches inside the ISP’s network, which improves performance by avoiding congestion with other internet traffic at peering points and inside ISP core networks.

But other factors can also drive performance improvements, such as the degree to which the Edge platform is managed to assure there is sufficient capacity to support 100% of the demand. For example, if capacity is allocated to a single Content Provider and the total platform is planned to provide sufficient excess capacity, then performance will be higher than on a multi-tenant platform without any excess capacity. Running “hot” against available capacity is always a risky strategy for high-performance, latency sensitive applications like live video streaming. Open Caching specifications provide ways for capacity to be understood before allocating traffic to an Open Caching Node (OCN), but if demand exceeds capacity overall, then QoE problems will persist.
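The capacity check described above can be illustrated with a hypothetical admission rule that only allocates a new stream to an OCN with advertised headroom. The 80% threshold and the field names are assumptions for illustration, not values from the OC specification.

```python
HEADROOM = 0.8  # assumed safety threshold: keep OCNs below 80% of capacity

def pick_ocn(nodes, session_mbps):
    """Pick the least-loaded OCN that can absorb the session while
    staying under the headroom threshold; None means overflow upstream."""
    candidates = [
        n for n in nodes
        if (n["used_gbps"] * 1000 + session_mbps)
        <= n["capacity_gbps"] * 1000 * HEADROOM
    ]
    if not candidates:
        return None  # demand exceeds safe capacity: delegate elsewhere
    best = min(candidates, key=lambda n: n["used_gbps"] / n["capacity_gbps"])
    return best["name"]

nodes = [
    {"name": "ocn-a", "capacity_gbps": 100, "used_gbps": 85},  # already hot
    {"name": "ocn-b", "capacity_gbps": 100, "used_gbps": 40},
]
print(pick_ocn(nodes, session_mbps=8))        # ocn-b
print(pick_ocn([nodes[0]], session_mbps=8))   # None: only a hot node left
```

The key point the sketch makes is the last line: if every node is running hot, no admission rule can save QoE, which is why demand-versus-capacity planning sits above any routing logic.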

Open Capacity Management

Performance is critical for video streaming. And as noted, performance is tied mostly to capacity availability. The Open Caching specification for Open Capacity Management is potentially its killer app.

Open Caching envisages global, democratized, unified CDN capacity. This is a grand vision of easily accessible streaming capacity for the world’s streaming video businesses that are looking towards a future where most content will be consumed via IP over fixed and mobile broadband networks, and at higher resolutions / bandwidths than today. In this vision the capacity usage interfaces and operations would be standardized by OC specifications. Also, capacity availability would be made visible through the same standard interfaces, allowing Content Providers to understand and buy capacity from any OC CDN supplier to meet their needs. In short, a dynamic map of global OC-CDN capacity could become visible by OC-CDNs being widely deployed. This idea would democratize access to CDN capacity and would simplify the technical operation of a multi-CDN platform that spans multiple ISPs to achieve audience reach.
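As a toy illustration of that dynamic capacity map, the sketch below merges capacity advertisements from several OC-CDN providers into a single per-footprint view. The payload shape is an assumption loosely inspired by footprint-and-capabilities style interfaces, not the actual OC schema.

```python
from collections import defaultdict

def build_capacity_map(advertisements):
    """Aggregate per-provider capacity advertisements into total
    available Gbps per footprint (e.g. a country code).
    The advertisement payload shape is assumed for illustration."""
    capacity = defaultdict(float)
    for ad in advertisements:
        for fp in ad["footprints"]:
            capacity[fp["footprint"]] += fp["available_gbps"]
    return dict(capacity)

ads = [
    {"provider": "isp-a",
     "footprints": [{"footprint": "DE", "available_gbps": 400}]},
    {"provider": "isp-b",
     "footprints": [{"footprint": "DE", "available_gbps": 250},
                    {"footprint": "FR", "available_gbps": 300}]},
]
print(build_capacity_map(ads))  # {'DE': 650.0, 'FR': 300.0}
```

A Content Provider could query a view like this to see that, for example, two German ISPs together advertise 650 Gbps of on-net capacity before deciding how to split traffic.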

This model could potentially be the key differentiator for Open Caching as Content Providers consider their CDN strategies. Today, larger Content Providers access CDN capacity through a range of relationships, often with multiple CDN providers, mostly directly with public CDN providers or via a CDN aggregator. Increasingly, private CDNs are being deployed or considered in RFIs and RFPs by larger Content Providers, like national broadcasters. Smaller Content Providers typically rely on an OVP (Online Video Platform) provider, which bundles CDN services into their content management and content origination services. Open Caching would introduce CDN services from the ISPs instead, although they would still need support from a range of established CDN service providers or CDN technology providers. If ISPs in a single country broadly deploy Open Caching platforms, then Content Providers with large national audiences would be able to consider those platforms for large percentages of their audiences.

So, can this work? Technically, yes, the specifications are emerging now. And as Jason Thibeault explains, “Open Caching is an overlay on existing CDN infrastructure, so it can exist as an option alongside other CDN services and technologies. The big difference is that in Open Caching, the Content Provider or their CDN operator can directly manage the Open Caching Nodes, which they are typically unable to do in a proprietary CDN environment.”

So, two large questions remain: 1) will ISPs implement Open Caching platforms and offer CDN services? and 2) does Open Caching result in a better set of commercial relationships for Content Providers with CDN suppliers? These questions will be explored in a separate article about the commercial aspects of Open Caching, and what this means for broad industry adoption.

CDN Selection

Larger Content Providers with regional, national and international audiences need to think about multi-CDN strategies, generally because no single supplier will have the right capacity to support all the demand. Therefore, the CDN selection capability is important to choose the best CDN to utilize based on a mix of price and performance parameters. In a multi-CDN environment that has an OC-CDN within it, the Request Router (RR) component will first select the CDN to deliver the content. Once the stream has moved inside that OC-CDN, the local RR component will select the Edge Cache for stream delivery to the client.

If there are two layers of CDN (e.g., a public CDN interfacing with an ISP OC-CDN), then the RR component of the first CDN layer (i.e., the public CDN) will select the ISP OC-CDN after understanding that it has available capacity and appropriate performance. Once selected, the ISP OC-CDN platform will own the stream and its future session management control. Within the ISP OC-CDN, the RR component will choose which specific OCN to serve the stream from. At that point, all communication between the Client and the CDN is between the Client and the OCN.

CDN selector tools, including those integrated with client-side analytics, would remain the overall controller of which CDNs serve which end customers. The Open Caching Request Router sits underneath a CDN selector in the decision-making hierarchy. It is therefore possible that an OC-CDN could be blacklisted for poor performance, like any other CDN. To be re-activated by the CDN selector, the OC-CDN would republish information about its available capacity and performance levels via the Footprint and Capabilities API, to demonstrate it is ready to receive new traffic.
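A CDN selector's re-activation decision, as described above, might look something like the sketch below. The thresholds and the shape of the republished Footprint and Capabilities data are assumptions for illustration, not fields from the actual API.

```python
MIN_AVAILABLE_GBPS = 50   # assumed threshold to accept new traffic
MAX_ERROR_RATE = 0.01     # assumed acceptable delivery error rate

def is_eligible(published):
    """Decide whether an OC-CDN may receive new traffic, based on the
    capacity and performance data it republishes (payload shape assumed)."""
    return (
        published["available_gbps"] >= MIN_AVAILABLE_GBPS
        and published["error_rate"] <= MAX_ERROR_RATE
    )

blacklisted = {"available_gbps": 10, "error_rate": 0.05}   # stays parked
recovered = {"available_gbps": 120, "error_rate": 0.002}   # re-activated
print(is_eligible(blacklisted), is_eligible(recovered))    # False True
```

In practice the selector would also weigh client-side analytics (rebuffering, start-up failures) alongside whatever the OC-CDN publishes about itself, since self-reported data alone is not a sufficient trust signal.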


Open Caching at its core is about interoperability for metadata, capacity management, location management, and logging, which today are managed by CDNs’ proprietary algorithms. The rest of Open Caching is about HTTP routing, which all CDNs apply using approaches that include eDNS0, DNS resolver, Anycast, and HTTP redirect.

Open Caching adds a fifth interoperability category of transparent stream management between multiple OC-CDNs. But as with any specification or standard, this does not mean a seamless integration between CDNs because specification implementations can vary. Real-world technical testing and ongoing alignment of specification implementation will be necessary across all suppliers of OC-CDNs for full interoperability.

CDNs deployed inside ISPs today that are already capable of operating with multiple types of Origins generally have the preliminary building blocks for implementing Open Caching. There are various ways to upgrade these CDNs to implement the Open Caching compliant software. Jaya Devabhaktuni, Architect at Velocix and a contributor to Open Caching specifications, states, “The approach to an upgrade depends on how each CDN vendor chooses to implement the OCC and OCN functions – they could be standalone software modules, plugins, or enhancements to existing control and data planes. The main points are that the control plane must support the OCC capabilities, and the data plane must implement the OCN capabilities. These OC capabilities could be deployed as an overlay on top of an existing Edge Cache, or they could be deployed on new standalone OC nodes within the network operator.”


Today, there is a range of CDN models that Content Providers can consider, as shown in Figure 2.

The Public CDN connected directly to an ISP without any on-net CDN capacity has been the normal model for over a decade and continues to be the dominant model today for broadcasters’ D2C services. This appears likely to diverge into one or more of the following three models as broadcasters’ streaming volumes scale up.

Figure 2 – Multi-CDN architecture including Public CDN, CP (Content Provider) Private CDN and Open Caching CDN, showing where OC Controllers (OCC) and OC Nodes (OCN) will be deployed.

The leading global streamers have driven the deployment of the CP Private CDN model, deployed at Internet Exchanges and often deep inside ISP networks.

And as Content Provider apps are integrated into existing pay-TV services which already use an on-net CDN, some of the Content Provider traffic traverses the ISPs’ existing proprietary CDNs.

After 9 years, Open Caching is bringing a new model into the mix, which we see today primarily in the form of point-to-point solutions between Content Providers supporting Open Caching and some ISPs that are deploying OC-CDNs. The full vision of all public/private CDNs and all ISPs deploying OC-compliant CDNs is still a long-term vision for now, and indeed the path to achieve the full vision is not yet clear given the various stakeholders with different commercial interests.

So, for now, the future contains a mix of CDN models. But will the future of D2C streaming in a multi-CDN architecture be simplified because of Open Caching? Technically it should be, but it depends most heavily on the commercial and deployment realities. The technical vision is close to reality after 9 years of specification-writing effort. It is not clear how long it will take to achieve the full operational and commercial vision of an open caching environment, but momentum appears to be building.
