The Streaming Tsunami: Part 1 - Seeing The Tsunami Coming

Streaming video is on the cusp of becoming a major problem for broadband networks. Up to now we have been dealing with a swell in the streaming sea that has caused a few large waves to crash onto the shores of broadband networks. These waves have made headlines in the press because they caused high-profile failures in the delivery of high-quality video experiences. But we are about to see a huge Tsunami emerge as broadcasters finally make a big shift towards streaming-first.

Consider a few key points from industry headlines over the last few years. Broadband speeds to the home are above 100 Mbps in many countries and fiber-to-the-premises (FTTP) roll-out programmes are underway. COVID-19 caused an acceleration in the consumer adoption of digital services. Streamers like Netflix, Amazon Prime, and Disney+ have proven their popularity and placed SVOD firmly on our entertainment map. Ad-supported streaming is growing quickly as people look to free content services to help balance household budgets. Traditional pay-TV subscription numbers are declining. And broadcasters, as champions of local content and with generally the biggest TV audiences and longest hours of consumption on a daily basis, are focusing more and more on streaming-first business strategies.

Broadcasters will appear on the horizon as a Tsunami racing towards broadband networks. I personally remember standing in a Network Operations Center at British Telecom the day that BBC iPlayer was launched in 2008, watching network utilization hotspots show up all across the UK map as people “pressed play”. I remember being impressed at how video supply and consumer demand could combine to impact the delivery networks so heavily. It’s not an unusual phenomenon – any supply-side environment must deal with this problem, from electricity supply to semiconductor manufacturing to call center services. But it has not been a problem for broadcasting, except during the shifts from SD to HD, or HD to UHD. With streaming, however, the network type changes the dynamics.

Fast forward 15 years from the iPlayer launch, and we could soon be hearing about more and more network hotspots causing viewing problems for consumers. Broadcasters’ streaming volumes are growing at about 18-20% year on year according to most industry tender documents, while peaks often reach hundreds of Gbps or several Tbps, depending on the size of the country. Special event peaks, like the FIFA World Cup or the Super Bowl, push past these normal streaming peaks. Where we see live sports in streaming-only situations, like Serie A football in Italy with DAZN, we have 3-4 million people watching simultaneously, taking us into the tens of Tbps of streaming capacity required. At the same time, only about 10% of total broadcaster viewing consumption is through their OTT apps.

If we look forward just 5-10 years, we could see a world where 20 million devices in a country of 60 million people (like the UK, for example) are streaming video during the evening every single day – these are our typical prime-time viewing numbers, although not accounting for population growth. And if we speculate that average video quality will probably be higher than today, and that most people will be watching video on their big-screen TVs in their living rooms, then we could assume a doubling of the average bitrate requested. And given the broadband bandwidth now arriving in our homes, why shouldn’t we expect to receive very reliable 10 Mbps video, even across 3-4 devices simultaneously if we want to?

It's a very fair question from a consumer perspective. Access networks are growing towards Gigabit speeds for each home. Mobile networks are moving to 5G and already planning for 6G. TV manufacturers enable us to have incredible viewing experiences on impressive screens. Consumers are watching more and more content on internet-connected devices that use the broadband bandwidth available to them. Of course this all increases streaming demand, but why is there a destructive Tsunami on the way?

The problem is in the network between the Content Provider and the Consumer.

Where the Tsunami will strike

The Content Provider originates content in the form of live streams, linear streams or VOD files. When a consumer presses Play, that content travels from the point of origination to the broadband network, often with the help of cloud platforms and Content Delivery Networks (CDNs). These cloud platforms and CDNs interface with internet service providers (ISPs) to hand over the content as it passes at light speed through fiber-optic networks towards the consumer.

The interface point is the edge of the CDN – an Edge Cache or Edge Server – which connects to the ISP’s network through shared transit connections provided by B2B ISPs and Internet Exchange Providers, or via direct peering between the CDN and the ISP, or via direct on-net connection points provided inside the ISP’s own network domain.

ISPs serve consumers with broadband services, but often there are independent Access Network Operators that are responsible for the connection from the ISP’s network (the “Core Network”) to the consumer (what is often called the “last mile network”). When we talk about average broadband speeds to the home of over 100 Mbps, and when we talk about FTTP roll-out to move us to Gigabit speeds to the home, we are generally talking about what the Access Network is capable of. This is separate from the ISP Core Network that transports content from the CDN Edge to the Access Network.

If we just let “nature take its course” the Streaming Tsunami will flood the Core Network. Consumers will be frustrated by poor viewing experiences. Media companies will lose subscribers and revenues. ISPs will have network congestion issues, affecting other users of their network. Front-page news headlines will persist.

Figure 1 below explains the demand and supply situation we normally have today, resulting in successful streaming delivery to most viewers. It is not a perfect experience for everyone as video traverses the multi-purpose ISP Core Networks and Access Networks, but it generally works (even if “works” includes live events with 30-60 second delays built into the buffer that is designed to protect viewers from inconsistent network performance). Figure 2 shows what the biggest streaming events cause when capacity is not available. It happens today in various parts of the world as streaming audiences keep breaking records. This is the Streaming Tsunami that broadcasters’ audiences will cause as over-the-air viewing shifts to over-the-top viewing.

Figure 1: What happens when streaming demand is manageable within available capacity.

Figure 2: What happens when streaming demand is not manageable within available capacity.

How big will the Tsunami be?

The Origin platforms (see article here) that initiate the delivery of these streams are a mixture of on-premises and cloud-hosted platforms, with multi-Terabyte and sometimes multi-Petabyte VOD libraries in storage. The stream size at this point (imagine single streams starting at the top of a mountain) is small (see Figure 1 above). For a single linear channel or FAST channel or VOD asset, the stream will be based on the size of the multi-bitrate ABR ladder. So, if there is a top bitrate of 15 Mbps and a low bitrate of 1.5 Mbps, with a 5-step bitrate ladder, the total aggregate output for the stream could be about 33-34 Mbps. Even with 100 channels being originated simultaneously, Origin egress would only reach about 3.4 Gbps. Large VOD libraries can create significant Origin egress overall, but VOD files can be cached, even proactively cached, and the Origin egress can be managed to keep it under control.
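As a rough sketch, the aggregate origin output above can be reproduced with a simple calculation. The geometric spacing between rungs used here is an illustrative assumption (real ladders are tuned per service), which is why it lands slightly under the article's 33-34 Mbps figure:

```python
# Back-of-envelope ABR ladder aggregate, assuming a hypothetical
# geometric spacing between the 15 Mbps top rung and 1.5 Mbps bottom
# rung. Real ladders vary, so the exact aggregate depends on how the
# intermediate rungs are chosen.

def geometric_ladder(top_mbps: float, bottom_mbps: float, steps: int) -> list[float]:
    """Build an ABR ladder with an equal ratio between adjacent rungs."""
    ratio = (bottom_mbps / top_mbps) ** (1 / (steps - 1))
    return [top_mbps * ratio ** i for i in range(steps)]

ladder = geometric_ladder(15.0, 1.5, 5)
aggregate_mbps = sum(ladder)          # all rungs originated simultaneously
channels = 100
origin_egress_gbps = aggregate_mbps * channels / 1000

print([round(r, 2) for r in ladder])  # [15.0, 8.44, 4.74, 2.67, 1.5]
print(round(aggregate_mbps, 1))       # ~32 Mbps for this spacing
print(round(origin_egress_gbps, 2))   # ~3.2 Gbps for 100 channels
```

A ladder with more rungs clustered near the top (common for premium sport) would push the aggregate up towards the 33-34 Mbps quoted in the text.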

Once these streams are flowing towards the consumers, the CDN picks up the load. Continuing the water analogy, we can think of the CDN as the fast-flowing river that collects many streams and delivers them into the ISP networks for onward distribution to the waiting consumers. The size of this river is determined by the number of viewers requesting video content, and the bitrate of each video file. As streaming viewership increases from hundreds of thousands to tens of millions, the size of the river could increase as shown in Figure 3 and Figure 4.

Figure 3: Streaming Growth Scenarios.

Figure 4: Streaming Capacity Required in the 4 Growth Scenarios.

The final bar of 200 Tbps is the huge Tsunami. Even 50 Tbps is a very large wave, capable of causing major floods in countries that typically deliver to peak streaming audiences of “only” 1 million.

Leading streamers in large European countries already delivering premier sports content to streaming-only audiences of 3-4 million people are reaching peaks of about 20 Tbps. In the USA, India and China, sports events on streaming-only platforms are emerging and are expected to drive much bigger audiences. In the large European countries, this super-large streaming audience of 3-4 million people compares to the daily prime-time TV viewing audience of about 20-25 million people – only 12-20% of prime-time. When streaming audiences reach full prime-time viewership, and average bitrates grow to deliver better quality viewing experiences, we can expect 20 Tbps to become 200 Tbps.
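The arithmetic behind these peaks is simply concurrent viewers multiplied by average bitrate. The audience sizes are the article's figures; the ~6 Mbps and 10 Mbps average bitrates are illustrative assumptions chosen to land near the quoted totals:

```python
# Back-of-envelope peak streaming capacity: viewers x average bitrate.
# Audience figures come from the article; the average bitrates are
# illustrative assumptions, not measured values.

def peak_capacity_tbps(viewers: int, avg_mbps: float) -> float:
    """Aggregate delivery capacity in Tbps for simultaneous streams."""
    return viewers * avg_mbps / 1_000_000  # Mbps -> Tbps

# Today: 3-4 million streaming-only sports viewers at roughly 6 Mbps
today = peak_capacity_tbps(3_500_000, 6.0)     # ~21 Tbps

# Full prime-time: ~20 million viewers at a reliable 10 Mbps
future = peak_capacity_tbps(20_000_000, 10.0)  # 200 Tbps

print(round(today), round(future))  # 21 200
```

The same function makes it easy to test other scenarios, such as higher bitrates for UHD or multiple simultaneous devices per household.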

Most countries do not yet have this scale of streaming viewership and large peak audiences on streaming services. Instead, most of the content is delivered via IPTV, CableTV, Satellite or Terrestrial. But the Tsunami is certainly coming.
