The Streaming Tsunami: Part 2 - Preventing The Flood

The Streaming Tsunami is emerging as broadcasters tune in to their own streaming video strategies. Today, most broadcasters deliver less than 10% of their total viewing hours via OTT streaming services. But things are changing due to consumer preferences for streaming, broadcaster strategies to focus on streaming, and industry-level choices about spectrum allocation. As described in Part 1, the Streaming Tsunami is big. So, what will we do about it?

There are various technical methods available to prevent the Streaming Tsunami from causing continual flooding in our networks. Scaling out CDN Edges deeper in ISP networks is one way. Using multicast for streaming linear channels is another. Using peer-to-peer networks for live events is yet another. Or we could expand all ISP core networks to handle tens of Tbps and leave the CDN Edge where it is.

Considering each option in reverse order: enormous expansion of ISP core networks is both expensive and unnecessary. It contradicts the basic principle that a pull system (i.e., streaming video) should place its delivery capacity (i.e., the point of video egress from the Edge) as near to the point of consumption (i.e., the viewer) as is economically viable. For these reasons, expanding core networks is not the right solution.

Second, multicast is clearly a very useful method for linear and live streaming, which is why it is used extensively in closed-network Pay-TV services like cable TV and IPTV. But for OTT video services, which are network-agnostic and ride the rivers of the open internet, managing the individual network integrations that multicast requires is complex. And when we consider our media consumption behaviors, including VOD, Live, and FAST, how much traditional linear TV viewing will we even see in the future, and how much of the total streaming demand can be managed with multicast? Multicast also falls back to unicast for time-shifted functionality like pause and rewind, and for any level of service personalization. A further consideration is whether a D2C Streamer will have the desired level of service control and visibility from inside a channel aggregator’s delivery platform. In short, there are various barriers and complexities to implementing multicast for streaming at scale.
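
To make the unicast-fallback point concrete, here is a minimal back-of-the-envelope model in Python, using purely illustrative assumptions for audience size, bitrate, and the share of viewers who are time-shifting or receiving personalized streams at any moment:

```python
# Illustrative model: multicast vs unicast bandwidth for one linear channel.
# Every figure below is an assumption chosen for the example, not a measurement.

LINEAR_VIEWERS = 1_000_000   # concurrent viewers of the channel (assumed)
BITRATE_MBPS = 5             # average stream bitrate in Mbps (assumed)
TIMESHIFT_SHARE = 0.30       # share of viewers paused, rewinding, or personalized (assumed)

def unicast_gbps(viewers: int, bitrate_mbps: float) -> float:
    """Every viewer receives their own copy of the stream."""
    return viewers * bitrate_mbps / 1000

def multicast_gbps(viewers: int, bitrate_mbps: float, timeshift_share: float) -> float:
    """One shared copy serves all synchronized viewers; each time-shifted or
    personalized viewer falls back to an individual unicast session."""
    shared_copy = bitrate_mbps / 1000
    fallback = unicast_gbps(int(viewers * timeshift_share), bitrate_mbps)
    return shared_copy + fallback

print(f"Pure unicast:         {unicast_gbps(LINEAR_VIEWERS, BITRATE_MBPS):,.0f} Gbps")
print(f"Multicast + fallback: {multicast_gbps(LINEAR_VIEWERS, BITRATE_MBPS, TIMESHIFT_SHARE):,.0f} Gbps")
```

Even in this simple sketch, a modest level of time-shifting or personalization leaves a large unicast load running alongside the multicast tree, which weakens the return on the integration effort that multicast demands.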

Third, peer-to-peer (P2P) networking is theoretically beneficial for large live events, when devices are close to each other and one device can act as a delivery server for another. But this approach doesn’t address most viewing use cases, it brings privacy and piracy concerns to manage, and the live video services it should benefit are precisely the ones that need the most robust delivery method of all. So far, P2P networking does not appear to be a strong candidate for the future of streaming at scale.

The evidently preferred method for handling streaming growth (as already implemented by leading OTT Streamers) is the expansion and wider distribution of the CDN Edge. In some cases, CDNs are deployed inside ISP networks. This is, of course, what the biggest streamers, like Netflix, YouTube, and DAZN, already do. It is often called the Private CDN model because the platform capacity is dedicated to the Streamer’s own content delivery, which removes the risk of congestion caused by other platform tenants. Netflix have spent over $1 billion on their Open Connect platform, which is deployed in 6,000 locations across 170 countries. DAZN have DAZN Edge, reported to be deployed at multi-Tbps scale in Italy, primarily to support the large audiences for Italian Serie A football, and then available to deliver all other content to smaller audiences that don’t reach the capacity limits.

The Private CDN model is used by these streaming pioneers to beat the internet congestion caused by prime-time usage for video, gaming, e-commerce, and more. Through the Private CDN they assure quality of experience (QoE) for their viewers, and because of the better QoE, viewers watch for longer. A perfect combination.

Key benefits of CDN technology include the fact that a single CDN Edge server can ingest a single stream (e.g., a live event) and propagate thousands of copies of that stream, which works very well for live and linear events. A single Edge can also handle many streams in and many streams out, which works very well for VOD and multi-channel content platforms. In addition, the storage capability of a CDN Edge means it can support time-shifted TV functionality like rewind, pause, and fast-forward, because it can cache the video. It can also store large quantities of the most popular and in-demand VOD assets for catch-up TV. It can even support proactive storage of new VOD content that the content provider knows will be popular and therefore wants to pre-cache, ready to deliver a great at-scale VOD viewer experience.
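
As a rough sketch of that “one stream in, thousands of streams out” behaviour (not any particular vendor’s implementation), the following Python snippet shows the core idea of an Edge cache: fetch a video segment once, keep it in local storage, and serve every subsequent request from the cache. The segment name and size are hypothetical, and the origin fetch is simulated so the example runs stand-alone:

```python
# Minimal sketch of Edge-cache fan-out: one origin fetch, many cached responses.
# The segment path and size are hypothetical placeholders; the origin fetch is
# simulated so the sketch runs without a real origin server.

def fetch_from_origin(path: str) -> bytes:
    """Stand-in for an HTTP request to the origin across the core network."""
    return b"\x00" * 2_000_000   # pretend this is a ~2 MB video segment

class EdgeCache:
    def __init__(self):
        self.store: dict[str, bytes] = {}   # segment path -> cached bytes
        self.origin_fetches = 0

    def get_segment(self, path: str) -> bytes:
        """Serve a segment from local cache, touching the origin only on a miss."""
        if path not in self.store:
            self.store[path] = fetch_from_origin(path)
            self.origin_fetches += 1
        return self.store[path]

# 10,000 viewers request the same live segment; the origin is touched only once.
edge = EdgeCache()
for _ in range(10_000):
    edge.get_segment("/live/channel1/segment_1042.ts")
print(f"viewer requests: 10,000 | origin fetches: {edge.origin_fetches}")
```

The same cache structure is what enables pause, rewind, and pre-positioned VOD: as long as the segments are held at the Edge, repeat and time-shifted requests never travel further than the Edge itself.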

In the end, CDN technology is preferred because it is simpler to implement at a network level than the alternatives, and it supports the widest range of video use cases in a complex video distribution environment.

The net effect of this deeper distribution of CDN Edge technology is shown in Figure 1 below. While the dimensions of each section need to be defined on a country-by-country basis according to the level of viewer demand, the principles are consistent: move the Edge closer to the consumer and avoid the bottleneck in the ISP Core Network. Instead of expanding the ISP Core Network, it is better to manage one stream in and multiple streams out from every Edge, reducing the pressure on existing ISP Core Network capacity and leaving it available for the other demands that ISPs supply.

The Edge – now renamed the Video Edge in this transformational stage because it specializes in handling video, the biggest consumer of bandwidth – is moved to the right of the ISP Core Network. It could be in 10 locations or 1,000 locations; it depends on the optimal network design, but the rule is that it needs to be closer to the consumer. The specialism in video is a key point, but it doesn’t mean the compute capacity cannot be used for other purposes. It simply means the capacity needs close management, because the primary requirement of delivering good quality video to viewers cannot be interrupted. This Video Edge architectural design represents the win-win-win for Media businesses, ISPs, and Consumers: better video delivery, less ISP network utilization, and better QoE. And potentially at a lower cost and lower energy consumption than many people think. The next article in the series starts to look at this subject in more detail.

Figure 1: Effective management of the Streaming Tsunami? Distribute the Video Edge.
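
A rough calculation, using illustrative assumptions for audience size, bitrate, and the number of Edge locations, makes the core-network saving concrete: without deep Edges, every viewer’s stream crosses the core; with them, the core only needs to carry one ingest copy per Edge location.

```python
# Illustrative core-network arithmetic for the "one stream in, many streams out" model.
# All input figures are assumptions chosen for the example, not measured values.

VIEWERS = 5_000_000        # peak concurrent viewers of a big live event (assumed)
BITRATE_MBPS = 5           # average delivered bitrate per viewer (assumed)
EDGE_LOCATIONS = 500       # deep Video Edge sites inside ISP networks (assumed)
STREAMS_PER_EDGE_IN = 1    # each Edge pulls one copy of the live stream

core_without_edges_tbps = VIEWERS * BITRATE_MBPS / 1_000_000
core_with_edges_tbps = EDGE_LOCATIONS * STREAMS_PER_EDGE_IN * BITRATE_MBPS / 1_000_000

print(f"Core traffic, no deep Edges:   {core_without_edges_tbps:.1f} Tbps")
print(f"Core traffic, with deep Edges: {core_with_edges_tbps:.4f} Tbps")
```

In this sketch the total egress to viewers is unchanged (around 25 Tbps), but it is generated at the Edges, downstream of the core, while the core itself carries only the ingest copies.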

Environmental Concerns

As we consider the scale that streaming video could reach and the demand it will place on our broadband networks, there are also very important concerns about the energy consumption of streaming video, and question marks over the OTT distribution method’s sustainability credentials.

Recent articles have highlighted that the end-to-end technology value chain for streaming is less energy efficient than over-the-air (OTA) broadcasting. The biggest drivers of energy consumption are the broadband routers, set-top-boxes, and TVs. This situation will need to improve – and it will – as the technology used for internet access and video streaming is simplified and made more efficient.

From a video distribution perspective, the at-scale CDN Edge scenario needs to be inspected closely. Edge servers are based on COTS hardware, and there is a lot of work underway at many CDN companies to deliver video in ever more efficient ways. Some companies, like MainStreaming, describe a different approach to managing video in an intra-Edge manner that removes hardware layers from the CDN architecture, while focusing only on video delivery (vs. all forms of data delivery) to optimize hardware efficiency. Other companies, like Broadpeak and Varnish, have made recent announcements about delivering hundreds of Gbps of egress from their Edge servers while using significantly less power than previous hardware generations. Chip manufacturers like Intel and AMD are heavily involved in these efficiency initiatives. Several leading CDN vendors and service providers have joined the recently founded Greening of Streaming organization to participate in the industry-wide effort to minimize the energy consumption of streaming delivery.
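
To put “hundreds of Gbps of egress” into context, a simple hedged calculation, using assumed bitrate and power figures rather than vendor measurements, shows how egress capacity translates into concurrent viewers and delivery energy per stream:

```python
# Illustrative Edge-server efficiency arithmetic; all inputs are assumptions.

EGRESS_GBPS = 400          # egress per Edge server, in line with "hundreds of Gbps" (assumed)
BITRATE_MBPS = 5           # average delivered bitrate per viewer (assumed)
SERVER_POWER_W = 1_000     # server power draw under load (assumed)

concurrent_streams = EGRESS_GBPS * 1000 / BITRATE_MBPS
watts_per_stream = SERVER_POWER_W / concurrent_streams

print(f"Concurrent streams per server: {concurrent_streams:,.0f}")
print(f"Delivery power per stream:     {watts_per_stream:.3f} W")
```

Even on these rough assumptions, the Edge server itself is a small part of the end-to-end energy picture compared with the in-home devices mentioned above, which is why the whole-chain view matters.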

Yet while we consider the energy efficiency credentials of streaming, we should also remind ourselves that OTT streaming reuses telco networks for the delivery of video. To deliver all of our video consumption today, we also use satellites, terrestrial broadcast towers, Cable TV networks, and IPTV networks. So perhaps a first and bigger question is: do we need all of them? Does real efficiency come from first streamlining the number of networks down to one primary network, and then optimizing the efficiency of that network? Arguably, the internet, based on multi-purpose fixed and mobile telecoms networks, should be the main delivery platform, because it can probably reach the scale and efficiency we need for all forms of communication.

The discussion will certainly continue for years to come as different requirements and positions play out. The World Radiocommunication Conference (WRC-23) in Dubai at the end of 2023 is a key decision-making moment, as the world’s telecommunications industry and stakeholders meet to debate the allocation of radio frequency spectrum.

Flood Prevention Solutions

Very good video quality is a hygiene factor for streaming services, and achieving high levels of customer satisfaction by delivering it consistently is critical to their success. The leading OTT Streamers’ decisions to invest in Private CDNs, and the very close relationships they have built with ISPs, point the way for the rest of the industry’s Streamers who are now building the wave of streaming to Tsunami proportions.

The leading streamers have very clear strategies focused on removing the stream delivery bottlenecks that cause poor quality viewing experiences. The next part of this article will take their models further and consider in detail a future CDN architecture that could be applied by all broadcasters as they serve the largest prime-time audiences each day. It will describe how the Tsunami can be turned into a well-managed flow, rather than risking the network floods that cause so much damage.
