All video services begin with some form of content production and acquisition, so we will assume this is constant regardless of the content distribution method.
From that point on, the OTT Ecosystem contains a set of functions that differ significantly from a terrestrial or satellite distribution model, but which are actually very similar to those used in cable and telco “IPTV” distribution models. At a high level, the functions cover content processing, content storage and networking. But in the detail of each function there are significant differences between OTT and the other distribution methods that span everything from video formats to key performance indicators for quality of experience. This article describes each function and some of the key development areas being addressed today as OTT services expand.
Starting from the left, management of the content through intelligent ingest and categorization supports content monetization as well as the user experience of content search and discovery. Metadata management helps create a personalized viewing experience that is enabled by unicast delivery. Customer management refers to authentication, preference-capture and billing processes. Digital Rights Management (DRM) ensures content is only delivered to devices in permitted geographical locations that comply with the content rights. Dynamic Ad Insertion (DAI) can be applied at different points in the ecosystem to insert specific advertising for the individual viewer, which is an example of just-in-time delivery of the personalized experience that OTT enables.
Whether the content is a live stream or a file, it must be prepared for OTT delivery, which centres on producing bit-rates that the delivery network can carry and the requesting device can play. Adaptive bit-rate (ABR) encoding has replaced constant bit-rate as the standard, so that streams can be sustained even over networks with highly variable performance. The goal is to keep the stream playing without interruption, because consumer satisfaction depends on uninterrupted playback.
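The client-side half of this bargain is the player choosing which rung of the encoding ladder to request. A minimal sketch of that logic, assuming an illustrative ladder and a 0.8 safety factor (both hypothetical, not from any standard):

```python
# Sketch of adaptive bit-rate (ABR) rendition selection on the client.
# The ladder values and safety factor are illustrative assumptions.

LADDER_KBPS = [400, 800, 1600, 3200, 6000]  # hypothetical encoding ladder

def select_rendition(measured_throughput_kbps: float, safety: float = 0.8) -> int:
    """Return the highest ladder bit-rate the connection can sustain.

    Falls back to the lowest rendition so playback continues rather than stalls.
    """
    budget = measured_throughput_kbps * safety
    candidates = [r for r in LADDER_KBPS if r <= budget]
    return max(candidates) if candidates else LADDER_KBPS[0]
```

On a connection measured at 5,000 kbps the player would request the 3,200 kbps rendition; on a 300 kbps connection it would drop to the 400 kbps rung and keep playing rather than rebuffer.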
The Packager function wraps the video stream in the correct container for the different end devices, which generally run Apple, Android or Microsoft operating systems. Just-in-time packaging is popular because not every live stream or VOD file needs to be prepared in every package type. The Origin is internet-facing and is normally integrated with DRM systems for just-in-time encryption of the video before it is streamed over the internet. The Packager and Origin functions have developed to integrate closely with Storage to support efficient, large-scale VOD libraries and time-shifted viewing of live and linear streams. Storage can be as simple as direct-attached drives, but for larger content libraries it is often a multi-petabyte system that is routinely ingesting, streaming and recording streams and files. An additional form of storage is the Shield Cache, which protects the Origin by storing the streamed content and managing the interfaces either to the CDN(s) or to the end consumer devices.
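The just-in-time idea can be reduced to a dispatch decision: one mezzanine asset, wrapped on demand into whatever container the requesting device family expects. The mapping and naming below are illustrative assumptions only; real services negotiate formats through manifest requests and apply DRM during the wrap.

```python
# Highly simplified sketch of a just-in-time packaging decision.
# The OS-to-format mapping is an illustrative assumption, not a specification.

PACKAGE_FOR = {
    "apple": "HLS",       # Apple devices expect HTTP Live Streaming
    "android": "DASH",    # Android players commonly use MPEG-DASH
    "microsoft": "DASH",  # modern Windows players also typically use DASH
}

def package(asset_id: str, device_os: str) -> str:
    """Wrap one stored asset into the container the device family expects."""
    fmt = PACKAGE_FOR.get(device_os.lower())
    if fmt is None:
        raise ValueError(f"unsupported device OS: {device_os}")
    # A real packager would segment the media and apply just-in-time encryption here.
    return f"{asset_id}.{fmt.lower()}-manifest"
```

The saving is that only the formats actually requested are ever produced, rather than pre-packaging every asset in every container type.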
The ABR encoding, ABR transcoding, packaging, encryption, origination, storage and shield-cache functions are often co-located, either on-premises next to the playout servers (for live and linear services) or in a cloud service that ingests the live streams and files.
Once a stream leaves the Origin it "crosses" the internet. Miles of intricate transport networks and layers of caching servers (if a CDN supplier is involved in the ecosystem) now deliver the video to the end device. Intermediate Caches are routinely found in large networks that connect to multiple Edge Cache locations, storing popular video as close as possible to the consumer and offloading traffic from the links between the Intermediate Cache and the Shield Cache or Origin. Edge Caches perform the same function: they store content as close as possible to the end consumer and avoid consuming upstream bandwidth towards the Origin, which adds cost and latency.
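The tiers described above form a chain in which each cache answers from local storage when it can, and otherwise pulls from the tier behind it and keeps a copy (a "cache fill"). A toy sketch of that hierarchy, with tier names mirroring the article:

```python
# Toy model of the OTT caching hierarchy: Edge -> Intermediate -> Shield -> Origin.
# In-memory dicts stand in for real cache storage; this is illustrative only.

class CacheTier:
    def __init__(self, name: str, upstream: "CacheTier | None" = None):
        self.name = name
        self.upstream = upstream  # next tier towards the Origin
        self.store: dict = {}     # segment key -> bytes

    def get(self, key):
        """Return (content, name of the tier that actually served it)."""
        if key in self.store:
            return self.store[key], self.name      # hit at this tier
        if self.upstream is None:
            raise KeyError(key)                    # Origin miss: content absent
        value, hit_tier = self.upstream.get(key)   # cache fill from upstream
        self.store[key] = value                    # keep a local copy
        return value, hit_tier

origin = CacheTier("Origin")
origin.store["seg-001"] = b"video-bytes"
edge = CacheTier("Edge", CacheTier("Intermediate", CacheTier("Shield", origin)))
```

The first request for `seg-001` at the Edge travels all the way to the Origin and fills every tier on the way back; the second request is served from the Edge, consuming no upstream bandwidth.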
The concept of edge computing is important to OTT delivery, as networks must handle huge growth in video traffic in general and fulfil low-latency requirements for live video in particular. As in an optimized pull system, inventory (i.e. video) should ideally sit as close as possible to the consumer with the minimum level of "finishing" and the maximum opportunity to personalize. The general trend is towards performing functions at the edge, wherever the edge may be, such as packaging for specific device types and inserting advertisements.
Network ownership and management of the connections to end consumers means that Internet Service Providers (ISPs) carry the content for the "last mile". In practice this covers more than the last mile, of course, but the key point is that the network operators providing internet services are key partners in the OTT ecosystem. Connecting to an ISP network is generally done in a "meet-me room" provided by Internet Exchange Points (IXPs), like the London Internet Exchange (LINX) and the Milan Internet Exchange (MIX), which are hosted by data center operators. Reaching the IXP from the origin side of the ecosystem generally requires some form of leased-line connection, purchased from a telco or CDN provider. At the IXP there are cross-connects between each incoming network and each outgoing network. This is the "peering point", where networks peer with each other. The capacity of the peering point is a key factor in the growth of OTT services and their bandwidth demands, especially as most Edge Caches for OTT services sit on the origin side of the peering point.
Once through the peering point, the content continues through the ISP's core and access networks. OTT is unmanaged data crossing an ISP network, so it is separate from any managed video services the ISP may provide itself. OTT video generally follows the same network path as other unmanaged data such as website traffic, software downloads, online gaming and videoconferencing.
ISPs are generally focused on expanding bandwidth for all forms of voice and data services that consumers and businesses require. Video requires enormous and growing amounts of bandwidth, with ever-increasing format sizes and resolutions placing pressure on the world’s networks. As more people consume more content through OTT platforms, the broadband networks – whether fixed or mobile – must keep up. Multi-billion £/€/$ investments in network capacity are being announced and managed continuously to support this trend.
Finally, but in fact within seconds or even milliseconds, the packaged content arrives in the end consumer domain. On a fixed-line network, the content first traverses any consumer premises equipment, such as a Wi-Fi router or home gateway, before reaching the device. On a cellular mobile network this step is bypassed, but the content may only be available at a lower bit-rate because less bandwidth is available. 5G is planned to be a big step forward for mobile video delivery.
At the consumer device the content is received by a player installed in the client application or browser, which decodes the video and audio packets and plays them on the device. The client application for large-scale OTT services is customized for the OTT Operator to provide the best possible user experience. A range of commercially available clients can be deployed, although many OTT Operators develop their own.
Consumer devices are constantly evolving and continuously drive the whole industry forward. 4K-ready and 3D-ready devices clearly set the expectation that there will be 4K and 3D content to consume, even if very little is yet available. This is one area where OTT offers big benefits over terrestrial and satellite delivery: the general investment in broadband networks is much greater than in other networks, so it aligns well with delivering these higher-resolution formats. In the consumer device arena, one of the main challenges for OTT Operators is working with the client-side ecosystem to ensure compatibility and performance for the viewer.
Quality of Experience (QoE) monitoring and control is generally performed at the client-side for OTT Operators, while Quality of Service (QoS) monitoring and control is managed at the server-side of the ecosystem. Key performance indicators such as average bit-rate and rebuffering ratio can be observed on every individual device, and automated decisions can be taken to improve performance if necessary, such as redirecting the device to a different CDN or even a different Origin. Correlating client-side and server-side data supports proactive decisions for seamless viewing experiences.
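The two KPIs named above are simple arithmetic over player telemetry. A minimal sketch, assuming the player reports stall time, session time and periodic bit-rate samples (the reporting mechanism itself varies by client SDK):

```python
# Illustrative computation of two client-side QoE KPIs: rebuffering ratio
# and average bit-rate. Input values are assumed to come from player telemetry.

def rebuffering_ratio(stall_seconds: float, session_seconds: float) -> float:
    """Fraction of the viewing session spent stalled (0.0 = no rebuffering)."""
    return stall_seconds / session_seconds if session_seconds else 0.0

def average_bitrate_kbps(samples: list) -> float:
    """Mean of the periodic bit-rate samples (kbps) reported by the player."""
    return sum(samples) / len(samples) if samples else 0.0
```

A session with 3 seconds of stalling in a 10-minute stream has a rebuffering ratio of 0.5%; an operator's automated logic might redirect the device to another CDN only when the ratio crosses a chosen threshold.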
A relatively new technology for handling OTT video is peer-to-peer (P2P) networking. This technology offloads traffic from upstream CDNs, Core Networks and Access Networks. Its primary use case is for large audiences for live events when most people are watching the same content at the same time and a consumer device can be treated as a local cache to proliferate a live stream. As OTT services face more pressure to scale efficiently, the management of multiple network types will be an important subject.
While video consumption is why consumers use an OTT service, there are important monetization opportunities for OTT Operators that an internet-based content service enables. A short click to online shopping, betting, and community-based activities can be an integral part of the viewing experience, with particular opportunities for more time-sensitive linear and live content. Clearly, the whole viewer experience is important to manage properly – being inundated with promotional materials in a subscription-based OTT service is not what most people expect – but this is an area that could bring value to both consumers and operators.
The Future Of OTT
The 2020s will be an important decade of growth for OTT services around the world. It is possible, even probable, that by 2030 OTT will be the dominant distribution method in many countries where broadband networks are most advanced.
OTT gives unprecedented opportunity for personalized content delivery, which transforms the relationship between consumers and content providers and enables new methods of monetization. Naturally this will lead to continued technology and business innovation.
The video processing and management technologies described in the OTT Ecosystem will therefore evolve to meet this demand. While the production side of the value chain will continue to innovate to deliver new and exciting consumer experiences, the distribution side will be dealing with the ever-present need to deliver content with efficiency, speed and quality.
It’s going to be an exciting decade in the Media & Entertainment industry!