Monitoring has always been the engineers’ best friend as it turns apparent chaos into order and helps us understand what is going on deep inside a system to deliver high-quality pictures and sound. As OTT continues to play a more prominent role, the need to monitor internet distribution systems is becoming increasingly compelling.
OTT and VOD have a unique property that transcends traditional broadcast systems: the digital connection to the viewer is bidirectional. This gives media content providers and broadcasters unprecedented access to the viewing habits of their audiences. OTT and VOD playback devices, such as mobile phones, notepads and smart TVs, request connections to a program stream. A side effect of this is that we are able to determine whether viewers are watching or not.
Understanding how many viewers are watching a service is also double-edged, because if we know we’re losing viewers then we must do something about it. In the past, we’ve been able to assume all viewers are receiving an RF transmission when a single off-air monitoring point is working, but this assumption is no longer valid. OTT and VOD systems are incredibly complex and present many points of potential failure. This is further exacerbated by the potentially competing service providers outside of the traditional broadcast chain, all processing the program stream.
One of the fundamental challenges facing us is that the internet was never designed to distribute low latency, high bandwidth, and continuous audio and video. Engineers and technologists have had to design and develop new techniques to stream audio and video services over the internet and this evolution has inadvertently led to the deployment of highly complex systems.
Monitoring is key to demystifying complexity, but internet delivery has highlighted the need to monitor many more systems than the traditional baseband audio and video. Adaptive bit rate (ABR) delivery systems send multiple streams of the same program to viewers’ devices. Each stream is encoded at a different bit rate to accommodate the network conditions each viewer experiences. The devices regularly switch between the higher and lower bit rates as the bandwidth available to them changes.
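The switching logic can be sketched in a few lines. This is a minimal illustration only, not any specific player’s algorithm: the bit rate ladder and the throughput-based rule with a headroom factor are assumptions chosen for the example.

```python
# Hypothetical bit rate ladder in kilobits per second, lowest to highest.
BITRATE_LADDER_KBPS = [400, 800, 1500, 3000, 5000, 8000]

def select_bitrate(measured_throughput_kbps, safety_factor=0.8):
    """Pick the highest rung the measured throughput can sustain,
    keeping headroom (safety_factor) for network variation."""
    budget = measured_throughput_kbps * safety_factor
    chosen = BITRATE_LADDER_KBPS[0]  # always fall back to the lowest rung
    for rate in BITRATE_LADDER_KBPS:
        if rate <= budget:
            chosen = rate
    return chosen

# As a viewer's available bandwidth changes, the device switches rung.
for throughput in (6000, 2000, 9500, 700):
    print(throughput, "kbps available ->", select_bitrate(throughput), "kbps stream")
```

Real players add further refinements, such as buffer occupancy and switch damping, but the principle is the same: measure, compare against the ladder, and request the segment variant that fits.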
ABR is ubiquitous for streamed internet delivery, and broadcasters and media content providers can no longer assume the viewer is just a passive consumer of streamed audio and video. The viewing device’s regular switching between bit rates makes life just a little more interesting, as the adaptive nature of the switching algorithms potentially makes the quality of experience slightly different for each viewer. There is an incredible number of combinations of protocols, viewing devices and software versions.
In time the complexity will reduce, and systems will be easier to understand and maintain, but for the time being, we must monitor the whole streaming experience from the broadcaster or media content streamer all the way through to the viewer’s device. Monitoring is more important to us now than ever, but we must analyze more than just the baseband audio and video. An incredible array of systems connect together to make OTT and VOD possible, and all of them must be understood, analyzed and monitored.
Broadcast monitoring is an intrinsic part of the workflow. From the moment the first television signals were broadcast in the 1930s, knowing how the signal was performing was critical to reliably delivering video and audio to the home viewer. But as OTT continues to play an important role in a digital IP world, do we still need monitoring? And if so, why?
A proliferation of live sports channels, film networks and light entertainment delivery services have succeeded in providing home viewers with more choice than ever.
Although major live sports events tend to be licensed to a host broadcaster, a whole plethora of syndicated broadcasters often provide access to the same event.
If a transmission breaks, then the viewer has the option of switching to a different service provider. The irony is that, due to the complexity of modern OTT distribution systems, the break-up of the pictures and sound may not be the fault of the media content provider, but it is still their responsibility. In the old days of terrestrial and satellite only distribution, the broadcaster could be confident that if they could receive a stable and high-quality off-air feed, then their viewers would have the same quality.
Good Leaving Me
The old adage “it’s good leaving me” was generally accepted as transmitter feeds were common to the broadcaster’s monitoring system as well as their viewers. Cable distribution started to complicate this, but embedded monitoring helped highlight potential issues quickly. Cable systems also generally benefitted from being closed private networks, so the Telco operators had full control and visibility of their distribution.
OTT has complicated the seemingly simple terrestrial, satellite and cable distribution systems as it relies heavily on routing signals through many potentially unrelated networks and infrastructures. Furthermore, viewers are no longer localized, that is, they may well be far away from the limits of the transmitter network. This applies not only to towns and cities, but to countries too. It’s not unreasonable for viewers in the USA to want to watch a European soccer match. We can send the video and audio over dedicated SDI networks to the USA, but there may also be viewers in Canada, Australia and Japan. Viewers no longer accept regionalization and localization as limits to their viewing experience and expect to see live sports events from anywhere in the world.
We have three fundamental challenges with distributing media over the internet, especially for live events. First, the internet was never designed to transport real-time video and audio. Second, there are many independent vendors within the internet, all looking to deliver high-quality video and audio to our homes, and they may not be aligned with each other. Third, in the internet business, the owners of the content and the owners of the transmission networks are usually separate entities – often resulting in finger pointing when viewers’ end devices experience a bad service.
To understand the full challenge of OTT broadcasting, it is helpful to start at the viewer and then work back through the various networks to the broadcaster. In itself, OTT is a one-to-many delivery system, similar to terrestrial, satellite and cable broadcasting. However, it fundamentally differs in that the home viewer’s device, whether it’s a mobile phone, notepad or smart TV, pulls the video and audio from the broadcaster.
Although multicasting through the internet is a topic of considerable research, currently, there is no commercially viable method of multicasting from a media content provider directly to a viewer (through the internet). Terrestrial, satellite and cable distributions are a form of multicasting as there is a one-to-many mapping, but without the complex protocols used in IP networks.
Figure 1 – In traditional broadcasting the transmitter continually streams video and audio to the home viewer as the television is a passive receiver. For OTT, devices using web technology, such as mobile phones, notepads and smart TVs, all request segments of data from the origin server (via the edge server), therefore the viewer’s device must initiate the request for video and audio streams.
It’s worth remembering that multicasting is used extensively for video and audio over studio IP networks such as SMPTE’s ST 2110, but it is not available to us across the internet, and this is why we need CDNs for OTT distribution, with the associated monitoring.
CDNs have grown in popularity in recent years as they help solve many of the distribution issues the internet presents, primarily the lack of multicasting and the competition for data bandwidth. Broadcasters are used to distributing their video and audio over uncontested connections, but this all changes with OTT delivery.
As well as providing faster network delivery, CDNs also provide storage, transcoding and data packet processing. Instead of thinking of a CDN as just a network connection providing broadcasters with higher availability to geographic regions, we must also think of it as a complete backbone delivery system.
Any viewing device based on web-browser technology, such as cell phones, notepads, laptops and smart TVs, uses the HTTP method of requesting data from a webserver. If a user wants to look at a webpage, they enter the address into the browser and the device then requests the data from the webserver. For most situations this works relatively well and there is minimal delay before the page is displayed. When watching a program or film, the same method applies, but the requests are sent much faster and with significantly greater regularity.
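The request pattern can be sketched as follows. This is a simplified illustration of the pull model only: the origin URL and segment naming scheme are hypothetical, and a real player would use an HLS or DASH library rather than building URLs by hand.

```python
# Hypothetical origin server address used purely for illustration.
ORIGIN = "https://example-origin.invalid/live"

def request_sequence(segment_count):
    """Return the ordered list of HTTP GETs a simple player would issue."""
    urls = [f"{ORIGIN}/playlist.m3u8"]  # 1. the manifest describes the streams
    for seq in range(segment_count):    # 2. short media segments are then
        urls.append(f"{ORIGIN}/segment_{seq}.ts")  # pulled one after another
    return urls

for url in request_sequence(3):
    print("GET", url)
```

The key point is that every few seconds of playback generates a fresh HTTP request, which is what separates streaming load from ordinary webpage browsing.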
In its simplest form, a webserver is just a computer that serves HTTP requests on the IP interface. As more people request webpages, the load on the server increases. Eventually it may overload, that is, there are too many user requests for it to be able to respond in time. This is usually fixed by web hosts using load-balancers: devices that route the user requests to multiple servers all providing the same webpage data. Load-balancers work well as user requests are generally well distributed in time and the data transferred is relatively low.
Bottlenecks And Congestion
This is not the case with video and audio OTT distribution. Not only is there a bottleneck at the program server, but across the network too. The program server, often referred to as the origin server, resides somewhere close to the broadcaster and is the first point in the OTT distribution network. The program stream from the broadcaster is usually encrypted to reduce the risk of piracy before being sent to the origin servers.
If just a few viewers requested programs from the origin server, the load on it would be relatively light, but the data load across the internet would still be much higher than for simple webpage requests. As more users request programs, the load on the origin servers as well as the internet connections increases.
The challenges are further exacerbated as adaptive bit rate systems such as DASH and HLS require multiple parallel streams of varying bandwidths. Consequently, instead of pulling just one stream over the internet, users require access to approximately six streams of varying bit rate, and that’s before we start considering manifest and other housekeeping files.
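An abridged HLS master playlist shows how these parallel streams are advertised to the player. The `#EXT-X-STREAM-INF` tag is standard HLS, but the bandwidths, resolutions and paths below are illustrative values, not from any real service.

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=960x540
medium/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1280x720
high/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080
full/playlist.m3u8
```

Each variant listed here is itself a complete stream with its own playlist and segments, so the origin must serve and the network must carry every rung of the ladder that any connected viewer might request.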
This system may work for a few dozen viewers but is clearly unsustainable as the load on the origin servers and internet significantly increases as multiple thousands and tens-of-thousands of viewers watch a live event.