Flexible IP Monitoring - Part 1

Video, audio and metadata monitoring in the IP domain requires different parameter checking than is typically available from mainstream IT monitoring tools. The contents of the data payload are less predictable and the packet distribution is more tightly defined, which leads to the need for specialist, media-stream-centric monitoring tools.



This article was first published as part of Essential Guide: Flexible IP Monitoring - download the complete Essential Guide HERE.

ST2110 is the first step into the IP world for many. To keep latency low, its designers restricted ST2110 IP packet distribution to tight tolerances so that smaller receive buffers are required, which in turn leads to lower latency. This is an unusual method of operation for IP networks, as packets in traditional IT workflows tend towards a more flexible distribution: to rebuild the data for the higher-level applications, large buffers are needed at the receiver, which in turn leads to higher latency.
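
As a rough illustration of why the timing is so tight, the sketch below works through the arithmetic for a nominal HD ST2110 video stream. The bit rate, payload size and frame rate are assumptions chosen for the example, not figures mandated by the standard.

```python
# Back-of-envelope sketch: why ST2110 packet pacing is so tight.
# The stream rate, payload size and frame rate are illustrative assumptions.

STREAM_BITRATE_BPS = 1.5e9      # roughly an HD video essence rate (assumed)
PACKET_PAYLOAD_BYTES = 1400     # typical RTP payload size (assumed)
FRAME_RATE_HZ = 50              # assumed frame rate

packets_per_second = STREAM_BITRATE_BPS / (PACKET_PAYLOAD_BYTES * 8)
ideal_gap_us = 1e6 / packets_per_second
packets_per_frame = packets_per_second / FRAME_RATE_HZ

print(f"packets per second: {packets_per_second:,.0f}")
print(f"ideal packet gap  : {ideal_gap_us:.2f} us")
print(f"packets per frame : {packets_per_frame:,.0f}")

# With a new packet due roughly every 7.5 us, a receiver that only has to
# absorb a few packets of timing error can use a very small buffer, keeping
# latency low - but only if the sender really does pace packets this evenly.
```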

Although PTP is used in the wider IT industry, regular NICs (Network Interface Cards) are not able to meet the tight timing constraints ST2110 demands, and specialist NICs with hardware PTP processing are required. This further adds to what broadcasters must demand from their monitoring equipment.

The data domain in a broadcast IP environment represents the video, audio and metadata. Other than when using test signals, it’s difficult to predict data values with any certainty due to the dynamic nature of video and audio. Our human visual and auditory systems are extremely adept at detecting differences and faults, which is why broadcast engineers often opt to display the video data on screens and listen to the audio data on loudspeakers.

Monitoring in the traditional broadcast sense is merely representing the underlying data in a different domain, that is vision and sound. Looking at thousands of data samples flying past our eyes may have its occasional use, but the best method we have of detecting faults and monitoring quality is by displaying on a screen and listening on loudspeakers.

Traditional IT monitoring simply does not provide the level of monitoring we need. It certainly has its uses, and it gives us a great deal of information about the underlying IP distribution and the accuracy of the data, but it does not provide a usable visual and auditory monitoring system.

It is possible to retrofit video and audio monitoring devices to traditional IT monitoring tools, but it is inevitable that code will need to be written to facilitate this, and it is very difficult to achieve a workable link between the two. Often, when looking at a video image, we will want to simultaneously look at the waveform display and vectorscope, as well as the underlying transport stream. This is very difficult to achieve with diverse and disconnected units, especially when working under the pressure of live television, such as high-value sporting events.

Having an integrated monitoring solution that easily and ergonomically connects the monitoring of the video, audio and metadata with the underlying IP transport stream is critical for anybody working in a broadcast facility. Although the data in the IT monitoring equipment may be the same as that in the integrated broadcast solution, the ability to switch between the different monitoring domains is key.

There’s also a new set of metadata emerging from the use of systems such as HDR. This metadata is essential in describing the display and acquisition formats so that the best immersive viewing experience can be maintained. We need not only to monitor this data, but to monitor it in real time in the context of the streaming video and audio for the television production.
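
As an illustration of the kind of check this implies, ST2110 video streams describe their format in an SDP record, and attributes such as colorimetry and TCS (transfer characteristic system) signal whether a stream is, say, HLG BT.2100 rather than SDR BT.709. The sketch below is a hypothetical parser for such a record; the SDP text and the expected values are invented for the example.

```python
# Hypothetical SDP fmtp line for an ST2110-20 video stream (values invented
# for illustration; real streams advertise their own parameters).
sdp_fmtp = (
    "a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; "
    "exactframerate=50; depth=10; colorimetry=BT2100; TCS=HLG; PM=2110GPM"
)

def parse_fmtp(line: str) -> dict:
    """Split an SDP fmtp attribute into a {parameter: value} dictionary."""
    _, params = line.split(" ", 1)
    return {k.strip(): v.strip()
            for k, v in (p.split("=", 1) for p in params.split(";") if "=" in p)}

params = parse_fmtp(sdp_fmtp)

# A media-aware probe can then compare the HDR signalling against what the
# production expects, per stream and in real time.
expected = {"colorimetry": "BT2100", "TCS": "HLG"}
for key, want in expected.items():
    got = params.get(key)
    print(f"{key:12s} expected {want:7s} got {got or 'missing':7s} "
          f"-> {'OK' if got == want else 'MISMATCH'}")
```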

Advanced monitoring is one of the most important tools for any broadcaster either transitioning to IP or already there. Integrated monitoring takes this process one step further and delivers a complete toolset that helps broadcasters maintain their audiences and enhance the immersive experience.



Transitioning to IP delivers incredible opportunity for broadcasters. But the asynchronous nature of packet-switched networks is new to most engineers, and being able to understand what is going on within the network is essential. The best method we have of observing a network’s behavior is monitoring, and even here IP has advantages.

With the benefit of hindsight, it’s now clear that SDI networks are relatively easy to understand. The point-to-point connectivity of synchronous signals increases their predictability, but at the expense of flexibility. With IP and its packet-switched network topology, we increase flexibility at the expense of predictability.

A consecutive stream of IP packets carries no guarantee that each packet will take the same route through a network. Even with the relatively straightforward spine-leaf or monolithic switch topologies broadcasters are opting for, there is scope for packets to vary their route within the network, leading to out-of-sequence packets arriving at the receiver.
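
A small, hedged sketch of the kind of media-aware check this calls for: tracking the 16-bit RTP sequence numbers of a stream and flagging packets that arrive out of order or appear to be missing, allowing for wraparound at 65535. How the packets are captured is out of scope here; only the sequence analysis is shown.

```python
def analyse_rtp_sequence(seq_numbers):
    """Count late (re-ordered) and missing packets from 16-bit RTP sequence numbers.

    seq_numbers: sequence numbers in arrival order, e.g. lifted from a packet
    capture of one stream (obtaining the capture is not shown here).
    """
    out_of_order = 0
    missing = 0
    iterator = iter(seq_numbers)
    try:
        prev = next(iterator)
    except StopIteration:
        return 0, 0
    for seq in iterator:
        delta = (seq - prev) % 65536   # forward distance, allowing for wraparound
        if delta == 0 or delta > 32768:
            out_of_order += 1          # duplicate or late (re-ordered) arrival
        else:
            missing += delta - 1       # apparent gap in the forward direction
            prev = seq
    return out_of_order, missing

# Packet 102 arrives late: one re-ordered packet and one apparent gap.
print(analyse_rtp_sequence([100, 101, 103, 104, 102, 105]))   # -> (1, 1)
```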

Network Practicalities
Packet multiplexing is the fundamental method of operation within an Ethernet switch, and it has the potential to increase packet jitter, which in turn will lead to timing anomalies if the jitter is excessive.
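
For a flavour of how that jitter is typically quantified, the sketch below implements the interarrival jitter estimator from RFC 3550 (the RTP specification), where the running estimate moves by one sixteenth of each new deviation. The timestamps are invented purely for illustration.

```python
def rfc3550_jitter(arrival_times_s, sender_times_s):
    """Interarrival jitter estimate per RFC 3550, section 6.4.1.

    arrival_times_s: packet arrival times in seconds (receiver clock).
    sender_times_s:  packet send times in seconds (RTP timestamps converted
                     to seconds).
    Returns the running jitter estimate J in seconds.
    """
    j = 0.0
    for i in range(1, len(arrival_times_s)):
        # Change in transit time between consecutive packets.
        d = (arrival_times_s[i] - arrival_times_s[i - 1]) - \
            (sender_times_s[i] - sender_times_s[i - 1])
        j += (abs(d) - j) / 16.0
    return j

# Illustrative numbers: packets sent every 1 ms, arrivals wobbling slightly.
sent    = [i * 0.001 for i in range(6)]
arrived = [0.0000, 0.0011, 0.0019, 0.0032, 0.0040, 0.0051]
print(f"jitter estimate: {rfc3550_jitter(arrived, sent) * 1e6:.1f} us")
```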

It’s clear that there is a lot more to monitor within an IP network than was required with SDI. However, the dynamic and scalable infrastructure IP delivers far outweighs both this added complexity and the limitations of static SDI systems. And that’s before we start considering the advances in the viewer’s immersive experience with HDR and Wide Color Gamut (WCG).

Sports has traditionally been at the forefront of exhibiting technological excellence. HD, UHD, WCG and HDR are just a few of the advances that have been demonstrated at major sports events. Each new advance adds further weight to the sports story, allowing production teams to continually build on their coverage to create truly outstanding programs.

OB Advances
IP is adding further to this list of technology accolades, as OB trucks are a natural fit for it. The reduction in equipment space and weight has delivered incredible benefits, and that’s before we even consider the application-agnostic benefits IP brings.

Monitoring is our window on reality. We have no way of understanding what is happening within a network if we cannot monitor it. With SDI networks we could use a variation of an oscilloscope; with IP networks life is much more interesting. Not only do we need to consider the detail of the transport stream, but also how the media-specific data being transported is behaving.

This is one of the areas where the SDI analogy starts to break down when thinking about IP networks. The video and audio were an intrinsic part of the SDI transport stream. The bit rate, frame rate, color subsampling and frame size were all a function of the synchronous SDI system and inherently tied to its fundamental data rate of 270 Mb/s, 1.485 Gb/s or 1.485/1.001 Gb/s, for example. But this is no longer the case with IP networks.
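
The arithmetic below shows how directly those SDI rates fall out of the sampling structure; the figures are the familiar standard ones, and the point is simply that in ST2110 none of them are implied by the transport any more - frame size, rate and sampling have to be declared and monitored in their own right.

```python
# How the classic SDI data rates fall straight out of the sampling structure.

# SD (Rec. 601): 13.5 MHz luma plus two 6.75 MHz chroma channels, 10 bits/word.
sd_words_per_sec = 13.5e6 + 2 * 6.75e6
print(f"SD-SDI: {sd_words_per_sec * 10 / 1e6:.0f} Mb/s")            # 270 Mb/s

# HD: 74.25 MHz luma plus co-timed multiplexed chroma, 10 bits each.
hd_words_per_sec = 74.25e6 * 2
print(f"HD-SDI: {hd_words_per_sec * 10 / 1e9:.3f} Gb/s")            # 1.485 Gb/s
print(f"        {hd_words_per_sec * 10 / 1.001 / 1e9:.6f} Gb/s (1000/1001 frame rates)")

# An ST2110 receiver cannot infer any of this from the transport; the format
# is declared explicitly (for example in the stream's SDP) and must be
# monitored alongside the IP layer itself.
```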

Flexibility And Complexity
By separating the application video and audio from the underlying IP transport stream we’ve massively increased our system flexibility and scalability, but at the cost of complexity.

As each IP packet has its own source and destination address, the network itself can determine the optimal route when transferring packets. This often means that systems outside the direct control of the broadcast infrastructure are determining how packets traverse a network.

Although software-defined networks are growing in popularity with broadcasters, and there is some analogy between them and SDI routers, it’s important to remember that the intelligence and routing decisions exist at the packet level. This routing is fundamental to IP networks and is one of the reasons the internet is so successful, and why IP networks for broadcasters will continue to grow in popularity. However, an unintended consequence is that IP networks are considerably more complex than SDI networks, hence the need for flexible monitoring.

IP Packets Underlie Media Streams
It’s interesting to think of video and audio essence, and even metadata, as just data, and in the IT world that is exactly what they are. But in broadcast infrastructures we cannot completely separate the data from the IP network, as the two are intrinsically linked, especially when we start to think about PTP timing.

SMPTE’s ST2110 can and does work independently of the IP network; this can be seen when media is streamed and stored to a hard disk drive. But unlike more general IT data, ST2110 imposes a rigid timing structure on the streaming and temporal gapping of packets, and this is one of the reasons broadcast engineers need specialist monitoring tools.
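
A minimal sketch of what such a packet-gap check might look like, assuming hardware-timestamped arrival times for a single stream are already available: each inter-packet gap is compared against the expected spacing and anything outside a chosen tolerance is flagged. The tolerance here is an arbitrary illustration, not a limit taken from the ST2110 documents.

```python
def check_packet_gaps(arrival_times_us, expected_gap_us, tolerance=0.5):
    """Flag inter-packet gaps that deviate from the expected even spacing.

    arrival_times_us: hardware-timestamped arrivals, in microseconds.
    expected_gap_us:  ideal spacing for an evenly paced sender.
    tolerance:        allowed deviation as a fraction of the expected gap
                      (an illustrative threshold, not an ST2110 limit).
    """
    violations = []
    for i in range(1, len(arrival_times_us)):
        gap = arrival_times_us[i] - arrival_times_us[i - 1]
        if abs(gap - expected_gap_us) > tolerance * expected_gap_us:
            violations.append((i, gap))
    return violations

# Example: a 7.5 us ideal gap with one packet arriving noticeably late.
arrivals = [0.0, 7.4, 15.1, 22.4, 45.0, 52.6]
for index, gap in check_packet_gaps(arrivals, expected_gap_us=7.5):
    print(f"packet {index}: gap {gap:.1f} us outside tolerance")
```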

It is possible to use IT-centric monitoring and logging tools, and many broadcasters do in some circumstances, but the complex interaction of the media essence and the IP network demands broadcast-specific tools. Furthermore, IT monitoring generally doesn’t measure packet distribution with any great accuracy, and certainly not to the detail needed for ST2110. IT expects IP packet distribution in a network to be bursty; evenly gapped packets are something of an anathema.
