Analysis and Monitoring of IP Video Networks to Ensure High QoS

Migration towards ST 2110 and ST 2022-6 video networks for production and content delivery is picking up pace as the advantages of IP versus traditional SDI over coaxial cable carriage become more evident. The key drivers of IP include the introduction of more flexible and scalable business models based on virtualization and cloud technologies, along with the economies of scale and speed of technology development that stem from the use of commercial off-the-shelf (COTS) IT equipment.

While these benefits are compelling, the migration to IP video networks poses significant technical challenges for broadcast engineers. Whereas SDI over coaxial cable was designed as a dedicated link for the synchronous, point-to-point delivery of constant high-bitrate video, IP infrastructures are typically asynchronous in nature, and this characteristic presents major issues for real-time video delivery due to the potential for network congestion, latency, and jitter.

Sources of video network congestion

To achieve a high Quality of Service (QoS) with IP video, network traffic flow should avoid excessive peaks that can overflow switch buffers. In reality, the inherent burstiness of IP networks plus bandwidth constraints can result in unmanaged traffic levels, creating packet congestion and latency as router ports become blocked due to buffer exhaustion. This type of packet congestion can be exacerbated in multi-hop infrastructures, where the different paths taken by signals potentially cause further variations in network latency.
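To make the mechanism concrete, the short Python sketch below simulates a fixed-size egress buffer draining at a constant service rate while bursty traffic arrives; the capacity, service rate and arrival pattern are illustrative assumptions, not figures from any particular switch.

```python
# Minimal sketch (illustrative assumptions only): a fixed-size egress buffer
# drains at a steady service rate while traffic arrives in bursts. When a
# burst exceeds the remaining buffer space, the excess packets are dropped.

BUFFER_CAPACITY = 64      # packets the port can queue (assumed)
SERVICE_RATE = 8          # packets drained per time slot (assumed)

# Assumed arrival pattern: mostly steady traffic with occasional large bursts.
arrivals = [8, 8, 8, 40, 8, 8, 48, 8, 8, 8]

queue_depth = 0
dropped = 0

for slot, arriving in enumerate(arrivals):
    # Drain the buffer first, then accept as much of the burst as fits.
    queue_depth = max(0, queue_depth - SERVICE_RATE)
    free_space = BUFFER_CAPACITY - queue_depth
    accepted = min(arriving, free_space)
    dropped += arriving - accepted
    queue_depth += accepted
    print(f"slot {slot}: queued={queue_depth:3d}  dropped so far={dropped}")
```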

Typical video network with packet congestion.

These sources of network congestion and latency will delay the arrival of video packets and, in turn, potentially lead to significant jitter problems. In general terms, jitter is a deviation in signal periodicity. In the case of an IP video signal, jitter is a deviation from the expected packet arrival periodicity. Excessive deviations in Packet Interval Time (PIT) — also known as Inter Packet Arrival Time (IPAT) — can lead to packets being stalled, and to loss of packets at the receiver.

Ultimately, if it is not addressed, jitter can seriously impact QoS for broadcasters. This is particularly true for a low-latency system that requires a small receiver buffer size. Therefore, in broadcast video networks, it is vital to ensure that excessive deviation past the expected interval is not occurring, as this risks stalling the signal (due to receiver de-jitter buffer underflow). Broadcasters also must prevent too many packets from arriving with smaller-than-expected intervals, as this can overflow the receiver de-jitter buffer and lead to packet loss.
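As an illustration of this check, the following sketch derives PIT values from a list of packet arrival timestamps and flags deviations beyond a tolerance around the expected interval; the timestamps, expected PIT and tolerance are assumed values for demonstration only.

```python
# Minimal sketch (assumed units and thresholds): derive Packet Interval Time
# (PIT) from packet arrival timestamps and flag deviations beyond a tolerance
# around the expected interval, which would stress the receiver de-jitter buffer.

EXPECTED_PIT_US = 10.0        # expected packet interval in microseconds (assumed)
TOLERANCE_US = 3.0            # acceptable deviation before flagging (assumed)

# Hypothetical arrival timestamps in microseconds.
arrival_times_us = [0.0, 10.1, 19.8, 30.2, 45.9, 55.7, 58.0, 68.1]

pits = [b - a for a, b in zip(arrival_times_us, arrival_times_us[1:])]

for i, pit in enumerate(pits, start=1):
    deviation = pit - EXPECTED_PIT_US
    if deviation > TOLERANCE_US:
        status = "late arrival: risks de-jitter buffer underflow / stalled signal"
    elif deviation < -TOLERANCE_US:
        status = "early arrival: risks de-jitter buffer overflow / packet loss"
    else:
        status = "within tolerance"
    print(f"packet {i}: PIT={pit:5.1f} us  {status}")
```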

Both excessive deviation and packet overflow lead to video impairment and, in extreme cases, a loss of the video signal. However, with the ability to monitor and diagnose network congestion, along with associated jitter problems, broadcasters can maintain a healthy video network that supports reliable video delivery.

Network congestion monitoring and diagnosis

Jitter can be measured through observation of variations in the Packet Interval Time (PIT). Analysis of the PIT distribution of a video signal will provide an indication of its health, and warn the engineer of any broadcast-critical network congestion.

By plotting a PIT histogram, the broadcast engineer can gain a real-time view of how network congestion is affecting a video signal. Measurement of the PIT mean, as well as minimum and maximum values, offers instant network analysis at a glance.
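A simple way to picture this analysis is the sketch below, which bins synthetic PIT measurements into a histogram and reports the mean, minimum and maximum; the sample data, bin width and nominal period are assumptions chosen purely for illustration.

```python
# Minimal sketch (synthetic data, assumed bin width): build a PIT histogram and
# report the mean, minimum and maximum values for an at-a-glance health check.

import random
import statistics
from collections import Counter

EXPECTED_PIT_US = 10.0                      # nominal signal period (assumed)
random.seed(1)

# Synthetic PIT measurements: mostly near the nominal period, plus some
# congestion-delayed packets with longer intervals.
pit_samples_us = [random.gauss(EXPECTED_PIT_US, 0.5) for _ in range(1000)]
pit_samples_us += [random.gauss(EXPECTED_PIT_US + 6.0, 1.0) for _ in range(30)]

BIN_WIDTH_US = 1.0
histogram = Counter(round(pit / BIN_WIDTH_US) * BIN_WIDTH_US for pit in pit_samples_us)

print(f"mean={statistics.mean(pit_samples_us):.2f} us  "
      f"min={min(pit_samples_us):.2f} us  max={max(pit_samples_us):.2f} us")
for bin_centre in sorted(histogram):
    print(f"{bin_centre:5.1f} us | {'#' * (histogram[bin_centre] // 10)}")
```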

PIT analysis histogram

In a “perfect” network, a video signal would have constant periodicity, without jitter, and all PIT values would be the same. In a network with very low jitter, the engineer would expect to see a normal distribution, with the vast majority of PIT values in and around the signal period (the expected interval arrival time). However, the reality of congestion in networks typically yields a broader distribution of PIT values around the expected nominal value.

Hence, a healthy video signal will have a distribution peak centred around the expected PIT. Due to the individual characteristics of a network, some significant jitter might be tolerable, but a high occurrence of jitter at the extremes would potentially lead to video signal impairment or loss. An impaired video signal will have a packet distribution characterised by a high occurrence of extremely long or short PIT values and/or by a distribution mean different from the expected signal period.
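Those two symptoms can be expressed as a simple health check, as in the sketch below; the thresholds for "extreme" PIT values and acceptable mean drift are illustrative assumptions that would need tuning to the individual characteristics of a real network.

```python
# Minimal sketch (all thresholds are assumptions): classify a PIT distribution
# as healthy or impaired using the two symptoms described above - a high
# occurrence of extreme PIT values, or a mean that drifts from the signal period.

import statistics

def assess_pit_health(pits_us, expected_us, extreme_dev_us=5.0,
                      max_extreme_fraction=0.01, max_mean_drift_us=0.5):
    """Return a short health verdict for a list of PIT measurements."""
    extremes = [p for p in pits_us if abs(p - expected_us) > extreme_dev_us]
    extreme_fraction = len(extremes) / len(pits_us)
    mean_drift = abs(statistics.mean(pits_us) - expected_us)

    if extreme_fraction > max_extreme_fraction or mean_drift > max_mean_drift_us:
        return (f"impaired: {extreme_fraction:.1%} extreme PITs, "
                f"mean drift {mean_drift:.2f} us")
    return "healthy: distribution peak centred on the expected PIT"

print(assess_pit_health([10.0, 10.2, 9.9, 10.1, 22.0], expected_us=10.0))
```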

In addition to performing real-time jitter measurements, the engineer can track PIT variance over time to gain a longer-term monitoring perspective. Logging this data can provide vital information on the health of a network. For instance, a deterioration could be indicated by increased maximum PIT and a steadily rising mean. A PIT logging tool can also provide historical information on network congestion health at the time of an on-air incident.
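A minimal sketch of such logging is shown below: each measurement window is summarised as a timestamped line of minimum, mean and maximum PIT appended to a history file; the file name, window contents and CSV layout are assumptions for illustration.

```python
# Minimal sketch (hypothetical tooling): log per-window PIT statistics so that
# a rising mean or growing maximum can be spotted over time, and the log can be
# consulted after an on-air incident.

import statistics
import time

def log_pit_window(pits_us, logfile="pit_history.csv"):
    """Append one line of summary statistics for the current measurement window."""
    line = (f"{time.strftime('%Y-%m-%dT%H:%M:%S')},"
            f"{min(pits_us):.2f},{statistics.mean(pits_us):.2f},{max(pits_us):.2f}\n")
    with open(logfile, "a") as fh:
        fh.write(line)         # columns: timestamp, min PIT, mean PIT, max PIT (us)

# Example: one window of measurements (values assumed for illustration).
log_pit_window([9.8, 10.1, 10.0, 14.7, 10.2])
```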

IP analyser/generator stream selection.

However, it’s not enough to analyse a video network when there’s a problem. Broadcast engineers need to stress test their facility as their IP network evolves, and as new devices are added. A packet profile generator tool allows an engineer to analyse the video network for vulnerability to congestion and jitter by stress-testing the response of the facility to IP video signals transmitted under a variety of network conditions. The packet profile generator can flag network congestion issues before they become a real problem.

Packet profile generator.

A packet profile generator displays a histogram showing the generated signal’s PIT. With this information, it is possible to adjust the timing to simulate network-introduced packet interval timing jitter. The engineer can use this capability to create custom profiles for testing, and to save network distribution profiles for rapid re-use later. In conjunction with IP video packet analysis tools, the packet profile generator provides a powerful capability for network stress testing and fault diagnosis.
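The sketch below illustrates the underlying idea in simplified form: a transmit schedule is built from a saved profile consisting of a nominal PIT plus a jitter distribution; the profile format, file name and Gaussian jitter model are assumptions for illustration and do not describe any particular product.

```python
# Minimal sketch (illustrative only, not a real generator product): build a
# transmit schedule whose packet intervals follow a saved "profile" - a nominal
# period plus a chosen jitter distribution - to stress-test a receiver.

import json
import random

def build_schedule(profile, num_packets):
    """Return cumulative send times (us) with jitter applied to each interval."""
    send_time, schedule = 0.0, []
    for _ in range(num_packets):
        jitter = random.gauss(0.0, profile["jitter_stddev_us"])
        send_time += profile["nominal_pit_us"] + jitter
        schedule.append(send_time)
    return schedule

# A custom profile the engineer might save for later re-use (values assumed).
profile = {"name": "congested-core", "nominal_pit_us": 10.0, "jitter_stddev_us": 2.5}
with open("congested-core.json", "w") as fh:
    json.dump(profile, fh)

print(build_schedule(profile, 5))
```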

Conclusion

IP video networks have created a new set of test and measurement challenges for broadcast engineers, especially with respect to avoiding network congestion. However, new IP signal generation, analysis and monitoring tools simplify traffic analysis and network testing, thereby empowering broadcasters to avoid serious jitter issues that can jeopardise broadcast Quality of Service.

Neil Sharpe is Head of Marketing for PHABRIX.
