In the last article in this series, we looked at why integrated monitoring is a necessity in modern broadcast IP workflows. In this article, we dig deeper to understand what is new in IP monitoring and how this integrates with traditional workflows.
For broadcasters, monitoring is not limited to the transport stream. Due to the high bit rate of video, it is often more convenient to view the decoded pictures on a screen than to watch thousands of numbers representing the video fly by. However, we still need to look at some of the measured values, and a combined IP, video, audio and metadata system can provide this.
Software systems provide the flexibility and scalability needed for a holistic monitoring system, but custom hardware designs still have their place, especially when the two are combined.
Fig 1 – ST2110 specifies tight timing tolerances for packet distribution. This diagram shows one consequence: the instantaneous peak rate is significantly higher than the long-term average. Keeping that peak tightly bounded allows buffers to remain small, so latency stays low.
Measuring IP packet timing is critical for ST2110 systems due to the tight tolerance of the sender timing specification. This ensures that packets are evenly gapped and do not burst, so that buffers can be kept small, leading to low latency. Calculating the time distance between packets for compliance is often a challenge in fully software devices, as the NIC (network interface card) will copy the packets from the Ethernet network directly into a memory buffer and wait for the operating system to transfer the data to the application memory. This destroys any quantitative temporal relationship between the packets.
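To illustrate why burst behaviour matters, the long-term average and instantaneous peak bit rates can both be derived from per-packet arrival timestamps. This is a minimal sketch, assuming a list of nanosecond-resolution capture times and a fixed packet size; the function name and data layout are illustrative, not from any vendor API:

```python
def rate_profile(ts_ns, pkt_bytes):
    """Return (long-term average, instantaneous peak) bit rates.

    ts_ns: monotonically increasing packet arrival times in nanoseconds.
    pkt_bytes: payload size of each packet, assumed constant here.
    """
    gaps = [b - a for a, b in zip(ts_ns, ts_ns[1:])]
    # Average: all bits after the first packet, over the whole capture window.
    total_bits = pkt_bytes * 8 * (len(ts_ns) - 1)
    avg_bps = total_bits / ((ts_ns[-1] - ts_ns[0]) / 1e9)
    # Instantaneous peak: one packet delivered over the smallest gap observed.
    peak_bps = pkt_bytes * 8 / (min(gaps) / 1e9)
    return avg_bps, peak_bps
```

For a bursty stream the peak can be orders of magnitude above the average, which is exactly why receiver buffer sizing depends on the sender's gapping discipline rather than on the average rate alone.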
There are some software acceleration methods available to improve and speed up packet processing, such as kernel bypass; however, this still does not restore the temporal information destroyed by the NIC's receive buffering process. Even writing the data packets one by one directly into the application buffer to bypass the NIC's buffer would not help, as the timing reference in most operating systems uses a software wrapper around the hardware clock, resulting in indeterminate measurement jitter.
One method that does improve temporal packet measurement on the actual Ethernet wire uses the hardware timestamp. Here, the NIC appends a field recording the time the packet entered the NIC (prior to being stored in its buffer), so an absolute timestamp of the packet can be captured and maintained throughout the packet's history while it resides in the server. As the NIC has one source of calibrated time-truth, all other packets entering it carry a timestamp relative to the same absolute time, so meaningful temporal packet measurements can be made.
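Once every packet carries a wire-accurate hardware timestamp, per-packet jitter against the sender's ideal spacing reduces to a subtraction. A hedged sketch, assuming the ideal gap is known from the stream's parameters (the function name and inputs are illustrative):

```python
def gap_jitter_ns(hw_ts_ns, ideal_gap_ns):
    """Deviation of each inter-packet gap from the sender's ideal spacing.

    hw_ts_ns: hardware timestamps in nanoseconds, one per received packet.
    ideal_gap_ns: the nominal spacing the sender should maintain.
    Returns one signed deviation per gap; zero means perfect spacing.
    """
    gaps = (b - a for a, b in zip(hw_ts_ns, hw_ts_ns[1:]))
    return [g - ideal_gap_ns for g in gaps]
```

Because all timestamps come from the same NIC clock, these deviations are meaningful even though the packets are examined long after they were buffered and handed to the application.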
Not only does this method require specialist NICs, but it also requires a great deal of knowledge of the underlying operating system on the part of the programmer, as custom software drivers are needed to take advantage of these features. Furthermore, the NIC can provide hardware PTP synchronization, allowing more accurate PTP measurements.
Combining the high-speed hardware with the flexibility of the software provides a monitoring solution that specifically meets the unique needs of broadcasters.
Sports has also driven the adoption of HDR and WCG. However, the transition to HDR is much more complex than moving from HD to UHD and 4K. Whether down-converting HDR to SDR or up-converting the other way, the wide dynamic range is difficult to convert, and it is harder still to provide dual SDR and HDR services. SDR is still needed for HD, and providing two cameras at each camera position, along with duplicating the entire production workflow, is simply not viable. Converting between SDR and HDR therefore demands accurate and flexible integrated monitoring.
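Conversions in and out of HDR pivot on absolute light levels, which is where monitoring earns its keep. The PQ transfer function standardized in SMPTE ST 2084 maps a normalised code value to display light in cd/m²; below is a minimal sketch of its EOTF. The constants come from the standard, while the function name is ours:

```python
# SMPTE ST 2084 (PQ) constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Map a normalised PQ code value (0..1) to display light in cd/m^2."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)
```

A code value of 1.0 corresponds to the 10,000 cd/m² ceiling, while the bulk of typical picture content sits far down the curve; this steep non-linearity is one reason naive SDR/HDR conversions look wrong and need measured, not guessed, levels.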
Software flexibility further allows overlays to be placed on the image allowing those who are less familiar with the traditional waveform monitoring type products to take advantage of the vast amount of information available. Whether this might include HDR and exposure zones, or colorimetry gamut detection, the availability of flexible monitoring screens can meet the needs of many different production skill sets and engineering disciplines.
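An exposure-zone overlay of the kind described above is essentially a per-pixel classification of light level rendered as false colour. A toy sketch, assuming linear light in cd/m² and purely illustrative zone thresholds (no standard defines these exact bands):

```python
# Illustrative zone boundaries in cd/m^2 (nits); not from any standard.
ZONES = [(1, "shadow"), (100, "midtone"), (1000, "highlight")]

def exposure_zone(nits):
    """Classify one pixel's linear light level into a coarse exposure zone."""
    for upper, name in ZONES:
        if nits < upper:
            return name
    return "specular"

def zone_overlay(frame):
    """Map a 2-D frame of nit values to zone labels for a false-colour overlay."""
    return [[exposure_zone(px) for px in row] for row in frame]
```

A real implementation would run on the GPU and paint each zone a distinct colour, but the principle is the same: the overlay translates raw measurements into something a non-engineer can read at a glance.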
The new breed of network IP broadcast monitoring systems encourages distributed monitoring and viewing. The signal analyzing device can be separate from the screen presenting the data allowing multiple screens to be physically separated from each other and the analyzing device itself. This physical disconnect, made possible through IP networks, helps keep system infrastructures simple and reliable as specialist cabling is not required in each monitoring position.
The data analyzer and capture unit acquire and process the data from anywhere on the network and provide all the real-time analysis near the connection in question. The monitoring information provided to the user is a representation of the data that can be streamed to a whole multitude of devices, from desktop units to hand-held WiFi devices, freeing the user from the confines of working in proximity to the physical monitoring device.
Hardware form factors have had a lasting effect on monitoring due to the ergonomic requirements of operational positions. Racks engineers, production crew and editors all need easy access to the monitoring screens and the separated nature of the data analyzer and capture units from the remote monitoring further enables this, especially when operational positions need to be attended for many hours at a time.
Advanced monitoring solutions are taking the risk out of migrating to IP. Being able to see what is going on at the transport layer within the context of the media is essential for any broadcaster looking to migrate to IP. Software-enabled systems make upgrading to new formats more straightforward, as the key hardware, such as hardware-timestamp-enabled NICs, works across multiple media specifications. Furthermore, future proofing is greatly simplified: moving to 8K, for example, becomes a matter of upgrading the software.