Test, QC & Monitoring Global Viewpoint – September 2017

Where is the IRE in IP?

I was recently at a graphics technology event and, while chatting, mentioned that my niece is a graphic designer with an understanding of our industry. A colleague commented that no one knows what IRE is anymore. I had to agree, thinking she probably didn’t either. The more serious question is: what is the relevance of IRE in the digital world? His response was that it was just good that pixels are finally square and no longer carry an aspect ratio of their own.

What about IRE?

This opened an interesting conversation. First, IRE is a unit of measurement for analog composite video. Its name is derived from the Institute of Radio Engineers, which defined the scale on the 1 volt peak-to-peak composite signal: 100 IRE units span the video range from blanking to peak white. These units enabled operators and engineers to measure a video signal accurately and confirm that it conformed to standards and met quality control.

A value of 0 IRE corresponded to the voltage of the signal during the blanking period. The sync pulse sat 40 IRE below that, so the full signal spanned 140 IRE and equaled the 1 volt peak-to-peak measurement. Dedicated test and measurement tools such as waveform monitors carried these levels on their graticules. Color was based on the chrominance subcarrier, and operators typically used a vectorscope to visualize the X-Y relationship of the two color-difference signals.
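To make those units concrete, here is a minimal sketch (my own arithmetic, not taken from any standard document) that maps IRE values onto the NTSC 1 volt peak-to-peak scale:

```python
# NTSC composite: 140 IRE spans the full 1 V peak-to-peak signal,
# so one IRE unit is roughly 7.14 mV.
MV_PER_IRE = 1000.0 / 140.0

def ire_to_mv(ire: float) -> float:
    """Convert an IRE value to millivolts relative to blanking (0 IRE)."""
    return ire * MV_PER_IRE

for label, ire in [("sync tip", -40), ("blanking", 0),
                   ("NTSC setup (black)", 7.5), ("peak white", 100)]:
    print(f"{label:20s} {ire:6.1f} IRE = {ire_to_mv(ire):7.1f} mV")

# Peak white lands at ~714 mV above blanking and sync tip at ~-286 mV,
# which together make up the familiar 1 V peak-to-peak.
```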

SDI carried the waveform monitor and vectorscope forward, but added new displays including gamut, eye pattern and jitter to expand signal monitoring. These tools continued to display the traditional analog parameters as well.

Live and recorded media were held to the same measurement criteria. If the video looked good on the scopes, it was okay to broadcast or record. If pre-recorded media needed to be played out, it had to meet the same specifications and look good on the scopes.

Where is the IRE in IP?

In the IP and file-based world, we have software that looks at the signal and determines whether it is within certain parameters, corrupt, or out of range. With IP, are there similar signal parameters engineers and operators need to understand? As a bit stream exits a camera, will the video shader use a waveform monitor and vectorscope on the digitally streamed signals? Is the signal from the imaging device being contoured directly, or is there one stream to the shader and a different version at the output? If the output is encoded for display on a traditional video monitor, how does the engineer know whether the encoder is contributing artifacts?
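As one illustration of what such software-based checks might look like, here is a minimal sketch that flags decoded 10-bit Y'CbCr samples falling outside the narrow ("legal") range of 64–940 for luma and 64–960 for chroma. The function name and the random test frame are hypothetical, purely for demonstration.

```python
import numpy as np

# Narrow ("legal") range for 10-bit Y'CbCr video.
Y_MIN, Y_MAX = 64, 940      # luma code values
C_MIN, C_MAX = 64, 960      # chroma code values

def out_of_range_report(y: np.ndarray, cb: np.ndarray, cr: np.ndarray) -> dict:
    """Return the fraction of samples in each plane that fall outside legal range."""
    return {
        "Y":  float(np.mean((y  < Y_MIN) | (y  > Y_MAX))),
        "Cb": float(np.mean((cb < C_MIN) | (cb > C_MAX))),
        "Cr": float(np.mean((cr < C_MIN) | (cr > C_MAX))),
    }

# Hypothetical frame: random 10-bit samples, just to exercise the check.
rng = np.random.default_rng(0)
y  = rng.integers(0, 1024, size=(1080, 1920))
cb = rng.integers(0, 1024, size=(1080, 960))
cr = rng.integers(0, 1024, size=(1080, 960))
print(out_of_range_report(y, cb, cr))
```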

Where are the standards?

I have asked test and measurement manufacturers what they are measuring and where they are getting the specifications of what is acceptable. They all respond with ranges and claim there are currently no published parameters specific to digital and IP signals.

The Phabrix Qx 12G video analyser is designed for next generation, hybrid IP/SDI environments using 4K/UHD (12G/6G/3G-SDI) and HD-SDI plus SMPTE 2110 and 2022-6. One advantage of such tools is that they can perform a wide range of operational tests on both analog and digital signals across new picture formats.

Yet how is color space measured when it is just bits of data? What about luminance? If HDR expands the imager output to its full range, what is that as a measurement? If I care about bit rate, then what are the parameters for bit-rate error?
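To show what a concrete luminance measurement could look like in the HDR world, here is a minimal sketch, assuming Rec. 709 luma weighting and the SMPTE ST 2084 (PQ) transfer function, that converts a normalized code value into an absolute light level in cd/m². It is only an illustration of one possible measurement, not a published monitoring parameter.

```python
# Rec. 709 luma weighting, applied to gamma-encoded R'G'B' components.
def rec709_luma(r: float, g: float, b: float) -> float:
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# SMPTE ST 2084 (PQ) EOTF constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Convert a normalized PQ code value (0.0-1.0) to luminance in cd/m²."""
    p = code ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# A code value of 1.0 corresponds to the 10,000 cd/m² ceiling; mid-scale
# values map to far lower light levels because PQ is highly non-linear.
for code in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ code {code:.2f} -> {pq_to_nits(code):8.1f} cd/m²")
```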

What about the K’s? There are now pixel measurements including pixel depth and resolution. Does a pixel have a well-defined structure for light output and color rendition? Are all pixels equal?

What about processing? If we don’t know what the base is, how do we accurately detect or make changes? There is plenty of discussion about synchronization between PTP and genlock, and which formats will be deemed standard, but we are not hearing much about how to set and maintain levels.

There are several products available for checking files after they are written. What about checking these files as they are written? Is there a way to monitor the RAW image or audio before it is encoded? On the other hand, should we apply the same standards to the internal encoding that we do to production encoding? Analog signals were voltage and frequency, easily measured with specially configured monitors. Once things went digital with SDI (Serial Digital Interface), we needed new tools to work in this digital space.

What constitutes a “good” signal?

There should be clear and understandable video and audio parameters so operators can configure the test, measurement and monitoring tools. While we are concerned about network latency and which essence format should become the standard, we seem to be overlooking the most important part of programming: are the picture and sound “good”?

As engineers and operators struggle to understand the new signal formats, they also need to know what parameters to monitor. Making it look like the old stuff does not work anymore. The younger people who entered broadcast and production media in the digital era have no analog reference. All they know is digital.

They may be more comfortable using software tools than oscilloscopes and spectrum analyzers for test and measurement. That may be fine, but such software-based tools show various parameters and ranges based on default settings.

This Bridge Technologies PocketProbe for iPhone or Android contains the same OTT Engine found in the company's larger VB1, VB2, VB3 and the 40G VB4 series probes. The small size enables engineers to validate and analyze HTTP variable bit-rate streams with only a cell phone.

Operators need a set of published guidelines explaining why the default was chosen. Remember when white could be either 75 or 100 IRE, and how that variance in range affected the signal?
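As a quick reminder of how much that variance amounted to, here is a minimal sketch (my own arithmetic, for illustration only) comparing 75 percent and 100 percent white in millivolts on the analog scale and in the 10-bit narrow-range code values used in SDI:

```python
MV_PER_IRE = 1000.0 / 140.0          # NTSC: 140 IRE spans the 1 V p-p signal
BLACK_10BIT, WHITE_10BIT = 64, 940   # 10-bit narrow-range luma code values

def white_level(fraction: float) -> tuple[float, int]:
    """Millivolts above blanking and 10-bit code value for a white level
    given as a fraction of the nominal 0-100 IRE video range."""
    mv = fraction * 100 * MV_PER_IRE
    code = round(BLACK_10BIT + fraction * (WHITE_10BIT - BLACK_10BIT))
    return mv, code

for pct in (0.75, 1.00):
    mv, code = white_level(pct)
    print(f"{pct:.0%} white: {mv:6.1f} mV  ->  10-bit code {code}")

# 75% white sits near 536 mV (code 721); 100% white at ~714 mV (code 940).
```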

Network management and monitoring have overtaken much of the conversation about digital signals. There is more concern about checking files in real time, or faster than real time, than about the quality of the content.

In the great live IP wars, how do we test, measure and monitor ST 2110, ASPEN and NMI? If ST 2110 separates the essences, where are the IP video and audio tools to measure each essence? If operators will use the same tools as before, shouldn’t the engineer have the ability to check the signal before any encoding takes place?

Color matching and seamless transitions have traditionally relied on special tools to control and contour with accuracy. Are we now relying on arbitrary parameters, entering values into some application without a good understanding of what is actually changing in the signal?

The IRE gave us precise guidance in the monitoring and adjustment of analog video parameters including amplitude, sync and timing along with defined and understandable units.

I ask again, where is the IRE in IP?

Editor’s Note: Gary Olson has a book on IP technology, “Planning and Designing the IP Broadcast Facility – A New Puzzle to Solve”, which is available at bookstores and online.

