Test, QC & Monitoring Global Viewpoint – December 2017

Test & Measurement in an IP World

Just like video equipment, test and measurement is evolving from hardware to software, with more and more intelligence incorporated into the product. The engineer sitting at a bench has been largely replaced with automated quality control (QC) to meet the needs and workload of multi-platform delivery. The engineer now performs more of an investigative and trouble-shooting role.

Test and measurement for broadcast equipment falls into two areas. One is for manufacturers, on the test-bench or in research and development. The other is for broadcast engineers installing equipment, or for ongoing maintenance and system quality assurance. Although some of the needs may cross over, in general the requirements of the broadcaster are simpler than those of the R&D engineer.

The original use of test and measurement in broadcast operations was to line up equipment and check a few basic parameters of the baseband video. This included checking black level, peak white, color phase, noise, color gamut, and timing. The classic tools for this were the waveform monitor (WFM) and vectorscope. In the days of analog video, equipment needed regular line-up and checks to counteract the temperature drifts and ageing which affected performance.

The move from analog to digital meant that many of the issues with analog processing, drift in levels, gain, and phase went away, but nonetheless, the role of test and measurement has expanded by orders of magnitude. Cable and IPTV multiplied the number of streams to be monitored. Over-the-air (OTA), cable, and IPTV all use allocated bandwidth with a high quality of service (QoS) delivery. The world of over-the-top (OTT) is different, and adaptive bit rate (ABR) is the order of the day in order to make best use of the available bandwidth. Not only is the number of streams multiplied by a large factor, but the use of the public internet for delivery adds the need to monitor the QoS.

Although the IP bitstream is very different from analog composite, parts of the user interface would be familiar to an engineer from the 1960s. Even though we have left behind the quadrature modulation of the color-difference channels used in NTSC and PAL, the vectorscope display remains as a familiar representation of the color difference signals. Similarly, the R, G & B channels are still represented as sequential, or parade waveforms.

The Phabrix Qx 12G provides IP, 4K/UHD and HDR generation, analysis and monitoring.

No More Dropouts

The move to file-based workflows has eliminated one possible source of distortion, and with it the need to test for those faults. File-based workflows replaced videotape, which could suffer from head clogs, dropouts and other problems that impacted video quality. Although tape was very reliable, each record operation still needed checks at the start, middle and end.

The move away from analog and videotape has by no means eliminated quality issues with video and audio. The move to file-based operation and live IP streams is only increasing the number of test and measurements that need to be performed to ensure that the content delivered to the consumer is at a suitable quality level.

Since the move to digital processing, test and measurement has changed from tools for checking the performance of a system to more of a monitoring role. Checks take place on files and streams rather than equipment. Measurements on equipment are now the preserve of the R&D engineer or the commissioning engineer.

This move to monitoring inevitably gave rise to automation. Stations had long left it to the equipment to check for extended periods of black and silence or out of specification signals. Automated quality control has evolved ever since, relieving operational engineers of the need for constant eyeballs on the test gear.
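The black-and-silence check described above can be sketched in a few lines. The thresholds, window sizes and minimum run length below are illustrative assumptions, not values taken from any particular product:

```python
import math

# Illustrative thresholds -- real QC systems make these configurable.
BLACK_LUMA_THRESHOLD = 16       # 8-bit luma at or below nominal black
SILENCE_RMS_THRESHOLD = 0.001   # linear audio RMS treated as silence

def is_black_frame(luma_samples):
    """Flag a frame whose mean 8-bit luma sits at or below black level."""
    return sum(luma_samples) / len(luma_samples) <= BLACK_LUMA_THRESHOLD

def is_silent(audio_samples):
    """Flag an audio window whose RMS falls below the silence threshold."""
    rms = math.sqrt(sum(s * s for s in audio_samples) / len(audio_samples))
    return rms < SILENCE_RMS_THRESHOLD

def black_and_silence_alarm(frames, audio_windows, min_run=125):
    """Raise an alarm when black video coincides with silent audio for at
    least min_run consecutive frames (e.g. 5 seconds at 25 fps)."""
    run = 0
    for luma, audio in zip(frames, audio_windows):
        if is_black_frame(luma) and is_silent(audio):
            run += 1
            if run >= min_run:
                return True
        else:
            run = 0
    return False
```

Requiring a minimum run length keeps a single intentional black frame, such as a cut to black between items, from raising a false alarm.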

However, the advent of compression introduced a whole different set of problems to look out for. The artifacts of excessive compression are easy for a human observer to spot, but they present more of a challenge for automated monitoring equipment.

The Need to Measure Increases

The move from linear broadcast, a simple chain through to a transmitter, to multi-platform delivery has created a rapidly expanding set of quality checks. Now that OTA is not the sole delivery method, and a live stream can be delivered in many, many ways, the need for checks only multiplies.

The transcoding of a file or stream to multiple spatial resolutions and frame rates, to different compression standards, with different levels and profiles, means that hundreds of versions of a file or stream could be required to meet the requirements of different delivery platforms.
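The combinatorics behind that version count are easy to demonstrate. The ladder below is a hypothetical example, not any platform's actual requirements:

```python
from itertools import product

# Hypothetical delivery ladder -- the real set is dictated by each
# target platform's specification.
resolutions = ["1920x1080", "1280x720", "960x540", "640x360"]
frame_rates = [50, 25]
codec_profiles = ["h264-high", "h264-main", "hevc-main10"]

# Every combination of resolution, frame rate and codec profile is a
# separate rendition that must be produced and quality-checked.
renditions = [
    {"resolution": r, "fps": f, "codec": c}
    for r, f, c in product(resolutions, frame_rates, codec_profiles)
]
# 4 resolutions x 2 frame rates x 3 codec profiles = 24 renditions,
# before audio variants, captions and per-platform packaging multiply
# the total further.
```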

OTT streaming uses many formats: Adobe HTTP Dynamic Streaming (HDS), Apple HLS, Microsoft Smooth Streaming and MPEG Dash amongst many others. These formats adapt the stream rate to fit the available bandwidth. The content providers have to ensure the quality of service for all these different streams, as well as legacy-method quality checks on the OTA signals. Targeted ad insertion only adds to the number of checks.
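The rate-adaptation idea shared by these formats can be sketched as a simple rung-selection rule. Real players use far more sophisticated heuristics, and the headroom factor here is an assumed safety margin:

```python
def pick_rendition(ladder_kbps, measured_kbps, headroom=0.8):
    """Choose the highest bitrate rung that fits within a safety margin
    of the measured throughput; fall back to the lowest rung otherwise.

    headroom is an assumed safety factor, leaving 20% slack so the
    player can absorb short-term throughput dips without stalling.
    """
    budget = measured_kbps * headroom
    candidates = [b for b in ladder_kbps if b <= budget]
    return max(candidates) if candidates else min(ladder_kbps)
```

For example, with a ladder of 400, 1200, 2500 and 5000 kbps and a measured throughput of 4000 kbps, the 80% headroom leaves a 3200 kbps budget, so the player selects the 2500 kbps rung.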

The coexistence of SD and HD, and now UHD, calls for resolution transforms (uprezzing and downrezzing), and there is the need to de-interlace legacy content. This calls for extensive use of transcoding in the distribution pipeline.

Since the ratification of BT.2020, engineers have different color spaces to contend with, along with transforms between BT.709, DCI-P3 and BT.2020. BT.2100 adds further complexity with high dynamic range (HDR), meaning that HDR and standard dynamic range (SDR) versions of content will coexist.

Bridge Technologies VB288 performs objective video and audio monitoring of MPEG-2, h.264/MPEG-4 and h.265/HEVC streams. The content extractor enables operators to inspect massive amounts of content services beyond human eyeball capability with dependable alarming on objective parameters having a QoE impact.

Multiple Delivery Formats

As test and measurement has evolved into comprehensive quality control and assurance, both for production processes and for the final content delivered to the viewer, different classes of checks emerge.

  1. Does the original file/stream conform to the relevant broadcaster’s technical specifications?
  2. Does the audio track configuration conform to the delivery specification? Are closed captions in the correct format?
  3. Are the audio and subtitles in the correct language?
  4. Are the title and credits the correct language version?

Once a distribution master exists, then a series of transforms and transcodes create the delivery formats.

  1. Have transcodes introduced unacceptable artifacts?
  2. Are the MPEG bitstreams compliant?
  3. Again, are the correct audio, captions, titles and credits packaged with the delivery streams/files?
  4. Are the distribution networks introducing jitter, latency and packet loss to IP streams, which is compromising video delivery?
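A minimal sketch of how checks like those above might be automated, assuming the asset's metadata is available as a dictionary. The field names and acceptable values are illustrative; a real QC system would inspect the essence itself, not just declared metadata:

```python
def run_qc_checks(asset, checks):
    """Run each named check against an asset description and collect
    the names of any checks that fail."""
    failures = []
    for name, check in checks:
        if not check(asset):
            failures.append(name)
    return failures

# Hypothetical asset metadata and conformance checks.
asset = {"audio_layout": "stereo", "captions": "cea-708", "language": "en"}

checks = [
    ("audio layout", lambda a: a.get("audio_layout") in {"stereo", "5.1"}),
    ("caption format", lambda a: a.get("captions") == "cea-708"),
    ("language", lambda a: a.get("language") == "en"),
]
```

An empty failure list lets the asset proceed down the pipeline; a non-empty one routes it to an engineer for the investigative work described earlier.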

The sheer scale of the number of measurements that must be performed to ensure a good quality of experience to the viewer has naturally led to automated quality control systems.

Some measurements are arithmetic, like color gamut, white and black levels, but the picture quality of compressed images, especially after transcoding, can exhibit artifacts that, although easily visible to the eye, are more difficult to quantify than the old analog measurements. This poses a challenge to automated QC systems.
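The arithmetic measurements are straightforward to express. A minimal sketch of a narrow-range level check, using the nominal 8-bit black (16) and peak white (235) levels of narrow-range BT.601/709 video:

```python
def level_violations(luma_samples, black=16, white=235):
    """Count 8-bit luma samples outside nominal narrow video range
    (black level 16, peak white 235)."""
    return sum(1 for s in luma_samples if s < black or s > white)
```

A handful of out-of-range samples may be tolerable; an automated QC system would typically alarm only when violations exceed a configured percentage of the frame.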

Compression Artifacts

Peak signal-to-noise ratio is used to give an approximate indication of compression artifacts, but to correspond to the perception of the artifacts by the human visual system a more sophisticated measurement algorithm is needed. Several vendors now have products that analyze picture quality in a way that better matches our perception of compression artifacts.
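PSNR itself is simple to compute, which is part of its appeal. A minimal sketch over 8-bit samples:

```python
import math

def psnr(reference, degraded, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length 8-bit
    sample sequences; higher is better, identical inputs are infinite."""
    mse = sum((r - d) ** 2 for r, d in zip(reference, degraded)) / len(reference)
    if mse == 0:
        return math.inf
    return 10.0 * math.log10(peak * peak / mse)
```

Its weakness is exactly what the article notes: PSNR weights every pixel error equally, whereas the eye is far more sensitive to structured artifacts such as blocking than to the same energy spread as random noise.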

As resolutions increase to UHD, the old saying “the wider you open the window, the more dirt flies in” applies. Viewers with UHD displays can resolve artifacts more easily and perhaps be less forgiving than with SD and HD imagery.

Artifacts are more apparent as resolution increases

QC could be considered an overhead, but for the service provider it is essential in order to comply with service-level agreements. For the broadcaster or content publisher, quality is an inherent part of the brand. Test and measurement has changed from the basic check and line-up of analog days to a complex set of measurements covering all aspects of multi-format delivery.

Test and measurement is changing dramatically. What was once the preserve of the bench engineer now sees the addition of automation as a core part of assuring audio and video quality. Automated QC helps to ensure viewers see the best achievable quality for their chosen means of delivery, ranging from the small mobile devices used in emerging nations through to future UHD over-the-air.
