Video Dropouts and the Challenges they Pose to Video Quality Assessment

The media industry is rapidly adopting file-based workflows in all stages of the content lifecycle, including transcoding, repurposing and delivery. Media transformations can introduce additional complexities which, if not handled properly, lead to issues in the video perceived by the end consumer. These issues stem from errors introduced by media capture devices, encoding/transcoding devices, editing operations, and pre- or post-processing operations. A significant majority of video issues today are due to the loss or alteration of coded or uncoded video information, resulting in distortion of the spatial and/or temporal characteristics of the video. These distortions in turn manifest as video artefacts, termed hereafter video dropouts. Detection of such video quality (VQ) issues in the form of dropouts is gaining importance in the workflow quality checking and monitoring space, where the goal is to ensure content integrity, conformance to encoding standards and metadata fields, and, most importantly, the perceived quality of the video that is ultimately delivered. This end video quality can certainly be measured and verified through manual checking, as was traditionally the case. However, manual monitoring is tedious, inconsistent, subjective, and difficult to scale across a media farm.

Automated video quality detection methods are gaining traction over manual inspection: they are more accurate, offer greater consistency, can handle large amounts of video data without loss of accuracy and, moreover, can be upgraded easily as parameters and standards change. However, automatic detection of video dropouts is complex and a subject of ongoing research. The source at which an artefact is introduced has a bearing on how the artefact manifests itself. Automatically detecting the variety of manifestations of video dropouts requires complex algorithmic techniques and is at the heart of a “good QC tool”. This paper discusses the various kinds of video dropouts, the sources of these errors, and the challenges encountered in detecting them.
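To give a flavour of how such automated checks work, the sketch below flags two of the simplest dropout manifestations, frozen frames and black frames, by comparing mean pixel values across consecutive frames. This is a minimal illustration, not any particular QC tool's algorithm; the thresholds and function names are hypothetical, and real detectors must cope with noise, fades, and legitimate still content.

```python
# Minimal sketch of rule-based dropout detection (illustrative only).
# Frames are 8-bit grayscale images given as lists of rows of pixel values.
# Thresholds below are hypothetical, not taken from any standard.

FREEZE_DIFF_THRESHOLD = 1.0   # mean abs difference below this => frozen frame
BLACK_LUMA_THRESHOLD = 16.0   # mean luma below this => black frame


def mean_abs_diff(a, b):
    """Mean absolute pixel difference between two equally sized frames."""
    total = sum(abs(pa - pb)
                for row_a, row_b in zip(a, b)
                for pa, pb in zip(row_a, row_b))
    return total / (len(a) * len(a[0]))


def mean_luma(frame):
    """Mean pixel value of a frame."""
    return sum(sum(row) for row in frame) / (len(frame) * len(frame[0]))


def detect_dropouts(frames):
    """Return (frame_index, issue) tuples for suspected black/frozen frames."""
    issues = []
    for i, frame in enumerate(frames):
        if mean_luma(frame) < BLACK_LUMA_THRESHOLD:
            issues.append((i, "black"))
        if i > 0 and mean_abs_diff(frames[i - 1], frame) < FREEZE_DIFF_THRESHOLD:
            issues.append((i, "frozen"))
    return issues
```

For example, a three-frame clip in which frame 1 repeats frame 0 and frame 2 is all zeros would be reported as `[(1, "frozen"), (2, "black")]`. In production, this kind of per-frame rule is only a first pass; as the text notes, the variety of dropout manifestations is what makes robust detection genuinely hard.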
