Myth and Reality of Auto-Correction in File-Based Workflows

File-based workflows are ubiquitous in the broadcast world today. The file-based flow has brought enormous efficiencies and has made it possible to adopt emerging technologies such as Adaptive Bit-Rate (ABR) streaming, 4K, UHD and beyond. Multiple delivery formats are now achievable because of file-based workflows and their integration with traditional IT infrastructure. However, the adoption of file-based flows comes with its own set of challenges. The first, of course, is: does my file contain the right media, in the right format and free of artifacts?

Auto QC is now an essential and widely used component of file-based workflows. This has triggered the need for a QC solution that can auto-correct errors in order to save time and resources, based on the thought that if a tool can detect an error, it can potentially also fix it. But auto-correction in the file-based world is a more complex process and should not be trivialized. A QC tool with built-in support for auto-correction, including transcoding, has issues of its own. Transcoding and re-wrapping processes, if not managed properly, can introduce fresh issues into corrected content, leading to further degradation of content quality. Hence, such auto-correction flows cannot be fully relied upon.

A more practical approach is to reuse facility-specific tools for encoding needs during the correction process. In such scenarios, the role of the QC tool is limited to baseband and metadata correction, or to configuring the transcoder correctly. A smarter in-place correction strategy can also be adopted for uncompressed content. Having said this, there is still a set of issues that requires manual intervention and thus cannot be auto-corrected. The scope of QC tools for auto-correction is therefore limited, but feasible for a defined set of issues, provided the right tools, workflows and techniques are used.
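To make the approach above concrete, here is a minimal, vendor-neutral sketch of how a QC report might drive such a correction workflow: wrapper/metadata issues are fixed by a lossless re-wrap (ffmpeg with stream copy is assumed here purely for illustration), baseband issues are handed off to the facility's own transcoder rather than re-encoded inside the QC tool, and anything else is escalated for manual review. The issue codes, the report structure and the facility_transcode hand-off are all hypothetical.

```python
import subprocess
from dataclasses import dataclass

@dataclass
class QcIssue:
    code: str   # hypothetical issue code, e.g. "WRONG_ASPECT_RATIO_FLAG"
    layer: str  # "wrapper", "baseband" or "unknown"

def rewrap_with_metadata_fix(src: str, dst: str, tags: dict[str, str]) -> None:
    """Lossless re-wrap: streams are copied, only container metadata changes."""
    cmd = ["ffmpeg", "-y", "-i", src, "-c", "copy"]
    for key, value in tags.items():
        cmd += ["-metadata", f"{key}={value}"]
    cmd.append(dst)
    subprocess.run(cmd, check=True)

def facility_transcode(src: str, dst: str, profile: str) -> None:
    """Placeholder for the facility's own, already-trusted transcoder."""
    raise NotImplementedError("wire this up to the in-house transcode farm")

def dispatch(src: str, dst: str, issues: list[QcIssue]) -> str:
    """Route a QC report to the least destructive correction path."""
    if not issues:
        return "no-correction-needed"
    layers = {issue.layer for issue in issues}
    if layers <= {"wrapper"}:
        # Safe to auto-correct: no decode/re-encode cycle is introduced.
        rewrap_with_metadata_fix(src, dst, {"comment": "QC auto-corrected"})
        return "auto-corrected"
    if layers <= {"wrapper", "baseband"}:
        # Hand the file to the facility transcoder rather than re-encoding
        # inside the QC tool itself.
        facility_transcode(src, dst, profile="house-mezzanine")
        return "sent-to-transcoder"
    # Anything the report cannot classify still needs a human.
    return "manual-review"

if __name__ == "__main__":
    report = [QcIssue(code="WRONG_ASPECT_RATIO_FLAG", layer="wrapper")]
    print(dispatch("master.mov", "master_fixed.mov", report))
```

The design choice this sketch illustrates is the one argued for in the text: the QC tool decides what kind of correction is safe, but the actual re-encode, where one is needed, stays with the facility's existing, trusted transcoding tools.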
