If Metadata Isn’t At The Heart Of Your Workflow, You’re Doing It Wrong

“May you live in interesting times.” The expression suggests there is more than one interpretation of the word interesting. There can be no doubt that the media industry is living in those interesting times as it tries to navigate its way through a radical change in the way media is delivered to and consumed by the end customer.

Those customers are increasingly demanding that they be able to watch what they want, when they want, in the format that they want. This puts enormous pressure on the companies that create and distribute that content, as they must now prepare an ever-increasing number of variants of each piece of material in order to deliver media to a huge number of end points and maximise their revenue across this fragmented audience.

The fundamental problem here is that the “Total Available Market”, the target market for a particular piece of media (or series), is not expanding dramatically, so the total potential revenue for that piece, across all of the delivered versions, is not expanding either. But the cost of creating this myriad of deliverables increases in a largely linear fashion, and the process of adding a new channel or programme is slow and expensive. The only way to reconcile these realities is to minimise the number of manual steps and automate as many of the manufacturing processes as possible, as has been done in almost every other product-creating industry. The key to automating those processes is to harvest and utilise the available metadata. The diagram below outlines a typical modern workflow; we’ll examine how automation can streamline each functional area.

Figure 1. Note the multiple automatic QC stages in this workflow. Each step along the way can add metadata, which can enable downstream content to be automatically assembled and quality checked.

Onboarding: QC At The Source Is Required

Most intermediaries (such as MVPDs) in the content delivery food chain have their own set of delivery specifications, which they communicate to companies “upstream” of them via an SLA (Service Level Agreement). This is a vitally important document, as it describes exactly how a piece of media must be formatted in order to be accepted by that intermediary. Failure to adhere strictly to the specifications in that document will result in the media being rejected upon submission.
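To make those delivery requirements actionable by automation rather than by operators reading a PDF, the specification can be captured in machine-readable form. The sketch below is a minimal, hypothetical example; the field names and values are illustrative only and do not represent any real intermediary's SLA.

```python
# Hypothetical, machine-readable rendering of one SLA's delivery specification.
# Field names and values are illustrative examples only.
DELIVERY_SPEC = {
    "container": "MXF OP1a",
    "video_codec": "XDCAM HD422",
    "resolution": (1920, 1080),
    "frame_rate": "25",
    "audio_channels": 8,
}

def violations(file_properties: dict, spec: dict = DELIVERY_SPEC) -> list[str]:
    """Return the names of properties that do not match the delivery spec."""
    return [key for key, required in spec.items()
            if file_properties.get(key) != required]

# e.g. violations({"container": "MXF OP1a", "frame_rate": "29.97"})
# -> every non-matching or missing property, ready to feed a rejection report
```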

Some studies have suggested that as many as 70% of submitted files are rejected because of some error in construction. This is problematic for the supplying entity, as that content must be corrected and re-submitted, both of which add time and expense to the process. In many cases, such remediation requires that some number of highly skilled (and therefore expensive) operators review the video and audio content in real time in order to make corrections. Equally expensive engineering personnel may be required in order to correct mundane errors in file wrapper construction. This is exactly the opposite of the desired production workflow.

Many of these errors can be detected by automated file-based QC products. These products can identify clear problems (wherever they can be definitively identified by automatic means) or can flag areas of concern so that the expensive personnel only need to review the problem areas.

A classic example of this second activity would be in the area of black or silence detection. Automated processes can certainly detect that video has gone to black or audio has gone to silence for a period that exceeds some pre-programmed value. While current products can easily detect black and/or silence, they are unable to accurately determine whether this black or silent section is the result of a technical error or is, in fact, part of the creative intent of the producers. This is certainly an area in which the manufacturers of file-based QC products are considering the use of machine learning, so we will no doubt see improvements in this particular facet over time.
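As a rough illustration of that first, purely mechanical step, the sketch below finds sustained black spans from per-frame luma measurements and flags them for review rather than failing them outright. It assumes the per-frame luma values have already been extracted by an analysis tool, and the thresholds are invented for the example rather than taken from any specific QC product.

```python
# A minimal sketch of threshold-based black detection, assuming per-frame
# average luma values (0.0-1.0) have already been extracted by an analysis
# tool. Thresholds are illustrative, not from any specific QC product.
def black_segments(frame_luma: list[float], fps: float,
                   luma_threshold: float = 0.02,
                   min_duration_s: float = 2.0) -> list[tuple[float, float]]:
    """Return (start_s, end_s) spans where luma stays below the threshold
    for at least min_duration_s. Spans are flagged for human review, not
    automatically failed, since black may be creative intent."""
    segments, run_start = [], None
    for i, luma in enumerate(frame_luma):
        if luma < luma_threshold:
            run_start = i if run_start is None else run_start
        elif run_start is not None:
            if (i - run_start) / fps >= min_duration_s:
                segments.append((run_start / fps, i / fps))
            run_start = None
    if run_start is not None and (len(frame_luma) - run_start) / fps >= min_duration_s:
        segments.append((run_start / fps, len(frame_luma) / fps))
    return segments
```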

The ultimate goal of these products is to generate a simple pass/fail flag for the submitted file, along with a more detailed report that indicates all of the checks performed along with the results of those tests for this specific piece of media. The flag, and more importantly the report, instantly become valuable metadata for automated processes to act upon further down the workflow. A fail flag can instantly reject the media, sending a report back to the supplier to allow them to more quickly resolve any issue. Even in the case of a file passing the QC stage, the results of the tests can be harvested by the metadata management part of the workflow orchestrator, so that decisions can be made further down the processing chain.
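One way an orchestrator might consume that pass/fail flag and detailed report is sketched below. The report structure, check names and routing messages are all hypothetical; real QC products each produce their own proprietary report formats.

```python
# Hypothetical structures for a QC report and the routing decision an
# orchestrator might make from it.
from dataclasses import dataclass

@dataclass
class QCCheck:
    name: str          # e.g. "wrapper_conformance", "loudness", "black_detect"
    passed: bool
    detail: str = ""

@dataclass
class QCReport:
    asset_id: str
    checks: list[QCCheck]

    @property
    def passed(self) -> bool:
        return all(c.passed for c in self.checks)

def route(report: QCReport) -> str:
    """Reject failed media with a report for the supplier; otherwise store
    the results as metadata for downstream processing decisions."""
    if not report.passed:
        failed = [c.name for c in report.checks if not c.passed]
        return f"REJECT {report.asset_id}: {', '.join(failed)}"
    return f"ACCEPT {report.asset_id}: metadata stored for downstream stages"
```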

It should be noted that there is no standard format for the reports generated by the various manufacturers of QC software, so the metadata processing agent in the orchestration layer must be able to harvest metadata from several different report formats. In fact, many organisations utilise more than one QC product in order to minimise the number of “false positives”, so the metadata system will need to be able to harvest metadata from these systems simultaneously, and somehow correlate/collate the results for further use.
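A normalisation layer is one plausible way to handle that. The sketch below maps two invented vendor report formats onto a common view and only passes a check when every product that ran it agrees; both input formats are made up for illustration, which is precisely the point, since real vendor reports differ widely.

```python
# A minimal sketch of collating results from two QC products into one view.
# Both input formats are invented for illustration.
def normalise_vendor_a(report: dict) -> dict:
    # hypothetical format: {"media": ..., "tests": [{"id": ..., "result": "pass"|"fail"}]}
    return {t["id"]: t["result"] == "pass" for t in report["tests"]}

def normalise_vendor_b(report: dict) -> dict:
    # hypothetical format: {"asset": ..., "checks_run": [...], "failures": [...]}
    return {c: c not in report["failures"] for c in report["checks_run"]}

def collate(*normalised: dict) -> dict:
    """A check passes overall only if every product that ran it agrees,
    reducing false positives from any single tool."""
    merged: dict[str, bool] = {}
    for results in normalised:
        for check, ok in results.items():
            merged[check] = merged.get(check, True) and ok
    return merged
```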

Initial Processing: Normalization And Metadata Harvesting

The first process that a piece of approved media will encounter in the workflow will almost certainly be transcoding. Each MVPD will have its own internal “mezzanine” format, which streamlines downstream processes by reducing the number of input variations they must contend with. The management system can use metadata harvested from the QC system (via an agreed API) to select a specific transcode profile and thus provide source material to later processes in the mezzanine format.

A sophisticated metadata management system can use the harvested metadata (which may include an indication of ad breaks) to drive the transcode directly, removing another stage of human interaction. As in many processing stages, every such process should add (or at least modify) the metadata associated with the new output format, and the management system must be able to identify these variants as separate assets, or errors are almost certain to occur. Metadata is crucial to the operation in this new paradigm, so the metadata management system should harvest and store as much information as possible from the processes in this stage – this is particularly important as we move to the conforming and packaging phases.
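To make the idea concrete, the sketch below shows one way harvested QC metadata might drive profile selection and how the mezzanine output could be registered as a distinct asset that carries its metadata forward. The profile names, metadata keys and asset store are all hypothetical; the real mapping would be defined by each MVPD.

```python
# Hypothetical mapping from harvested QC metadata to a transcode profile,
# plus registration of the mezzanine output as a separate asset.
import uuid

PROFILE_MAP = {
    ("1080i", "25"): "mezz_hd_25i",
    ("1080p", "50"): "mezz_hd_50p",
    ("2160p", "50"): "mezz_uhd_50p",
}

def select_profile(qc_metadata: dict) -> str:
    key = (qc_metadata["scan_format"], qc_metadata["frame_rate"])
    return PROFILE_MAP.get(key, "mezz_default")

def register_mezzanine(source_asset_id: str, qc_metadata: dict, asset_store: dict) -> str:
    """Create a new asset record for the mezzanine output so it is never
    confused with its source, carrying the harvested metadata forward."""
    new_id = str(uuid.uuid4())
    asset_store[new_id] = {
        "derived_from": source_asset_id,
        "profile": select_profile(qc_metadata),
        "inherited_metadata": dict(qc_metadata),  # e.g. ad-break indications
    }
    return new_id
```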

To maximize value, today’s content is distributed to divergent audiences and geographies. A proper metadata management system can provide useful information so the required changes in imagery and languages can quickly be made.

Conforming And Packaging: There Is No One Size Fits All

Remembering that the target is to maximise revenue by delivering to different end points and geographies, we must recognize that there will always be a need to edit/conform material for a specific delivery. Airline cuts often require that certain scenes – especially those involving crashes – are removed from that variant. Cultural differences may mean that some content is considered objectionable in a specific geographic region.

This is likely an area where human interaction will still be required, but even here the metadata management system can provide useful information to the editors to speed up the process: a version of an English movie intended for distribution into French-speaking geographies can be created by simply adding a French dialog track to the video of the English version. Of course, this new version must be identified separately from the original; once again, metadata management is king. This is exactly what IMF was designed for: versioning material through the use of extensive metadata. Considering just how many localized variants are required in a modern workflow, automation here delivers a significant improvement in throughput. If a system is designed appropriately, the management system can even use parallel processing to generate the multiple variants simultaneously (as sketched below), reusing as many of the individual assets as possible, as identified by the management system using its metadata-driven media intelligence.
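The sketch below illustrates that parallel, asset-reusing approach: one shared video asset is paired with per-territory audio tracks and the variants are built concurrently. The build_variant function is a stand-in for whatever conform/packaging service an orchestrator would actually invoke; all names here are hypothetical.

```python
# A minimal sketch of generating localized variants in parallel, reusing the
# shared video asset. build_variant is a placeholder for a real conform/
# packaging job submission; all identifiers are hypothetical.
from concurrent.futures import ThreadPoolExecutor

def build_variant(video_asset: str, audio_asset: str, territory: str) -> dict:
    # placeholder for submitting a conform/packaging job and awaiting its result
    return {"territory": territory, "video": video_asset, "audio": audio_asset,
            "variant_id": f"{video_asset}-{territory}"}

def build_all_variants(video_asset: str, audio_by_territory: dict) -> list[dict]:
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(build_variant, video_asset, audio, territory)
                   for territory, audio in audio_by_territory.items()]
        return [f.result() for f in futures]

# e.g. build_all_variants("movie_v1_video",
#                         {"FR": "movie_fr_audio", "DE": "movie_de_audio"})
```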

A further complexity in non-subscription delivery models is the insertion of audience-specific advertising. In some cases, the audience-specific ads are placed directly into the media file (which must then be tracked as a separate item as detailed above). In other cases, no ads are inserted during the main processing, but metadata and in-band signalling methods are used to indicate exactly where an ad break begins and ends. Downstream processes then insert the appropriate ad into that slot at run time, and the appropriate metadata is delivered back to the central management software so that clients may be billed for the ads that ran on each delivery pipeline at each specific time. This is a more sophisticated system, to be sure, but is flexible enough to change ad payload at the last minute, maximising ad impact and positively affecting revenue.
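As a rough illustration of the second approach, the sketch below describes ad breaks as metadata attached to the asset and records what actually ran for billing. The data structures are invented for this example; in practice the in-band signalling itself is typically carried by standards such as SCTE-35.

```python
# Hypothetical structures for metadata-described ad breaks and the playout
# record returned to the central management software for billing.
from dataclasses import dataclass, field

@dataclass
class AdBreak:
    start_s: float
    duration_s: float

@dataclass
class PlayoutRecord:
    asset_id: str
    impressions: list[dict] = field(default_factory=list)

    def insert_ad(self, ad_break: AdBreak, ad_id: str, pipeline: str, aired_at: str):
        """Downstream playout fills the slot at run time and reports back
        what ran, where and when, so the advertiser can be billed."""
        self.impressions.append({
            "ad_id": ad_id,
            "pipeline": pipeline,
            "break_start_s": ad_break.start_s,
            "aired_at": aired_at,
        })
```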

Any QC process must be highly accurate to help ensure all requirements of an SLA are met. This requires proper metadata.

It will often make sense to add a final automated QC process after the packaging phase, to validate that the format requirements of the final delivery system(s) are met. In this case, the metadata generated by the QC system is harvested and fed into the enterprise’s business systems.

When evaluating management and orchestration systems, in addition to raw processing capabilities, consideration should also be given to a specific solution’s ability to scale its functionality as a media company’s throughput increases. As mentioned earlier, multi-screen, multi-format media production is the new norm, and the whole point of workflow automation systems is to allow a media company to process media more efficiently, both in terms of raw cost and in speed of execution.

A company that can publish a new channel more quickly and cost effectively than a competing company is going to continue to gain market share, which in turn means that their workflow automation systems must be scalable to match that success. This means that any desired solution should be constructed on a service-oriented architecture, which decouples the actual software processes from the number of platforms available to run them—whether they are on-site, virtualised in the cloud, or some hybrid of the two.

No matter what the chosen solution may be, it is abundantly clear that these workflow systems are only as successful as their metadata harvesting, management and interrogation systems. All metadata adds value, so this process should be of paramount importance when evaluating workflow orchestration systems. 

Kristan Bullett is joint Managing Director at Piksel.
