Monitoring & Compliance In Broadcast: File Based Monitoring In Production Systems

File based monitoring tools sit at the heart of broadcast workflows. As production requirements evolve to embrace remote production and multi-site teams, these systems must also evolve to meet the new challenges.
With increasingly complex workflows and requirements for multiple output file formats, the need for file based monitoring and quality control (QC) in production environments has never been greater.
Minor errors in live productions, while undesirable, are sometimes forgiven. Pre-event rehearsal and use of monitoring and quality control at this stage can normally catch any potentially catastrophic failures before the live event. Monitoring during the event is also useful to quickly identify where any errors are occurring in the production chain, and to see if a switch over to backup systems may be necessary (if they are present!).
In mainstream production, the costs associated with a “re-do” of a production, should unforeseen errors creep in and not be found before the delivery stage, are rarely calculated, yet they can be very expensive in both money and time. Back in the day producers could often be heard to say “we’ll fix it in post”, cue engineers falling to the floor.
With today’s complex workflows, the need for multiple output delivery formats and higher resolutions, and greater accessibility and localization requirements on top, all produced at very high speed, picking up errors within the process is becoming vital.
Selecting Monitoring Points
When considering monitoring, there are a number of basic decisions and factors to take into account right at the beginning; QC and monitoring should not be an afterthought at the final delivery stage.
Firstly, decisions on monitoring points throughout the production systems should be taken. Where should monitoring take place, and at what depth? At what speed does it need to happen, and where does the reporting go?
When filming live, attention is often paid to ensuring camera setups and mixing are all correct. However, when the output, whether locally mixed or sent to a remote production hub, travels over networks, errors can creep in along the way.
Consideration should therefore be given to monitoring files as they are ingested, whether that be to cloud-based storage, on-premise central servers, or individual locations. Different external production locations may all be sending files to a central location at the same time, and identifying whether one or more of these has a problem needs to be done quickly and efficiently. Automated monitoring processes at this stage can solve this.
For many media file formats (though not all), current monitoring systems are capable of analyzing growing files during the ingest process and, if errors are detected, enabling the file transfer to be stopped, saving time and expense.
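As a simple illustration of the principle, and not any particular vendor’s implementation, the Python sketch below polls a growing file during ingest and hands each newly written region to a lightweight check. The check_growing_segment function is a hypothetical stand-in for whatever partial-file analysis a given QC product exposes, and a failed check is the signal a transfer controller would use to stop the transfer.

```python
# Minimal sketch of growing-file QC during ingest. check_growing_segment is a
# hypothetical placeholder for a real partial-file check.
import os
import time

def monitor_growing_file(path, poll_seconds=5, stable_polls=3):
    """Poll a file as it is ingested; return False on QC failure, True when complete."""
    last_size, unchanged = -1, 0
    while unchanged < stable_polls:            # complete when size stops growing
        size = os.path.getsize(path)
        if size == last_size:
            unchanged += 1
        else:
            unchanged = 0
            if not check_growing_segment(path, max(last_size, 0), size):
                return False                   # caller stops the transfer here
        last_size = size
        time.sleep(poll_seconds)
    return True

def check_growing_segment(path, start, end):
    """Placeholder for a partial-file check (e.g. container syntax, timecode continuity)."""
    return True
```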
Once in production hubs, transcoding operations to multiple file formats will almost certainly take place, and certain errors can get “baked in” during the transcoding process. These are, almost by definition, difficult, if not impossible, to fix.
As part of the editing process, more video and audio errors can be created, so QC is also valuable at this stage. Finally, pre-delivery QC should be performed to ensure the integrity of the file being delivered. In many cases where a production is delivering to a larger media organization, that organization will require a QC report to accompany the delivery, possibly with specific test requirements that it sees as essential to ensure smooth, error-free playout, whether that be via streaming or otherwise.
In certain countries, there are also legally required checks, such as Photosensitive Epilepsy requirements in the UK and Japan.
Evaluating Requirements
At any of these points in the production process, decisions should be taken about the depth of checking that is to be performed at each stage.
It could be as simple as basic file conformance checking: confirming that the file is in fact the expected format, checking at container level and checking that the metadata is correct, all of which requires only lightweight QC testing. Or it could be more in-depth testing, requiring the QC system to perform a full decode in order to check for video and audio errors. The depth of the checking will have a direct effect on the processing power the QC system needs and the speed with which the checks can be performed.
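To make the distinction concrete, the sketch below uses the open-source ffprobe and ffmpeg tools as stand-ins for a commercial QC engine: the first function reads only container and stream metadata, while the second performs a full decode and collects any decoder errors, which is far more CPU intensive.

```python
# Lightweight container/metadata check vs. full-decode check, illustrated with
# the open-source ffprobe/ffmpeg command line tools.
import json
import subprocess

def container_check(path):
    """Lightweight: read container and stream metadata without decoding essence."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_format", "-show_streams", "-of", "json", path],
        capture_output=True, text=True, check=True)
    info = json.loads(out.stdout)
    return info["format"]["format_name"], [s["codec_name"] for s in info["streams"]]

def full_decode_check(path):
    """In-depth: decode every frame and collect any decoder errors (CPU intensive)."""
    result = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", path, "-f", "null", "-"],
        capture_output=True, text=True)
    return result.stderr.splitlines()   # each line is a reported decode problem
```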
Video and audio checks may be of the obvious kind, such as blurriness and blockiness, black frames, loudness errors, or even a “Media Offline” slate left in as part of the edit, or they may be of a kind less obvious to human eyes and ears, which auto-QC can still pick up. If subtitling or captioning is present, this may also require checking.
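As a rough example of two such baseline checks, the sketch below again leans on ffmpeg filters rather than any specific QC product: blackdetect reports runs of black frames and ebur128 measures loudness against EBU R 128. A production QC engine would implement far more, but the shape of the check is similar.

```python
# Baseline black-frame and loudness checks illustrated with ffmpeg filters.
import subprocess

def detect_black_and_loudness(path):
    # blackdetect reports black runs longer than 0.5s below a 10% luma threshold.
    black = subprocess.run(
        ["ffmpeg", "-hide_banner", "-i", path,
         "-vf", "blackdetect=d=0.5:pix_th=0.10", "-an", "-f", "null", "-"],
        capture_output=True, text=True)
    # ebur128 logs programme loudness measurements per the EBU R 128 model.
    loud = subprocess.run(
        ["ffmpeg", "-hide_banner", "-i", path,
         "-filter_complex", "ebur128", "-f", "null", "-"],
        capture_output=True, text=True)
    # Both filters log findings to stderr; a real QC system would parse these
    # into structured report entries rather than returning raw text.
    return black.stderr, loud.stderr
```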
QC Systems
In complex production systems, with multiple hubs and supply chains, some form of media asset management (MAM) or orchestration system will generally be overseeing the various processes and directing the chains automatically. In these cases, decisions on when and what to monitor come in as part of those management chains, and automated control of the chains may happen as a direct result of analysis of the QC reports.
It may be as simple as the QC system monitoring a folder and performing QC on any file it sees there, perhaps with an initial analysis to decide which checks to perform based on the type of file, and redirecting any file that fails the QC check. Or there may be much more sophisticated control and management, generally driven by API calls.
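A minimal watch-folder arrangement might look something like the sketch below, in which run_qc is a hypothetical stand-in for the QC engine (in practice an API call to the QC or orchestration system) and files are simply routed to pass or fail folders according to the result.

```python
# Minimal watch-folder sketch: files appearing in "incoming" are checked and
# moved to a pass or fail folder. run_qc is a hypothetical QC hook.
import shutil
import time
from pathlib import Path

WATCH_DIR, PASS_DIR, FAIL_DIR = Path("incoming"), Path("passed"), Path("rejected")

def run_qc(path: Path) -> bool:
    """Hypothetical QC call; choose a test plan by file type, return pass/fail."""
    return path.suffix.lower() in {".mxf", ".mov", ".mp4"}

def watch_folder(poll_seconds=10):
    for d in (WATCH_DIR, PASS_DIR, FAIL_DIR):
        d.mkdir(exist_ok=True)
    while True:
        for f in WATCH_DIR.iterdir():
            if f.is_file():
                target = PASS_DIR if run_qc(f) else FAIL_DIR
                shutil.move(str(f), str(target / f.name))  # redirect per QC result
        time.sleep(poll_seconds)
```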
Reports can also be exported so that their data can be utilized by other systems, for example by importing QC report data into editing timelines.
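The sketch below illustrates the idea with a hypothetical JSON report format (the field names are invented for illustration, not any vendor’s schema), converting each reported error into a timecoded marker that could be imported onto an editing timeline.

```python
# Convert a hypothetical JSON QC report into frame-accurate timeline markers.
import json

def report_to_markers(report_path, fps=25):
    with open(report_path) as f:
        report = json.load(f)
    markers = []
    for error in report.get("errors", []):   # assumed structure: a list of errors
        frame = int(round(error["offset_seconds"] * fps))
        markers.append({"frame": frame, "note": f"{error['check']}: {error['detail']}"})
    return markers
```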
Automation & AI
Instructions on what to do with a file, or with a QC report, according to the QC result are currently fairly straightforward. However, with the advent of these more complex workflows and increased possibilities for automation, Machine Learning and AI, we are starting to see potential for more “active” monitoring and QC. AI and Machine Learning can already be used very effectively in identifying languages in audio tracks, and in the QC of captions and subtitling.
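As one example of what is already practical, the sketch below uses the open-source Whisper model to identify the language of an audio track; a production QC system would wrap this kind of model behind its own API and apply its own confidence thresholds.

```python
# Automated language identification on an audio track using the open-source
# Whisper model (pip install openai-whisper) as an illustrative example.
import whisper

def identify_language(audio_path):
    model = whisper.load_model("base")
    result = model.transcribe(audio_path)   # language is auto-detected when unspecified
    return result["language"]
```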
Currently there are still certain elements of file based QC and monitoring that humans do best, and there may always be creative decisions that automated systems see as a fault but human eyes can see as deliberate. However, QC and monitoring are becoming increasingly sophisticated with the inclusion of AI, which holds promise for increased automated decision making.
Automated analysis of content to identify elements within the picture, either desirable or undesirable, is within the scope of currently available systems. As AI makes its way into other areas of production, perhaps AI analysis of the script will enable QC and monitoring systems to take creative decisions.
In the future, broadcasters and service providers will increasingly rely on Artificial Intelligence (AI) and Machine Learning (ML) technologies, combined with computer vision techniques, to improve content quality.
From a recent White Paper on The Evolution of Content Creation and Delivery: “These technologies have a huge role to play in helping service providers develop more intricate QC algorithms. Next-gen algorithms will utilize these technologies along with natural language processing, visual text recognitions and other methods to accurately detect video and audio that includes violence, explicit content, alcohol, smoking and more. General content classifications such as the nature of advertisements, celebrity identification, and presence of brands or objects within content is also becoming simpler with these new tools.”
However, VOD and OTT delivery require a much faster and wider variety of metadata generation for content classification. In globalized systems, detailed content classification is key to allowing fast and accurate localization of content.
Another promising development is the potential use of AI to assist with the corrective actions needed once errors are found by the automated QC system. Quite extensive training of the AI is needed to achieve this, but models trained for the most commonly used checks and file formats can lead to automated fixing of certain errors.
Managing Multi-Site Resources
On the control and management side, decision making on resource requirements is becoming increasingly sophisticated, typically within cloud-based environments, and this is contributing to more effective use of processing power.
Modern media management systems are capable of analyzing multiple production workflows across multiple sites, and of controlling the effective usage of the networks, server instances, storage, and processing power required. While processor usage for monitoring and QC systems generally remains fairly steady during normal operations, the ability to spin up extra QC resources, just for the duration of a larger live event or a specific project such as archive digitization, is valuable.
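The scaling decision itself can be quite simple. The sketch below is an illustrative rule only, sizing the pool of QC workers from the depth of the pending-file queue, with scale_to standing in for whichever cloud or orchestration API an organization actually uses.

```python
# Illustrative autoscaling rule for QC workers based on queue depth.
# scale_to is a hypothetical hook into a cloud or orchestration API.
def desired_workers(pending_files, files_per_worker=20, min_workers=2, max_workers=50):
    needed = -(-pending_files // files_per_worker)   # ceiling division
    return max(min_workers, min(max_workers, needed))

def rebalance(queue_depth, scale_to):
    # e.g. adjust an autoscaling group or container deployment to the target size
    scale_to(desired_workers(queue_depth))
```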
For larger operations with global resources, the ability to manage throughput across different regions, taking advantage of peaks and troughs of usage, is becoming more of a reality in production operations, though consideration still needs to be given to potential networking issues and latency. A combination of local copies and transfer over networks can often represent the best way of working. For this to be effective, as a minimum, all QC and monitoring systems should be on identical software versions, and should carry the full range of test profiles or test plans needed across the organization.
Deep consideration should be given to storage locations and how monitoring systems can best access them. In addition, if external organizations are delivering assets from multiple locations, security and encryption, generally via DRM, may require QC and monitoring systems to be able to decrypt the content in order to perform checks on the incoming files.
Conclusion
Finally, while the application of full automation in production systems is providing great benefits in terms of speed of production, and offers production staff greatly extended creative possibilities, it also makes high quality, frequent monitoring essential at all stages of the production process.
If something does go wrong, speed is also of the essence in identifying where and when the error occurred. Without this, the chances of being able to fix the problem in automated workflows before errors cascade are limited.