Reimagining archive as integral to production, preparation and delivery

Content owners have traditionally archived material as a resource worth keeping in perpetuity, but this simplistic approach is no longer fit for purpose. Too often, detailed knowledge about the content is unavailable, or the content is in a state where it cannot easily be unlocked. If assets are stored on disparate hard discs, siloed servers or on shelves as tape, considerable time, effort and cost are wasted in locating and retrieving content, and in collaborating on content creation and onward distribution. Time to air suffers and the archive is effectively devalued. An archive used solely for preservation or for legal compliance realises very little of its true value and barely justifies the ongoing cost of acquisition, documentation and maintenance. Only by reimagining the archive as a unified repository of assets, integral to production, content preparation and content delivery workflows, can its value be transformed.

How can you do that?

The starting point has to be the particular needs of the organisation. What do you want the archive to achieve in terms of production and distribution? Is there a need to keep content available for production or to organise it for on-demand and catch-up services? Which assets should be considered 'live' after transmission to optimise storage?
Armed with this strategy, organisations can begin implementation by structuring and generating metadata models.

While many facilities will have installed a MAM system for a portion of their workflow, these systems are unlikely to have the scale or intelligence to search for files, or relate them to one another, in any efficient, integrated or meaningful way.
This is because metadata is vastly more complex today than in the early days of tapeless workflows, when a single metadata set equated to one media file.

With the expansion of file-based media, multiple files were required to create different formats (e.g., distribution versions) or resolutions (e.g., proxy, high resolution) of the same content. To that mix was added subtitles, language tracks and captions for various outputs.

Metadata modeling

The situation continues to increase in complexity, to the extent that an individual file will make little sense in isolation. Take the broad asset for a soccer match, for example: it will include files for different resolutions of the game plus related logs and metadata. Deeper subsets might contain match highlights, feeds from individual camera angles, pre- and post-game interviews, press conferences, Twitter feeds and other user-generated input. All of these files and their associated metadata are bracketed by the same asset.

Similarly, a drama will be held as a master file with different audio and subtitle tracks in different languages. There may be versions of different lengths, re-edits for compliance, an AS-11 file for delivery to UK broadcasters, a Dolby audio file for theatrical release and so on. Related promos, shot lists, time-coded transmission information, scripts, production notes and cast lists all fall under the same asset.
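A hierarchical asset model of this kind can be sketched in a few lines of code. The class and field names below are purely illustrative and not taken from any particular MAM product; the point is that one logical asset brackets many files and sub-assets.

```python
from dataclasses import dataclass, field

@dataclass
class MediaFile:
    path: str
    role: str        # e.g. "proxy", "high-res", "AS-11", "Dolby audio"
    language: str = ""

@dataclass
class Asset:
    title: str
    files: list = field(default_factory=list)
    metadata: dict = field(default_factory=dict)
    children: list = field(default_factory=list)  # e.g. highlights, interviews

    def all_files(self):
        """Flatten every file under this asset and its sub-assets."""
        result = list(self.files)
        for child in self.children:
            result.extend(child.all_files())
        return result

# The soccer example: one asset bracketing resolutions and a highlights sub-asset.
match = Asset("Cup Final", files=[MediaFile("final_hires.mxf", "high-res"),
                                  MediaFile("final_proxy.mp4", "proxy")])
match.children.append(Asset("Highlights",
                            files=[MediaFile("highlights.mp4", "proxy")]))
print(len(match.all_files()))  # 3
```

Because the links are part of the model, a search that finds the highlights can surface the full match, the interviews and the rights information alongside it.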

A facility’s asset management system should be able to combine these different, complex asset models, and the links between them, in a way that is easily configurable to fit the user’s needs and that can evolve over time as new distribution needs arise.

Smart MAM

There are further attributes of a successful integrated MAM. For manual input, the system should guide users to enter the right metadata at the right time, depending on the context and type of content. Controlled vocabularies, thesauri and glossaries really help in this context.
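The controlled-vocabulary idea can be sketched as a simple validation step at entry time. The field names and terms below are invented for illustration; a real system would hold its vocabularies in the MAM configuration rather than in code.

```python
# Hypothetical controlled vocabularies, keyed by metadata field.
CONTROLLED_VOCAB = {
    "genre": {"news", "sport", "drama", "documentary"},
    "rights": {"all-media", "broadcast-only", "archive-only"},
}

def validate_entry(field_name, value):
    """Accept a value only if it belongs to the field's vocabulary."""
    allowed = CONTROLLED_VOCAB.get(field_name)
    if allowed is None:
        raise KeyError(f"Unknown metadata field: {field_name}")
    return value in allowed

print(validate_entry("genre", "sport"))      # True
print(validate_entry("genre", "football"))   # False
```

Rejecting free-text variants at the point of entry is what makes later search and federation reliable: "sport", "Sports" and "football" would otherwise fragment into three unrelated terms.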

When used in conjunction with other types of logging, such as scene detection, facial recognition and semantic interpretation, it is possible to enrich content automatically with relevant metadata and to clean up raw metadata.

The MAM should be able to seamlessly import external data feeds. Examples include XML exchange standards like SportsML, alongside feeds from Opta (which provides teams, rosters, logs and statistics), that bring valuable data into the context of sports material. Similarly, NewsML dope sheets provide extra valuable information about topics, location, context and broadcast rights.
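Ingesting such a feed amounts to mapping external XML onto the asset's metadata. The snippet below uses a deliberately simplified, hypothetical feed; real SportsML-G2 and Opta schemas are far richer and use XML namespaces, so this is a sketch of the principle only.

```python
import xml.etree.ElementTree as ET

# A simplified, made-up sports feed for illustration.
FEED = """
<sports-event>
  <team name="Home FC" score="2"/>
  <team name="Away United" score="1"/>
</sports-event>
"""

def ingest_feed(xml_text):
    """Turn a feed into key/value metadata ready to attach to an asset."""
    root = ET.fromstring(xml_text)
    return {t.get("name"): int(t.get("score")) for t in root.iter("team")}

print(ingest_feed(FEED))  # {'Home FC': 2, 'Away United': 1}
```

Once the feed lands as structured metadata rather than a sidecar document, it becomes searchable alongside everything else in the archive.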

With genealogy and metadata inheritance, it is possible to trace broadcast rights and descriptive metadata and act on them correctly. Ideally, when editing a new asset, the user won’t need to re-enter metadata again and again, reducing labour and the risk of error. Search and browsing technologies, such as federated search and navigation of the links between assets, help to quickly locate relevant content and enrich production with more archived material.
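Metadata inheritance can be sketched as a walk up the parent chain, with the child's own values taking precedence. The asset dictionary below is a hypothetical stand-in for a MAM's data store, not any vendor's API.

```python
def effective_metadata(asset, assets):
    """Merge metadata down the genealogy: parents first, child overrides last."""
    chain = []
    current = asset
    while current is not None:
        chain.append(current)
        current = assets.get(current.get("parent"))
    merged = {}
    for node in reversed(chain):          # apply oldest ancestor first
        merged.update(node.get("metadata", {}))
    return merged

assets = {
    "master": {"metadata": {"rights": "all-media", "title": "The Drama"}},
    "re-edit": {"parent": "master",
                "metadata": {"title": "The Drama (compliance cut)"}},
}
print(effective_metadata(assets["re-edit"], assets))
# {'rights': 'all-media', 'title': 'The Drama (compliance cut)'}
```

The compliance re-edit inherits the master's rights without anyone re-keying them, which is exactly the labour and error reduction described above.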

A top down vision

Regardless of the media type, whether news, sport, radio, long-form programmes or any other content that passes through a facility, it is imperative to think beyond storing files on tape libraries and disks, and instead to free that content up to work harder for you through the addition of rich, consistently applied metadata.

For this, a flexible asset management solution is required, one that is built around the archive to unlock all of its potential. This will be a MAM solution that has the intelligence to recognise links between metadata files, the power to search, browse and exchange assets, and that has the scale to expand as production and business planning demands. All of that requires a top down vision that removes archive from the sidelines and places it at the heart of future business.
