MAM is Dead. Long Live Media Logistics—Part 3

In the third and final part of BroadcastBridge’s MAM feature we contend that MAM as we’ve known it is dead, and that today’s broadcasters and content delivery firms want a media logistics solution encompassing all ingest, production, distribution and archive, with rich metadata including rights. If so, are the tools in most MAMs capable of ‘orchestrating’ all of these assets?

Here are the comments of Tony Taylor, CEO of TMD.

TT: Many MAM solutions have been designed around a siloed approach. This has been typical of the way software has been developed in the broadcast industry for many years. Even now I find it incredible when I hear some of the stories of MAM implementations that have taken no account of joining up the business of media across organisations.

That joining up has to start with the metadata. The successful media businesses are those who realise the value of the metadata which exists alongside the content, and implement a MAM solution that uses it to the fullest extent possible.

There can be no argument that the future will be around file-based workflows in data centre environments. This depends upon metadata: acting on it, reacting to it and enriching it as it passes between and through facilities. The protection and enrichment of metadata has always been at the heart of any asset management system worth the name, and today it is the only logical place to put the workflow orchestration layer.

If workflow orchestration is about drawing on and adding to metadata, why would you even consider putting orchestration in a separate system? It has to be in the system which is charged with holding the metadata.
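The principle above can be sketched in a few lines of Python. This is a hypothetical illustration, not a real TMD or MAM API: every routing decision reads the asset's metadata and writes its outcome back into the same record, so orchestration and metadata live in one place.

```python
# Illustrative sketch: orchestration inside the asset management layer.
# All names here are invented for the example.

class Asset:
    def __init__(self, asset_id, metadata):
        self.asset_id = asset_id
        self.metadata = metadata  # single source of truth


def orchestrate(asset, rules):
    """Run each workflow rule against the asset's metadata and
    enrich that same metadata with the outcome."""
    for rule in rules:
        if rule["when"](asset.metadata):
            result = rule["action"](asset)
            asset.metadata.setdefault("history", []).append(
                {"step": rule["name"], "result": result}
            )
    return asset


# Example rule: queue a web transcode only when rights clear web delivery.
rules = [{
    "name": "web_transcode",
    "when": lambda m: "web" in m.get("cleared_platforms", []),
    "action": lambda a: f"transcode queued for {a.asset_id}",
}]

asset = Asset("EP-001", {"cleared_platforms": ["web", "vod"]})
orchestrate(asset, rules)
```

Because the rules only ever see the metadata record, adding a new platform or process is a metadata change, not a change to a separate orchestration system.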

Content preparation and delivery firms are required to deliver assets to an ever-increasing variety of platforms. How have manufacturers helped content companies gear up for life in a multi-platform world?

TT: You have to think in terms of layers. At the bottom is the hardware: the servers, the encoders and transcoders, and the content delivery networks. Above that is a control layer, which tells the hardware what to do with each piece of content.

Above that is the business layer. This is where executives look at the economics of the operation and make commercial decisions. In a modern media enterprise, these executives should be able to make decisions based on purely commercial considerations, not what the technology allows them to do.

The middle layer is the asset and workflow management. Its rich metadata captures all the information on the content: what rights are available; when and where it can be shown; what content needs to take priority through the encode farms and more. Most important, the asset and workflow management system should both be controlling the hardware at the bottom, and reporting and responding to the business systems above it.

Put simply, a CEO should be able to look at one screen – familiar to him or her because it is in the enterprise management layer – and make a decision to, say, put a particular programme on iTunes. That decision should pass automatically to the workflow management system which will draw on the technical metadata to determine precisely which processes are required, and implement them at the right time, again fully automatically.
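The hand-off described above can be sketched as follows. This is a minimal, hypothetical example: the platform profiles, field names and steps are invented, but it shows how a purely commercial decision ("put this programme on iTunes") resolves into concrete technical processes using only the asset's technical metadata.

```python
# Hypothetical delivery planner: business decision in, process list out.
# Profile requirements are illustrative, not real platform specifications.

PLATFORM_PROFILES = {
    "itunes": {"container": "mov", "max_height": 1080},
    "web": {"container": "mp4", "max_height": 720},
}


def plan_delivery(decision, tech_metadata):
    """Translate a business decision into an ordered list of processes,
    derived entirely from the technical metadata."""
    profile = PLATFORM_PROFILES[decision["platform"]]
    steps = []
    if tech_metadata["container"] != profile["container"]:
        steps.append(f"rewrap to {profile['container']}")
    if tech_metadata["height"] > profile["max_height"]:
        steps.append(f"downscale to {profile['max_height']}p")
    steps.append(f"deliver to {decision['platform']}")
    return steps


plan = plan_delivery({"platform": "itunes"},
                     {"container": "mxf", "height": 2160})
# plan: ['rewrap to mov', 'downscale to 1080p', 'deliver to itunes']
```

The executive never sees the rewrap or downscale steps; the workflow layer derives them automatically, which is exactly the separation of business and technical layers the answer describes.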

What are the tools to create, deliver and store files and metadata for broadcast, VoD, mobile and web in one workflow?

TT: The very simple answer to that is a rich metadata schema. If the asset and workflow management system knows all there is to know about the content, from rights to resolution, then it can command whatever other equipment is around to make all these things happen.
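As a rough illustration of what "a rich metadata schema" might look like, here is a small sketch. The field names are assumptions for the example, not a published schema, but they combine rights, technical facts and processing priority on one record, as described above.

```python
# Illustrative metadata record: rights, resolution and priority together.
# Dates are ISO-8601 strings, which compare correctly as plain strings.

from dataclasses import dataclass, field


@dataclass
class MediaMetadata:
    title: str
    # platform -> (window_start, window_end), e.g. "vod": ("2024-01-01", "2024-12-31")
    rights_windows: dict = field(default_factory=dict)
    resolution: tuple = (1920, 1080)
    priority: int = 5  # lower value = sooner through the encode farm

    def cleared_for(self, platform, on_date):
        """True if rights for this platform cover the given date."""
        window = self.rights_windows.get(platform)
        return bool(window) and window[0] <= on_date <= window[1]


md = MediaMetadata(
    title="Episode 1",
    rights_windows={"vod": ("2024-01-01", "2024-12-31")},
)
md.cleared_for("vod", "2024-06-01")     # cleared
md.cleared_for("mobile", "2024-06-01")  # no rights window, not cleared
```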

It is, frankly, ridiculous to think that the media industry can think about multi-platform delivery in anything other than a single workflow environment. Conceptually, you are delivering your content to your audience. It is one concept, so how can it be anything other than one workflow environment?

There are many tools that exist to achieve this, from editors to transcoders. But the primary tool to ensure efficient automated media business process management is content intelligence, relying on the metadata. There is no need to compromise if you use the intelligence inherently encapsulated in the metadata and content.

How important is the ability to integrate tools from a range of vendors?

TT: Broadcast engineers have always chosen best-of-breed solutions: the right set of functionality and performance for a specific installation. Do we really think anyone wants to change that?

However, as we move into the IT-centric and increasingly the cloud era, we have to find ways to maintain and simplify that choice. One of the biggest challenges is scaling services up and down to cater for peaks and troughs in volumes, as well as introducing new technologies and services. At TMD we have designed, integrated and implemented a platform called UMS – unified media services – a simple approach to service-oriented architectures that enables broadcast and media organisations to cost-effectively integrate third-party technologies.

There is of course the FIMS standard as a good open foundation, but this does not answer all of the needs of the current broadcast customer. So UMS provides a service bus to support integrations, which includes FIMS, proprietary APIs and other methods, to decouple the technology from the operations, allowing users to choose best-of-breed hardware yet still operate it from automated, metadata-driven workflow orchestration.
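The decoupling described here follows a familiar adapter pattern, sketched below. The class and method names are invented for illustration; this is not the actual UMS or FIMS API. The workflow layer codes against one neutral interface, and per-vendor adapters (a FIMS-style one, a proprietary-API one) plug in behind a service bus.

```python
# Illustrative service bus: one neutral contract, many vendor adapters.
# Names are hypothetical, not a real UMS or FIMS interface.

class TranscodeService:
    """Neutral contract the workflow orchestration layer codes against."""
    def submit(self, asset_id, profile):
        raise NotImplementedError


class FimsStyleAdapter(TranscodeService):
    """Adapter for a FIMS-compliant transcode farm (simulated)."""
    def submit(self, asset_id, profile):
        return f"FIMS job: {asset_id} -> {profile}"


class VendorXAdapter(TranscodeService):
    """Adapter wrapping a proprietary vendor API (simulated)."""
    def submit(self, asset_id, profile):
        return f"vendorX://transcode?id={asset_id}&p={profile}"


class ServiceBus:
    """Routes jobs to whichever registered service is named."""
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def submit(self, name, asset_id, profile):
        return self._services[name].submit(asset_id, profile)


bus = ServiceBus()
bus.register("farm-a", FimsStyleAdapter())
bus.register("farm-b", VendorXAdapter())
bus.submit("farm-a", "EP-001", "h264-720p")
```

Swapping a vendor then means registering a different adapter; the orchestration layer, and the metadata driving it, never change.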

Is it best to adopt a single system or opt for a modular workflow?

TT: It is best to implement a system that fulfils the real commercial needs of the media company. In some cases that can be done in a one-stop shop solution. In most cases, I suspect, it will best be served by components from a number of top vendors, brought together under a metadata-driven environment. Either way, the question should never be “who do I buy this from?” but “what do I need to make money?”. It has to be looked at from the business perspective and not simply the technology preference of an engineering or IT department.

TMD's Tony Taylor
