TMD's Mediaflex Content Intelligence
The importance of metadata cannot be stressed too highly: in the modern, software-oriented media world it is the absolute core of the technology, the glue that makes everything work. Software-defined workflows need to know all about the content in order to automate processes and to make that content easier to find. Paul Wilkins, Director - Solutions & Marketing, provides TMD’s view.
The Broadcast Bridge: How do you see the value of metadata changing?
Paul Wilkins: The value changes because every single media enterprise will move to software-defined workflows, so will come to depend upon metadata. What then becomes critically important is that the metadata management layer – which is where the workflow orchestration resides in a well-designed system – is infinitely flexible to meet the unique needs of each media enterprise.
The metadata management layer also needs to become even more intelligent. That will include more automatically created metadata, more use of metadata to plan complex workflows, more ability to exchange metadata with other systems when transferring content, and more power in consumer user interfaces.
The Broadcast Bridge: What types of metadata does your product capture?
Paul Wilkins: The essence of a good asset management system is that its metadata schema is infinitely flexible. For a successful implementation, the asset management software should deliver a solid core metadata structure appropriate for the application and the type of media business, but it must also allow the user to adapt that structure, adding fields as required, so that it remains extensible enough to meet specific requirements.
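The idea of a solid core schema that users can extend with their own fields can be sketched in a few lines. This is a minimal illustration, not Mediaflex's actual API; the class and field names are assumptions for the example.

```python
# Minimal sketch of an extensible metadata schema: a fixed core of fields
# plus user-defined extensions, with validation against the combined set.
# Class and field names are illustrative, not Mediaflex's actual API.

class AssetSchema:
    def __init__(self, core_fields):
        # core fields shipped with the system
        self.fields = dict(core_fields)

    def add_field(self, name, field_type, required=False):
        """Extend the schema with a site-specific field."""
        if name in self.fields:
            raise ValueError(f"field {name!r} already exists")
        self.fields[name] = {"type": field_type, "required": required}

    def validate(self, record):
        """Check a metadata record against the current schema."""
        errors = []
        for name, spec in self.fields.items():
            if spec["required"] and name not in record:
                errors.append(f"missing required field: {name}")
            elif name in record and not isinstance(record[name], spec["type"]):
                errors.append(f"wrong type for field: {name}")
        return errors

schema = AssetSchema({
    "title": {"type": str, "required": True},
    "duration_s": {"type": int, "required": True},
})
schema.add_field("genre", str)  # a user-added extension field
print(schema.validate({"title": "News", "duration_s": 1800}))  # []
```

The point of the design is that extensions go through the same validation path as core fields, so user additions stay as rigorous as the delivered structure.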
There are, of course, many types of metadata, and no two organisations will need exactly the same structure. But we can divide metadata into two broad categories: technical metadata, which aids automation; and descriptive metadata, which aids discovery.
One of the key issues we have found in the past, particularly during data take-on from legacy systems, is that there is sometimes confusion between the two. To quote an obviously disastrous example, one major broadcaster had made the number of audio channels a descriptive text field rather than a prescribed technical one. That resulted in hundreds of different descriptions for a stereo soundtrack, a nightmare to translate into a modern, rigorous database structure.
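A data take-on step for a case like this typically maps the legacy free-text values onto a controlled technical value. A hypothetical sketch, with made-up legacy spellings and illustrative rules:

```python
# Hypothetical data take-on step: map legacy free-text audio descriptions
# to a prescribed technical value (channel count). Rules are illustrative.
import re

# a few of the "hundreds" of legacy spellings for a stereo soundtrack
LEGACY_AUDIO = ["Stereo", "stereo pair", "2 ch stereo", "L/R stereo", "STEREO (2)"]

def normalise_audio(description):
    """Return a channel count for a legacy description, or None for review."""
    text = description.lower()
    if "stereo" in text or re.search(r"\b2\s*ch", text):
        return 2
    if "mono" in text or re.search(r"\b1\s*ch", text):
        return 1
    if "5.1" in text or "surround" in text:
        return 6
    return None  # unrecognised: flag for manual review

print([normalise_audio(d) for d in LEGACY_AUDIO])  # [2, 2, 2, 2, 2]
```

Anything the rules cannot resolve is flagged rather than guessed, which is usually the safer choice when migrating into a rigorous schema.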
The Broadcast Bridge: Is there an example of how metadata is incorporated into a typical workflow?
Paul Wilkins: We recently implemented a system at Astro, the leading satellite service in south-east Asia, based in Malaysia. The country has strict codes of practice on decency, and all content, immediately after ingest, goes to compliance to be edited as required.
But Astro wanted to give channel controllers and genre commissioners the ability to set their own rules around compliance editing. These rules are carried in the content metadata and allow multiple versions of a programme to be created: for example, a primetime version, a late-night version and an online version.
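Metadata-carried rules driving version creation might look something like the following sketch. The rule names, fields and ratings are invented for illustration; they are not Astro's actual configuration.

```python
# Sketch of per-version compliance rules carried as metadata, used to
# derive one edit job per target version. All names are illustrative,
# not Astro's actual rule set.

COMPLIANCE_RULES = {
    "primetime": {"max_rating": "PG13", "blur_required": True},
    "late_night": {"max_rating": "18", "blur_required": False},
    "online": {"max_rating": "15", "blur_required": True},
}

def plan_versions(asset, rules=COMPLIANCE_RULES):
    """Return one compliance-edit job per target version."""
    jobs = []
    for version, rule in rules.items():
        jobs.append({
            "asset_id": asset["id"],
            "version": version,
            "edit_to_rating": rule["max_rating"],
            "apply_blur": rule["blur_required"],
        })
    return jobs

jobs = plan_versions({"id": "PRG-001", "rating": "18"})
print(len(jobs))  # 3
```

Because the rules live in metadata rather than in code, a channel controller can change them without touching the workflow engine, which is the flexibility described above.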
Looking at technical metadata, TMD recently provided a new system for Swedish broadcaster MTG, for its London playout centre. The TMD Mediaflex platform provided workflow orchestration as well as asset management. One of the key functions this enabled was that, while MTG’s playout centres in London and Riga are generally responsible for different channels and markets, each can provide a comprehensive disaster recovery service for the other. So alongside interfacing with as many as 10 sub-systems from different vendors in London, the asset management platform also had to use metadata to drive the backup of playlists, EDLs and promotional data as well as content to and from Riga.
The Broadcast Bridge: What’s next? How might metadata be used in the future as a business driver?
Paul Wilkins: I said earlier that there are two sorts of metadata: technical for automation and descriptive for discovery. There are already plenty of examples of technical metadata being used to seamlessly link content from creation to consumption, setting the optimum format at each stage of the process.
The next stage will be an extension of descriptive metadata. More information will be gathered and created, some of it perhaps automatically. Intelligent systems might “listen” to the script and “view” the content to build a comprehensive description and audience rating of a programme.
That rich metadata could then be exposed to consumers to help them identify their sort of content. Ultimately this, too, could be automated. Intelligent set-top boxes and online clients could build a profile of the user, learning what sort of content they like. Today it might be possible for a consumer to tell a content delivery service “send me a Vin Diesel movie”: in the not too distant future it should be possible for that consumer simply to say “show me a movie I will enjoy”. When that happens, it will be metadata that makes it happen.