AI is key to the metadata that empowers Avid’s cloud-based collaboration.
Avid has forged a path toward the implementation of artificial intelligence in post-production, and its use of AI is just beginning.
Artificial Intelligence (AI) is making major inroads into post-production, and nowhere more significantly than in Avid’s editing systems.
When speaking with David Colantuoni, VP of Product Management at Avid, I started by asking how he would define AI.
“In the Media and Entertainment context, we look at AI as using machine learning to reach out to massive amounts of data and streamline it so it becomes practical to use in productions,” he began. “In addition, it is used to relieve humans of repetitive procedures by letting computers supervise automated tasks.”
And Avid has been at the forefront of putting these concepts to good use.
“We’ve had AI functionality in our systems for many years,” Colantuoni said, “with our ScriptSync and PhraseFind features that let an editor search for spoken words in a phonetically indexed database. We were using AI before the expression came into general usage.”
PhraseFind allows editors to search their Media Composer project phonetically – that is, by the sounds of the words.
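PhraseFind’s actual indexing is proprietary, but the idea of matching words by sound rather than spelling can be sketched with a classic phonetic code such as Soundex. The algorithm choice, clip names, and transcripts below are illustrative assumptions, not Avid’s implementation:

```python
def soundex(word: str) -> str:
    """Encode a word as a Soundex code: first letter plus three digits."""
    groups = ["bfpv", "cgjkqsxz", "dt", "l", "mn", "r"]
    codes = {c: str(d) for d, cs in enumerate(groups, start=1) for c in cs}
    word = word.lower()
    result, prev = word[0].upper(), codes.get(word[0])
    for ch in word[1:]:
        digit = codes.get(ch)
        if digit and digit != prev:
            result += digit
        if ch not in "hw":          # h and w do not break adjacency
            prev = digit
    return (result + "000")[:4]

# Hypothetical clip transcripts, indexed word by word.
transcripts = {
    "clip_001": "ask mister smith about the contract",
    "clip_002": "the weather was fine all week",
}

def phrase_find(query: str) -> list:
    """Return clips containing a word that sounds like the query."""
    target = soundex(query)
    return [clip for clip, text in transcripts.items()
            if any(soundex(w) == target for w in text.split())]

print(phrase_find("Smyth"))   # "Smyth" and "smith" share code S530 -> ['clip_001']
```

Because the search key is a sound code rather than a spelling, an editor can find a name without knowing how it was transcribed.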
In fact, just to jog people’s memories, ScriptSync, the ability to reference raw footage to its position in a written script, was first introduced in 2007.
It was joined by the ability to search for the appearance of specific words with PhraseFind in 2011.
Although well received, both were discontinued in 2014 due to licensing machinations behind the scenes.
But ideas as downright useful as those two could not stay down forever, so ScriptSync and PhraseFind reappeared in 2017, each upgraded to version 2.0 and each with an upgraded license fee, offered either separately or bundled together.
“Now we are more and more getting into ways to leverage the computing power of the cloud by using Microsoft Azure,” Colantuoni told me. “That is giving us the ability to help people responsible for logging massive amounts of source video for productions such as reality shows, based on several criteria.”
Using video uploaded from MediaCentral to the Azure cloud, the system can process the visual information and turn it into metadata. You can then search that metadata for specific words, faces, or even objects.
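As a rough sketch of that search step, assume the cloud analysis has already produced per-clip metadata. The record layout and field names here are hypothetical, not the actual MediaCentral or Azure schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ClipMetadata:
    """Simplified stand-in for the metadata a cloud indexer might return."""
    clip_id: str
    spoken_words: List[str] = field(default_factory=list)
    faces: List[str] = field(default_factory=list)
    objects: List[str] = field(default_factory=list)

def find_clips(index, *, word=None, face=None, obj=None):
    """Return clip IDs whose metadata matches every given criterion."""
    hits = []
    for md in index:
        if word and word.lower() not in (w.lower() for w in md.spoken_words):
            continue
        if face and face not in md.faces:
            continue
        if obj and obj.lower() not in (o.lower() for o in md.objects):
            continue
        hits.append(md.clip_id)
    return hits

index = [
    ClipMetadata("A012", spoken_words=["action"], objects=["red car", "tree"]),
    ClipMetadata("A013", faces=["Host"], objects=["blue car"]),
]
print(find_clips(index, obj="red car"))   # -> ['A012']
```

The point is that once the analysis lives in a structured index, a multi-criteria query is a simple filter, however large the source library.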
“Say there is a specific red car you need to use in a scene. You can have the system find all instances of that red car by searching the metadata identifying it,” he said. “But it doesn’t end there.”
Avid has extended its metadata information into quality assurance. “For example, if PhraseFind has identified some words that should be placed in a certain scene,” Colantuoni described, “the operator can access the linked metadata to verify that the associated closed captions are appearing in the proper shots.”
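That verification step amounts to checking that a caption carrying the expected words overlaps the shot’s time range. A minimal sketch, with made-up timecodes in seconds rather than the real linked metadata:

```python
def caption_appears_in_shot(captions, shot_start, shot_end, phrase):
    """True if some caption containing the phrase overlaps the shot's range.

    captions: list of (start, end, text) tuples, times in seconds.
    """
    return any(
        phrase.lower() in text.lower()
        and start < shot_end and end > shot_start   # time ranges overlap
        for start, end, text in captions
    )

captions = [
    (0.0, 2.5, "Welcome back to the show."),
    (2.5, 5.0, "Here comes the red car."),
]

# The shot from 3s to 6s should carry the "red car" caption.
print(caption_appears_in_shot(captions, 3.0, 6.0, "red car"))   # True
print(caption_appears_in_shot(captions, 0.0, 2.0, "red car"))   # False
```

A QA pass would run a check like this for every shot that is supposed to carry a given line, flagging the mismatches for an operator to review.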
All of this depends on MediaCentral interfacing with a database; it is the metadata component that enables this kind of multi-level search.
“Metadata is going to be providing the foundation of ever more elaborate search capabilities as we learn to leverage AI more extensively in post-production,” Colantuoni finished up. “But, of course, it is always going to require human intelligence to give the final result meaning.”