Mega Metadata Experiment May Affect the Future of Hollywood Production

Since the dawn of digital production, many links in the workflow chain have produced their own metadata. Now the “Production in the Cloud” project is making all that metadata useful.

Over the past year, the Entertainment Technology Center (ETC) at USC, in collaboration with Disney ABC Television Group, has been engaged in a massive research undertaking called the “Production in the Cloud” project, which used a USC student film, “The Suitcase,” as its template.

The implications of this experiment for the future of Hollywood production are so significant that funding was contributed by Universal Studios, Technicolor, Google, Arri, Wipro, and Equinix, with a dozen more companies donating software and hardware.

The project’s purpose was to realize the long-hoped-for promise of production metadata: could all facets of a complex film shoot not only talk with each other during the production and post-production workflow, but also surface all the metadata created along the way so it could be fed into subsequent applications?

According to Erik Weaver, the program manager of the “Production in the Cloud” project at ETC, four major goals were specified at the outset:

  1. Create a “Next Generation Cloud-Based Workflow” by ingesting metadata directly into the cloud.
  2. Feed this information into the “Cinema Content Creation Cloud” (C4) framework, using its “C4 ID system” for accurate identification and tracking.
  3. Evaluate methods for establishing reliable interoperability between data sources, comparing unified metadata storage vs. data silo integration via framework. While doing this, identify metadata inputs and outputs for key production software and hardware.
  4. Maintain High Dynamic Range (HDR) video throughout the workflow and report on production considerations for filming with HDR, including departmental training, workflow, color grading and mastering.

The production flow chart for the "Production in the Cloud" project

In addition, Weaver told The Broadcast Bridge, they decided to test the feasibility of having a 360-degree camera on the set constantly streaming virtual reality back to the studio offices “In case an executive wanted to pop in on what was going on,” as he put it.

“To keep the budget from expanding out of hand, we decided to use a student film from the USC School of Cinematic Arts as our working model,” Weaver tells us. “In this case, it was a 21-minute graduate thesis short written and directed by Abi Corbin called ‘The Suitcase.’”

Although this cinematic gem cannot be released to the public since it’s headed for the festival circuit, I was granted a private screening. In “The Suitcase,” Ms. Corbin has woven a clever story derived from the murky facts surrounding a piece of luggage that was actually left behind at Boston’s Logan Airport by Mohamed Atta, one of the 9/11 hijackers.

To integrate all the metadata generated during production, ETC turned to MarkLogic, maker of an enterprise-level NoSQL (non-relational) database. MarkLogic’s technology has earned high marks by providing the servers now running the entire Affordable Care Act database after the disastrous launch of Obamacare.

“We learned a whole bunch of lessons from running the ‘Production in the Cloud’ project,” Weaver said. “All four of our original goals were accomplished, and we will be producing a white paper about the results in the near future. We demonstrated interoperability between multiple systems, and we learned how to multiplex the bandwidth from the on-set recording cart so we could upload the same copy of a file to our ETC offices that we sent to Technicolor, each with their own proper C4 ID to keep track of them. Our metadata wrangling, thanks to a lot of work from everyone on our team, proved very successful.”

Every sophisticated system used in today’s production and post production chain generates its own metadata, but as Weaver explains to us, the ‘Production in the Cloud’ project’s purpose was to create a Rosetta Stone to let them all talk to each other.


Metadata path in the "Production in the Cloud" project

“We’re not trying to replace the metadata generated by each individual system,” Weaver said. “Each one can keep its proprietary standard if it wants. We are just adding a layer on top of it that translates everything into a universal identification system that is complementary to what they are already doing.”
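The key to such a universal layer is that the identifier is derived from the content itself rather than assigned by any one system. The published C4 framework specifies SHA-512 hashing with a base-58 encoding; the sketch below illustrates that content-addressing idea in Python, not the exact C4 specification (the alphabet and padding here are illustrative).

```python
import hashlib

# Bitcoin-style base-58 alphabet; the real C4 spec defines its own encoding details.
ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def content_id(data: bytes, prefix: str = "c4") -> str:
    """Derive an identifier from content alone, so the same bytes always
    yield the same ID no matter which system or site stores the file."""
    digest = int.from_bytes(hashlib.sha512(data).digest(), "big")
    encoded = ""
    while digest:
        digest, rem = divmod(digest, 58)
        encoded = ALPHABET[rem] + encoded
    return prefix + encoded

# Identical content gets an identical ID; any change yields a different one.
a = content_id(b"take_01.mov bytes")
b = content_id(b"take_01.mov bytes")
c = content_id(b"take_02.mov bytes")
assert a == b and a != c
```

Because the ID depends only on the bytes, every vendor in the chain can compute it independently and still agree on which asset is which, without abandoning its own internal metadata.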

The CTO of Disney ABC, Skarpi Hedinsson, told The Broadcast Bridge that as a result of the project, they could clearly see the benefit that structured, descriptive metadata and semantics bring to digital applications, helping to streamline production processes.

"The ‘Production in the Cloud’ project was an opportunity for Disney ABC Television group working with ETC to explore how sophisticated metadata processes that are common in digital workflows can be used to innovate traditional television and film production,” Hedinsson said.

“We used ‘The Suitcase’ to identify and collect an authoritative set of characters, locations, and storylines, leveraged a video annotation platform we built internally and indexed the authoritative metadata to the film’s time code,” he said. “So we had metadata in every scene describing who was on screen, what locations were featured in the scene, and what storylines were components of the scene.”
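Disney ABC’s internal annotation platform is not public, but the structure Hedinsson describes can be sketched simply: annotations keyed to timecode spans, each listing who and what is on screen. The field names below are hypothetical illustrations, not the project’s actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class SceneAnnotation:
    tc_in: str                                    # SMPTE-style timecode in (HH:MM:SS:FF)
    tc_out: str                                   # timecode out
    characters: list = field(default_factory=list)
    locations: list = field(default_factory=list)
    storylines: list = field(default_factory=list)

def annotations_at(annotations, timecode):
    """Return every annotation whose span covers the given timecode.
    Fixed-width SMPTE strings compare correctly as plain strings."""
    return [a for a in annotations if a.tc_in <= timecode <= a.tc_out]

scenes = [
    SceneAnnotation("00:01:10:00", "00:03:22:12",
                    characters=["Protagonist"],
                    locations=["Logan Airport"],
                    storylines=["the abandoned suitcase"]),
]
hits = annotations_at(scenes, "00:02:00:00")
assert hits and "Logan Airport" in hits[0].locations
```

Indexing the authoritative metadata to timecode this way is what lets any frame of the film answer the questions “who is on screen, where are we, and which storyline is this?”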

All of this technology, scooping up many layers of information throughout the filmmaking workflow, will deepen understanding of the production process, but only if it doesn’t interfere with the creative input of the people who craft the final result.

Chris Witt was the freelance editor who cut “The Suitcase,” and he’s glad to report that the gathering and tracking of all that metadata did not interfere with the creative process on his Avid Media Composer, running software version 8.4.5.

“It didn’t hamper me at all,” Witt said. “I could see the process working in the background, but it was invisible to me.”

However, that mountain of metadata came to his rescue in the middle of one hectic Saturday night as he was rushing to meet the deadline for an online color grading session.

“We had one vfx shot that was a number of frames too short, even counting the handles,” he tells us. “We desperately needed to get a new plate over to the vfx team at Technicolor so they could create a new vfx proxy for me to cut into my timeline and a high-res version for the finishing bay. Having instant access to all that metadata let us call up all the elements and fix the shot without having to delay the grading session.”
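The kind of lookup that rescued that Saturday night can be sketched as a query over indexed element records: given a clip and the frame range a re-delivered plate must cover, find a camera original with frames to spare. The clip names, roles, and frame numbers below are hypothetical, not drawn from the project’s actual records.

```python
# Hypothetical element records; in the project these lived in the metadata database.
elements = [
    {"clip": "A012_C034", "role": "camera_original", "first": 86400, "last": 86640},
    {"clip": "A012_C034", "role": "vfx_proxy",       "first": 86424, "last": 86520},
]

def find_source(elements, clip, first_needed, last_needed):
    """Find camera originals that fully cover the frames a new plate needs."""
    return [e for e in elements
            if e["clip"] == clip
            and e["role"] == "camera_original"
            and e["first"] <= first_needed
            and e["last"] >= last_needed]

# The proxy came up short, but the metadata shows the original has frames to spare.
assert find_source(elements, "A012_C034", 86410, 86540)
```

Without indexed metadata, answering the same question means pulling camera reports and eyeballing bins; with it, the answer comes back in the time it takes to run the query.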

As the new world of all-digital film production is teaching us, raw data is useless unless it is put into context as metadata. Now, as the ‘Production in the Cloud’ project has demonstrated, once that metadata can be identified, indexed, and recalled, it becomes a reference source that can help the lessons of one production meet the challenges of the next.
