AWS, BBC, Adobe, And Others Introduce Open Source Framework For Fast-turnaround Media Workflows

At the 2024 International Broadcasting Convention (IBC), Amazon Web Services (AWS) will unveil the Cloud Native Agile Production (CNAP) project, an open-source initiative aimed at accelerating the creation and delivery of media during live production.

Developed in collaboration with the BBC’s research and development (R&D) team, Sky, and AWS Partners including Adobe and CuttingRoom, CNAP debuts as more news, sports, and entertainment productions look to develop ancillary content for live programming. CNAP’s interoperable framework provides a single virtual store for live content, making it easier to migrate production to the cloud. Creators can then work with their preferred toolsets to rapidly create video highlight clips and packages for integration into live content, social sharing, and other applications.

AWS will spotlight an end-to-end CNAP workflow at IBC on the AWS Stand. The demonstration includes partner technologies from Adobe, CuttingRoom, Drastic Technologies, Techex, and Vizrt, which collectively support the essential demands of fast-turnaround media workflows, spanning low-latency video ingest through to playout, file import/export, and web-based and craft editing.

Cloud Native Agile Production
For the project, BBC R&D provided its open-source Time Addressable Media Store (TAMS) specification, which is publicly available on GitHub and defines how segmented media is stored, queried, and accessed over HTTP. AWS then built on top of this open, API-driven architecture to ensure AWS customers and partners could leverage the CNAP framework to deploy custom TAMS implementations in AWS.
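
To make the HTTP-based access pattern concrete, here is a minimal Python sketch of how a client might query a TAMS-style store. The base URL, endpoint paths, and the timerange parameter are assumptions for illustration only; the published TAMS specification on GitHub remains the authoritative reference.

```python
# Illustrative only: querying a TAMS-style store over HTTP.
# Endpoint paths, field names, and the timerange format are assumptions,
# not taken from the published specification.
import requests

TAMS_BASE_URL = "https://tams.example.com"  # hypothetical deployment


def list_flows():
    """Return the flows (timelines of segmented media) known to the store."""
    resp = requests.get(f"{TAMS_BASE_URL}/flows", timeout=10)
    resp.raise_for_status()
    return resp.json()


def get_segments(flow_id: str, timerange: str):
    """Fetch segment records for one flow within a time range.

    Each record is expected to reference an HTTP-accessible media object
    (for example an S3 object) plus its position on the flow timeline.
    """
    resp = requests.get(
        f"{TAMS_BASE_URL}/flows/{flow_id}/segments",
        params={"timerange": timerange},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    for flow in list_flows():
        print(flow.get("id"), flow.get("label"))
```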

Presenting a cost-efficient alternative to the lift-and-shift, file-centric workflows deployed today, CNAP limits time spent processing content. It supports a serverless, chunked media store approach with Amazon Simple Storage Service (Amazon S3) providing the underlying storage layer. Amazon S3 enables a ‘store once, use many’ approach to repurposing media, so simple edits can be expressed as a metadata ‘publish’ rather than a new asset or exported file. This strategy reduces storage duplication, processing time, and the storage footprint required for the same workload.
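
The ‘store once, use many’ idea can be sketched in a few lines: an edit is published as metadata that references time ranges of material already in the store, rather than rendering a new file. The data model below is a deliberate simplification for illustration and is not the TAMS schema itself.

```python
# Illustrative only: an edit expressed as a metadata 'publish' that
# references existing material, with no duplication of media essence.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ClipReference:
    """Points at a span of an existing flow instead of copying its media."""
    source_flow_id: str
    start: str   # e.g. "0:00:12.000" on the source flow's timeline
    end: str     # e.g. "0:00:27.500"


@dataclass
class PublishedEdit:
    """An ordered list of references; the underlying chunks are reused."""
    label: str
    clips: List[ClipReference] = field(default_factory=list)


highlights = PublishedEdit(
    label="First-half highlights",
    clips=[
        ClipReference("flow-cam1", "0:00:12.000", "0:00:27.500"),
        ClipReference("flow-cam2", "0:05:03.000", "0:05:10.000"),
    ],
)

# Publishing the edit means writing this metadata back to the store;
# every downstream consumer reuses the same stored objects as-is.
print(highlights)
```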

Cloud Native Agile Production Proof-of-concept Workflow On Display At IBC
The AWS CNAP demo at IBC provides an end-to-end workflow from ingest, through content editing, to gallery playback via a vision mixer. The tx darwin live media processing platform from Techex ingests feeds from the nearby Newsroom in the Cloud demo, including a live camera on the stand. Once registered in the open-source implementation of the TAMS store, the chunked media is stored on Amazon S3, and a React web application hosted on AWS Amplify visualizes the content.
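
A hedged sketch of the ingest step described above might look like the following: write one media chunk to Amazon S3, then register its location and timeline position with a TAMS-style store. The bucket name, key layout, endpoint path, and registration payload are hypothetical; the demo’s actual implementation may differ.

```python
# Illustrative only: ingesting one chunk into S3 and registering it with
# a TAMS-style store. Names, paths, and payload fields are assumptions.
import boto3
import requests

BUCKET = "cnap-demo-media"                  # hypothetical bucket
TAMS_BASE_URL = "https://tams.example.com"  # hypothetical store endpoint

s3 = boto3.client("s3")


def ingest_chunk(flow_id: str, chunk_path: str, timerange: str) -> None:
    """Store a media chunk once in S3, then register it on a flow timeline."""
    key = f"{flow_id}/{timerange}.ts"

    # The chunk is written once; consumers reference it rather than copy it.
    with open(chunk_path, "rb") as media:
        s3.put_object(Bucket=BUCKET, Key=key, Body=media)

    # Tell the store where the chunk lives and where it sits in time.
    resp = requests.post(
        f"{TAMS_BASE_URL}/flows/{flow_id}/segments",
        json={"object_id": f"s3://{BUCKET}/{key}", "timerange": timerange},
        timeout=10,
    )
    resp.raise_for_status()
```

Keeping registration separate from storage mirrors the ‘store once, use many’ pattern: the same S3 object can later appear in many published edits without being copied.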

All content in the TAMS store is accessible to view and edit via the CuttingRoom video editing platform’s web user interface. A Drastic Technologies plugin then enables content editing on a virtualized Adobe Premiere Pro workstation. Using a Vizrt Vectar Vision Mixer, content from the store can be played back and mixed with live feeds. To showcase the different ways that TAMS store content can be distributed, the demo also streams content from the store to a mock news site.

A Look At Cloud Native Agile Production’s Beginnings And Its Future Roadmap
CNAP originated from an alignment between the BBC and AWS around the BBC’s TAMS chunked media store concept. As the two organizations collaborated, they realized that the TAMS store framework could solve a larger industry challenge and set out to build CNAP with the help of multiple AWS teams. The goal was to build an open-source implementation for deploying the TAMS store in AWS, enlist partners to join the effort in April 2024, and demonstrate an end-to-end workflow at IBC.

Prior to IBC, AWS and BBC R&D held a special event to brief partners on the project and explore what an end-to-end IBC demo might look like. They encouraged participants to experiment with the API during a hackathon, which resulted in new capabilities such as the ability to import existing media from TAMS into the CuttingRoom web-based editing tool.

CNAP is still in early development, and the team behind it is exploring how they might apply live AI analysis to the workflow in the future, in addition to other improvements.
