VSF, SMPTE, EBU Publish Minimum Viable System Requirements Report

The Video Services Forum (VSF), the Society of Motion Picture and Television Engineers (SMPTE), and the European Broadcasting Union (EBU) today approved the publication of the Minimum Viable System Requirements report. This report details the minimum requirements for a live, multi-camera studio production system using packetized network technology.

The operational scenario addressed in the Minimum Viable System (MVS) report is the transport of live media within the broadcast plant to support a multi-camera, live studio production; specifically, a live, multi-camera sports halftime show. Thomas Edwards of Fox Network Operations and Engineering said, “We chose the live sports scenario because we believe it will be one of the most challenging areas for professional Video over IP. If we get this right, we believe other scenarios will also be achievable.”

Among the key requirements listed in the MVS are the requirement to carry video payloads of any resolution up to the size of UHDTV2 (7680 x 4320), and the requirement to carry elementary essence types (e.g. video, audio, ancillary data) as separate flows. However, the MVS also recognizes the requirement to support SDI, stating that the solution should be capable of providing “transparent transport” of SDI payload bit streams over the network.

The report, which is freely available to the public, represents a concentrated effort on the part of manufacturers, users and service providers to move the industry closer to the day when IT technology is at the core of professional media facilities. Participants from all over the world met several times, both in the United States and in Europe, to discuss the requirements for the Minimum Viable System.

Chuck Meyer, CTO of Grass Valley, said, “The MVS report points the way to what will become a very important infrastructure shift for media facilities in the future.”
