The Sponsor's Perspective: Unchaining Time

What is real time? While that question doesn’t normally come up at the dinner table, asking it of a group of broadcast engineers can draw out all kinds of responses, from philosophical debates around global atomic clocks to technical dissertations on lines, frames, and permissible nanoseconds of processing delay.


This article was first published as part of Essential Guide: Delivering Timing For Live Cloud Productions.

One of the reasons there are so many opinions on the topic is that time is a human construct we use for sequencing events. Real time describes a human sense of time that seems immediate.

The perception of real time – what is happening in a specific moment – is heavily influenced by what is happening in a person’s environment when they perceive it. Therefore, the definition of what is real time can vary by individual.

Before we start getting all metaphysical, let’s narrow the discussion. In live media production, when we talk about working in real time, we are really asking two separate questions.

  1. Is there a noticeable difference between when I perceive something happening and when I can act on it? This is relative latency. For a system to feel live, the operator must see the result of their action within about 240 milliseconds of seeing the cue.
  2. Measured against the wall clock, how many seconds does it take to sequence the different processing steps applied to a frame of video before it is pushed to the viewing audience? This is absolute latency. The expectation for absolute latency varies widely by producer but is usually less than 30 seconds.
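The two budgets can be sketched in a few lines. The 240 ms and 30 s figures come from the text above; the function names are illustrative, not part of any real product API:

```python
# A minimal sketch of the two latency budgets, assuming the figures
# quoted in the text.

RELATIVE_BUDGET_MS = 240   # operator sees cue -> operator sees result
ABSOLUTE_BUDGET_S = 30     # frame enters the system -> pushed to the audience

def feels_live(cue_seen_ms: float, result_seen_ms: float) -> bool:
    """Relative latency: does the response arrive fast enough to feel live?"""
    return (result_seen_ms - cue_seen_ms) <= RELATIVE_BUDGET_MS

def within_broadcast_window(step_durations_s: list[float]) -> bool:
    """Absolute latency: do all sequenced processing steps fit the budget?"""
    return sum(step_durations_s) <= ABSOLUTE_BUDGET_S
```

With these definitions, a 180 ms round trip feels live to the operator, while a chain of processing steps totalling 12 s sits comfortably inside a 30 s emission window.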

The reason to break this into two separate questions is that, if all the processing steps involving relative latency can be properly sequenced within the expected absolute latency, it doesn’t matter how many there are or when they occur. The system operators will take their actions in what feels like real time, and the audience will have a live viewing experience.

New technology can align contributions from multiple contributors.

To see how this works, let’s look at AMPP, Grass Valley’s Agile Media Processing Platform. In AMPP, every video frame is timestamped as it enters the system. Because transport times vary as frames travel across networks to different members of the production team, AMPP also tracks the local time of each operator. This allows creative decisions made by the operator, and their associated processing time, to be tracked relative to the operator’s time. The result of the operator’s work is timestamped with whatever offset best synchronizes that work across the production chain.
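The mechanism can be sketched as follows. `Frame`, `operator_offset`, and `restamp_decision` are hypothetical stand-ins for illustration only, not AMPP's actual API:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    source_ts: float   # timestamp applied as the frame enters the system
    payload: bytes

def operator_offset(operator_clock_s: float, ingest_clock_s: float) -> float:
    """Transport delay plus clock skew between an operator and the ingest point."""
    return operator_clock_s - ingest_clock_s

def restamp_decision(frame: Frame, offset_s: float) -> float:
    """Re-express an operator's action on the ingest timeline so the
    production chain can sequence it against everyone else's work."""
    return frame.source_ts + offset_s
```

An operator whose local clock runs 5 s behind the ingest point simply has their decisions re-stamped 5 s later on the shared timeline; the ordering of their work is preserved.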

With AMPP managing these timing offsets, the operator experiences the phase-aligned environment they are used to. The order and local timing of the decisions are maintained. When all operator actions are sequenced, the total environment is time-shifted relative to the source and thus maintains the program’s continuity.

Following this design strategy, any live production task can be carried out in what feels like real time and assembled in a linear fashion to create programming that exceeds audience expectations. Even with complicated production tasks, total execution time is a few seconds. Compare this with today’s traditional live broadcasts which, in the best of circumstances, still take as long as 50 seconds to reach final emission delivery in the home.

Unchaining individual operator workstations from external time is possible because AMPP operates faster than real time using technologies that did not exist when traditional frames per second timing was implemented. Frame syncs that were once used to introduce a few frames of delay are replaced by memory buffers which can hold the frames until they are needed for the sequence.
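A memory buffer of this kind can be modeled in a few lines. This is an illustrative sketch of the idea, not AMPP's implementation:

```python
from collections import deque

class FrameBuffer:
    """Stand-in for a hardware frame sync: frames are held in memory,
    keyed by timestamp, and released only when the sequence needs them."""

    def __init__(self) -> None:
        self._frames: deque[tuple[float, bytes]] = deque()

    def push(self, timestamp: float, frame: bytes) -> None:
        """Frames arrive in timestamp order from the ingest stage."""
        self._frames.append((timestamp, frame))

    def release_due(self, playout_time: float) -> list[tuple[float, bytes]]:
        """Release every buffered frame whose timestamp has come due."""
        due = []
        while self._frames and self._frames[0][0] <= playout_time:
            due.append(self._frames.popleft())
        return due
```

Unlike a fixed frame sync, the depth of such a buffer is just a matter of how long frames sit in memory before their timestamps come due.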

AMPP’s internal frame management allows unique offsets for each operator by adjusting the buffer depth to match the timing offset required for each essence. Alternatively, AMPP can force groups of operators to be synchronized when that timing is critical to their workflow. In either case, the operator perceives the system as responding in real time.
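The choice between per-operator and group synchronization comes down to how deep each buffer is. As a sketch, with the frame rate and per-operator offsets below invented purely for illustration:

```python
import math

def depth_for_offset(offset_ms: float, frame_rate_fps: float = 50.0) -> int:
    """Buffer depth (in frames) needed to absorb a given timing offset."""
    frame_period_ms = 1000.0 / frame_rate_fps
    return math.ceil(offset_ms / frame_period_ms)

# Unique offsets per operator: each buffer is only as deep as that
# operator's path requires.
offsets_ms = {"switcher": 40.0, "graphics": 120.0, "audio": 80.0}
depths = {op: depth_for_offset(ms) for op, ms in offsets_ms.items()}

# Forced group synchronization: everyone shares the deepest buffer so
# their outputs stay phase-aligned.
group_depth = max(depths.values())
```

At 50 fps a frame lasts 20 ms, so a 120 ms offset needs a six-frame buffer; synchronizing the group means everyone runs at that depth.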

Dennis Breckenridge, CEO of Elevate Broadcast Pte Ltd, described their experience with AMPP this way:

“With our virtual product we went whole hog. We had no backup plan. We counted on AMPP fully to work and we pushed the boundaries.

“We had contribution from many different countries: Australia, Singapore, the Philippines, Indonesia, and Thailand. Our producer was in Singapore. The director and TD with the switcher were side by side in Sydney, Australia. The main cameras were all in green screen studios with virtual sets but we also had live Zoom feeds and other complications.

“We told the production team: ‘You can’t come to Singapore because of the pandemic. You can stay there and we’re still gonna make everything that you’re used to: Karrera panel, multiviews, comms… All these type of things we’re gonna make magically work for you and you’ll produce a major broadcast for Asia!’ It took a little time to build their confidence and acceptance of that possibility.

Chuck Meyer (left) and Chris Merrill (right).

“Once all the comms and everything came together, the concerns from the production team went away. We managed all the delays through the system. Once that happened, they forgot about the technology and they just moved on with their production. That was the end of it. They felt like they were just in two different control spaces within the same facility. They didn’t think about the fact that they were on different continents.”

AMPP manages both relative and absolute latency in a way that makes the difference invisible to the operator and audience, erasing the barriers that were previously very apparent in remote production.
