SMPTE conducted a webcast to review the history of fractional frame rates and drop-frame timecode.
It pays to have a handle on your industry’s history. Few who write modern video software today know that over $70 billion a year in global broadcasting revenue rests on a non-standards-based engineering decision made in 1953, when black-and-white television transitioned to color.
This fascinating decision and its more than 60-year aftermath still cause confusion and frustration for modern engineers. It was the subject of a recent SMPTE webcast on fractional frame rates and drop-frame timecode presented by John Pallett, director of product marketing at Telestream, and Bruce Devlin, media technologist for Dalet.
Pallett called it “a misunderstood topic,” while Devlin walked the audience through a minefield of problems that have occurred over the years due to mixed frame rates and timecode anomalies.
John Pallett, director of product marketing at Telestream.
“The way we got here is fascinating,” said Pallett. “Black and white television originally ran at 30 frames per second. It was structured in a way that the luminance — the video — was in an AM channel and the audio was in an FM channel and there was a fixed frequency distance between those two carriers of 4.5 MHz.
“When color became part of the equation, a third band of color information was needed,” he continued. “When the question came up of how to add that color carrier frequency, the challenge was to do it without causing artifacts in either the existing video or the audio. It could not break the backward compatibility of black and white television.
“The engineers realized that the color carrier frequency needed to be an odd harmonic of half the line frequency. So they did some math. They found the best way of solving this problem was to slightly change the horizontal line rate of the luminance. So the frame rate was changed from 30 frames per second to 29.97. What that allowed was the insertion of a color carrier band without causing any distortions to the video and audio.”
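That arithmetic can be checked directly. The sketch below uses only NTSC’s published structure: the sound carrier sits at the 286th harmonic of the new line rate, a frame is 525 lines, and the color subcarrier is the 455th (odd) harmonic of half the line rate. The variable names are ours, for illustration.

```python
# Checking the 1953 NTSC arithmetic from the published channel structure.
audio_offset_hz = 4_500_000              # fixed spacing between video and audio carriers
line_rate_hz = audio_offset_hz / 286     # sound carrier = 286th harmonic of line rate
frame_rate_hz = line_rate_hz / 525       # 525 lines per frame
subcarrier_hz = line_rate_hz * 455 / 2   # 455th (odd) harmonic of half the line rate

print(round(line_rate_hz, 3))    # 15734.266 Hz (was 15750 Hz in black and white)
print(round(frame_rate_hz, 5))   # 29.97003 fps, exactly 30/1.001
print(round(subcarrier_hz))      # 3579545 Hz, the familiar 3.58 MHz color subcarrier
```

Note that 4,500,000 / 286 / 525 is exactly 30000/1001, which is where the 1.001 factor that haunts the industry comes from.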
Because older NTSC black-and-white TV sets had very generous tolerances in the 1950s, the change was backward compatible, allowing home viewers to watch programming on both new color sets and older black-and-white models for years to come.
The chart illustrates multiple frame rates and how the industry must "add," "drop," "interpolate," etc., to move between them.
What wasn’t seen at the time was how that decision would carry forward to affect frame rates and time code in the future. “There are two key reasons why timecode is so important,” said Pallett. “One is frame accuracy so you can correctly identify which frame you want to use for either alignment or editing and the second is media duration.”
To compensate, drop-frame timecode was established. It works a bit like leap years: video actually runs at 29.97 frames per second, so if you count 30 frames for every second, the timing drifts by 3.6 seconds an hour, or nearly a minute and a half each day. Since broadcast systems can’t operate with that level of inaccuracy, the fix was to skip two frame numbers every minute except on multiples of ten minutes. No actual frames are discarded; only the labels jump. The method fixed the problem.
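The skip pattern can be sketched in a few lines. This is a minimal illustration of the counting rule, not any vendor’s implementation; the function name is ours.

```python
def frames_to_drop_frame_tc(frame_count: int) -> str:
    """Label a 0-based frame count with 29.97 drop-frame timecode.

    Frame numbers 00 and 01 are skipped at the start of every minute
    except minutes divisible by ten. No actual frames are dropped;
    only the labels jump ahead.
    """
    frames_per_10_min = 10 * 60 * 30 - 9 * 2   # 17982 real frames per 10 minutes
    frames_per_min = 60 * 30 - 2               # 1798 real frames per "dropped" minute
    tens, rem = divmod(frame_count, frames_per_10_min)
    count = frame_count + 18 * tens            # 18 labels skipped per 10 minutes
    if rem > 2:
        count += 2 * ((rem - 2) // frames_per_min)
    h, m, s, f = count // 108000, count // 1800 % 60, count // 30 % 60, count % 30
    return f"{h:02d}:{m:02d}:{s:02d};{f:02d}"  # ';' conventionally marks drop-frame

print(frames_to_drop_frame_tc(1799))    # 00:00:59;29
print(frames_to_drop_frame_tc(1800))    # 00:01:00;02  (labels 00 and 01 skipped)
print(frames_to_drop_frame_tc(17982))   # 00:10:00;00  (no skip at minute 10)
```

After a nominal hour (107,892 real frames), the label returned is exactly 01:00:00;00, which is how the count stays within a few frames of the wall clock.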
“But the message here is that when you mix frame rates and you mix timecode systems, you can generate both operational and system problems,” said Pallett.
Devlin went through a series of war stories citing a myriad of problems from major and minor broadcasters. The lesson: “Don’t expect to be able to buy a magic box that will make all the pictures right,” he said. “It’s actually the operational and systems problems that no one single box can fix.
Bruce Devlin, media technologist for Dalet.
“It’s good working practices and having an understanding of what you are doing that’s going to make mixing different frame rates work. One person I interviewed for this presentation said the best way to mix frame rates is not to do it at all,” Devlin said. “Fair enough, I can’t argue with that.”
Working with Devoncroft Partners, a market research firm, the presenters found that more than $70 billion a year in broadcast revenue depends on drop-frame timecode. Switching from 29.97 to an all-integer 30 frames per second system would cost more than $50 billion.
“A lot of money is riding on drop-frame timecode distribution,” Pallett said, noting the huge cost of switching from fractional to integer frame rates. “Since it’s not going to happen, we need to focus on how to make fractional time code work.”
Since there is no official standard for handling 29.97 consistently, different systems work in different ways, which complicates the situation. “With the coming of UHDTV, a drop-frame standard for high fractional frame rates would help,” said Devlin. “In an ideal world, we would have a standard.”
SMPTE is working on a standard specifying a time address for high frame rate signals and its data structure in the ancillary data space. The document will specify rates of 72, 96, 100, 120 and 120/1.001 frames per second and it will be extensible to cover rates of up to 960 frames per second.
SMPTE is an internationally recognized standards development body.
Another improvement would be to always know a program’s time base, which is not always possible. This is especially important in IMF (Interoperable Master Format). IMF packages may contain edit points and multiple streams that need to be aligned. They carry a content duration, but getting an accurate duration for complex EDL assets can be difficult.
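One way to reduce duration ambiguity is to keep the frame rate as an exact rational number rather than a rounded float, which drifts over long programs. A small sketch under that assumption (the function name is ours, not part of IMF):

```python
from fractions import Fraction

def exact_duration_seconds(frame_count: int,
                           rate: Fraction = Fraction(30000, 1001)) -> Fraction:
    # Exact running time of frame_count frames; no float rounding error.
    return frame_count / rate

# A nominal broadcast hour of 29.97 material (107,892 frames) is not 3600 s:
print(float(exact_duration_seconds(107892)))   # 3599.9964
```

Carrying the 30000/1001 rational through a workflow means two tools computing the duration of the same asset will agree to the frame, which is exactly the kind of alignment IMF needs.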
“Avoid ambiguity,” Pallett said. “IMF needs to have accuracy. The media needs to be annotated correctly and every tool in IMF needs to be correct in order to make everything work.”
Problems, he said, normally go away when systems that don’t talk to each other are aligned and when teams are given flexibility in working with material that has different time bases.
When timecode and frame rates go wrong... well, you get the "picture"... or not.
Incompatibility issues tend to occur where drop-frame and non-drop-frame timecode intersect and when systems are not configured correctly. This is compounded if you don’t know the time base of your media, can’t control the process and have no standards to follow.
A decision made in 1953 — in a much simpler time — has led to the confusion that still haunts the video industry today. It will take time, education and standards to work it out.
SMPTE Educational Webcast Seminars are supported by AJA Video Systems, Blackmagic Design, Brightcove, Ensemble Designs and Telestream.