Broadcast For IT - Part 4 - NTSC Line and Frame Relationships
In this series of articles, we explain broadcasting for IT engineers. Television is an illusion: there are no moving pictures, and today's broadcast formats are heavily dependent on decisions engineers made in the 1930s and 1940s. In this article we look at how American NTSC video lines and frames relate to each other, and the consequences for the digital derivatives now prevalent throughout the world.
The National Television System Committee (NTSC) was the body responsible for developing North American broadcast standards. It published the first black-and-white standard in 1941, followed by a second standard for color television in 1953.
Two Fields One Frame
As described in previous articles, video pictures are made from individual frames, and each frame is made up of many lines. Prior to digital displays, cathode ray tubes (CRTs) fired a beam of electrons towards the front of the screen, energizing phosphors to provide brightness. Electromagnetic coils around the CRT deflected the beam so it traced a raster of lines across and down the screen.
One complete vertical and horizontal scan resulted in one frame. When interlace was introduced, each frame was split into two fields. Field-one traced out the odd lines – 1,3,5 etc., and field-two traced out the even lines – 2,4,6 etc. Field-one would be scanned first followed by field-two. Combined, they would form one frame.
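As a simple illustration - the line numbering here is purely illustrative rather than the formal NTSC numbering - the following Python sketch shows how odd and even lines split into two fields and reassemble into a complete frame:

```python
# Illustrative only: split a 525-line frame into two interlaced fields.
TOTAL_LINES = 525

field_one = list(range(1, TOTAL_LINES + 1, 2))   # odd lines: 1, 3, 5, ...
field_two = list(range(2, TOTAL_LINES + 1, 2))   # even lines: 2, 4, 6, ...

# Interleaving the two fields reproduces every line of the frame exactly once.
assert sorted(field_one + field_two) == list(range(1, TOTAL_LINES + 1))
print(len(field_one), len(field_two))  # 263 262 - an average of 262.5 lines per field
```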
Not All Lines Displayed
Broadcasting black-and-white pictures was a straightforward process: 30 frames per second were displayed, with each frame consisting of 525 lines. However, not all the lines were displayed, as some were used for frame flyback - the time taken for the electron beam to trace from the bottom of the screen back to the top, a process taking a finite amount of time.
As CRTs and scan coils relied on analog electronics, there was some margin of error in representing the image. The size and position of the image on the screen would vary between television manufacturers and individual sets. To counteract this, overscan was adopted - the picture is made slightly bigger than the screen, resulting in some cropping at the top and bottom, and at the left and right sides.
SD is 480i
Flyback time and overscan further reduce the number of lines displayed, and NTSC television sets produced only 480 visible lines of the total 525. This is why modern American digital standard definition formats are referred to as 480i, the "i" indicating interlace, even though they are based on 525-line systems.
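Taking the figures above at face value, a quick calculation shows how much of each frame never reaches the viewer:

```python
# Rough arithmetic using the figures quoted in the text.
TOTAL_LINES = 525     # lines per frame in the NTSC system
VISIBLE_LINES = 480   # lines carried through to digital 480i

blanked = TOTAL_LINES - VISIBLE_LINES
print(blanked)                          # 45 lines lost to flyback and overscan
print(f"{blanked / TOTAL_LINES:.1%}")   # about 8.6% of each frame is never seen
```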
Backwards compatibility is both a major benefit and the Achilles' heel of the broadcast industry. Prior to the digital revolution, television sets, with their heavy glass CRTs, were expensive and, in the early days, temperamental and unreliable.
Maintain Backwards Compatibility
When the color standard was being designed, a prerequisite was that color broadcasts must be backwards compatible with existing black-and-white sets. Color information had to be combined with the black-and-white signal in such a way that both existing black-and-white sets and new color sets could decode and show the same broadcast; this was far preferable to transmitting separate color and black-and-white broadcasts.
The output of a color camera is split into two parts, chroma and luma, derived from its red, green and blue sensors. Chroma is the color part of the signal and luma the black-and-white part. Chroma was modulated onto the luma in such a way that black-and-white TVs could ignore it, while color TVs could decode and use it. Exactly how this works is the subject of a later article; for now, it is enough to know that the color was modulated onto a carrier called the color sub-carrier (CSC).
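As a rough illustration, the sketch below assumes the classic NTSC luma weighting and carries chroma as simple color-difference signals; the real encoder's I/Q axes and sub-carrier modulation are left to the later article:

```python
# Simplified sketch of the luma/chroma split, assuming the classic NTSC
# luma weighting. The real I/Q chroma axes and sub-carrier modulation
# are not shown here.
def split_luma_chroma(r, g, b):
    """Take normalized RGB (0.0-1.0) and return luma plus two color-difference signals."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # black-and-white compatible luma
    return y, b - y, r - y                  # chroma carried as B-Y and R-Y differences

# Pure white gives full luma and zero chroma, so a black-and-white set sees
# a normal picture and a color set has nothing extra to decode.
print(split_luma_chroma(1.0, 1.0, 1.0))     # approximately (1.0, 0.0, 0.0)
```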
Stable Pictures Demanded
Maintaining a mathematical relationship between lines and fields is key to broadcasting stable pictures; it also makes electronic design easier and consequently cheaper - a major consideration for domestic television manufacturers.
In the black-and-white NTSC system there are 30 frames per second, each frame consists of 2 fields, and each field consists of 262.5 lines. This relationship could easily be established using a single master oscillator and phase-locked loops to derive the line and field rates.
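A quick check of the arithmetic shows how the line rate falls out of this relationship:

```python
# The black-and-white NTSC relationship between frames, fields and lines.
LINES_PER_FRAME = 525
FIELDS_PER_FRAME = 2
FRAMES_PER_SECOND = 30

print(LINES_PER_FRAME / FIELDS_PER_FRAME)     # 262.5 lines per field
print(FRAMES_PER_SECOND * FIELDS_PER_FRAME)   # 60 fields per second
print(LINES_PER_FRAME * FRAMES_PER_SECOND)    # 15750 lines per second, i.e. 15.750kHz
```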
To maintain stability and simplicity of design, the CSC needed to be related to the line and field frequencies. Using 30 frames per second allowed for a CSC frequency of 3.898125MHz, related to the line rate by the ratio (495/2) * 15.750kHz. However, when testing this CSC frequency, engineers found that visual interference and flickering occurred on black-and-white TVs, caused by sub-harmonics created by the interaction and modulation between the transmission audio carrier and the CSC frequency.
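The candidate sub-carrier frequency can be checked directly from the line rate:

```python
# The original color sub-carrier candidate, locked to the 15.750kHz line rate.
LINE_RATE_HZ = 15_750
candidate_csc_hz = (495 / 2) * LINE_RATE_HZ   # 247.5 times the line rate

print(candidate_csc_hz)                       # 3898125.0, i.e. 3.898125MHz
```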
Reduce CSC
Further testing demonstrated that by reducing the frequency of the CSC, the flickering disappeared.
Consequently, the CSC was reduced by approximately 8% to the ratio 315MHz/88 = 3.579545455MHz. To derive the line frequency, the ratio (315MHz/35)/572 was used, giving 15.73426573kHz. To derive the frame rate, the line frequency is divided by 525, giving 29.97002997 frames per second, or 59.94005994 fields per second.
A more convenient way of expressing the field and frame rates is as the ratios 60/1.001 and 30/1.001.
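Because all of these figures are exact ratios, they are easy to verify - for example with Python's Fraction type:

```python
from fractions import Fraction

# Exact verification of the reduced sub-carrier and the rates derived from it.
csc = Fraction(315_000_000, 88)                  # color sub-carrier
line_rate = Fraction(315_000_000, 35) / 572      # 9MHz / 572
frame_rate = line_rate / 525
field_rate = frame_rate * 2

print(float(csc))          # 3579545.4545... Hz
print(float(line_rate))    # 15734.2657... Hz
print(float(frame_rate))   # 29.97002997... frames per second
print(float(field_rate))   # 59.94005994... fields per second

# The same rates in their more convenient 1.001 form.
assert frame_rate == Fraction(30) / Fraction(1001, 1000)   # 30/1.001
assert field_rate == Fraction(60) / Fraction(1001, 1000)   # 60/1.001
```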
Digital Uses 59.94 Fields
Field and frame rates of 60/1.001 and 30/1.001 are still used extensively throughout the USA, and in the rest of the world where broadcast systems are based on NTSC or must remain backwards compatible with it. Even the most modern digital formats retain these rates.
More often you will see an American format referred to as 59.94i.
Field rates of 59.94 are incredibly difficult to work with: they don't easily convert to the European rate of 50 fields per second as there is no simple integer relationship between them, and counting frames against clock time is even more challenging as there is not an integer number of frames in a minute.
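A short calculation makes the problem concrete:

```python
from fractions import Fraction

frame_rate = Fraction(30000, 1001)   # 29.97002997... frames per second
field_rate = Fraction(60000, 1001)   # 59.94005994... fields per second

print(float(field_rate / 50))        # 1.1988... - no simple ratio to 50 fields per second
print(float(frame_rate * 60))        # 1798.2017... - not a whole number of frames per minute
```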
Incompatible Frame Rates Cause Disturbance
Modern digital transcoders, standards converters and cameras often have a setting for a true 60 fields per second, which is not the same as 59.94 fields per second. Trying to convert between the two will result in jittery, difficult-to-watch pictures with random frame loss and disturbance.
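The mismatch looks small, but it accumulates quickly; a rough calculation (assuming a true 60 fields per second source feeding a 59.94 chain) shows why:

```python
from fractions import Fraction

true_60 = Fraction(60)               # a genuine 60 fields per second source
ntsc_60 = Fraction(60000, 1001)      # 59.94... fields per second

slip_per_second = true_60 - ntsc_60  # 60/1001 of a field every second
print(float(slip_per_second * 60))   # roughly 3.6 fields of slip per minute
print(float(1 / slip_per_second))    # a field must be dropped or repeated every ~16.7 seconds
```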
Great care must be taken when configuring systems that operate at 59.94i, otherwise picture disturbance will be abundant. In the next article we will look at how time is represented in 59.94i systems.