Interlace: Part 1 - It Began In The 1930s

At one time broadcast television was inseparable from interlace. Things have since moved on, but not without interlace leaving a legacy. So what was it all about?

I suppose the story starts with a bit of information theory. Television began by using scanning. Light from the scene fell on a target to liberate electrons and the two-dimensional pattern of electrons was scanned by an electron beam that moved in a series of parallel lines. That’s all very simple. It is also simple to grasp that increasing the resolution of the picture requires two things.

Firstly, the lines have to be closer together to improve the vertical resolution. For a given frame rate, the line rate goes up and the duration of each line gets shorter. Secondly, the video signal has to pick up more horizontal detail but has less time in which to do it. The result is that the bandwidth needed goes as the square of the resolution.
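
To make the square law concrete, here is a rough back-of-envelope sketch in Python. It assumes roughly square picture elements and a 4:3 aspect ratio, and it ignores blanking entirely; the only point is that doubling the line count quadruples the bandwidth.

```python
def approx_video_bandwidth_hz(lines, frame_rate_hz, aspect_ratio=4/3):
    """Very rough luminance bandwidth for a progressive raster.

    Assumes roughly square picture elements, so horizontal detail grows
    with the line count, and ignores blanking. One cycle of video can
    carry two picture elements, hence the division by two.
    """
    pixels_per_line = lines * aspect_ratio
    pixels_per_second = pixels_per_line * lines * frame_rate_hz
    return pixels_per_second / 2

for lines in (240, 480, 960):
    print(f"{lines} lines: roughly {approx_video_bandwidth_hz(lines, 30) / 1e6:.1f} MHz")
# Doubling the lines from 240 to 480 takes roughly 1.2 MHz to roughly 4.6 MHz - four times as much.
```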

The scanning process would proceed across the scene until all of the lines had been sent and then begin again. The video signal was displayed on a cathode ray tube and a given point on the picture was refreshed once per scan. This meant that the picture flickered at the picture rate. To minimize flicker visibility, the picture rate was chosen to be 60Hz in the USA and 50Hz in the UK, corresponding to the public AC power frequencies, which just happened to be a little above the critical flicker frequency (CFF) of human vision.

Given that the picture rate should not fall below the CFF, the number of lines in the picture determines the bandwidth needed. At the time (the 1930s) it was proposed to have several hundred lines in the picture to give acceptable performance, so the bandwidth needed was huge, and amplitude modulation doubled it because of the upper and lower sidebands needed by AM. That was attacked using vestigial sideband transmission, in which the lower sideband was barely there and the channel bandwidth was reduced, but there was still a problem.

Enter interlace, which is essentially a bandwidth reduction scheme that in most implementations reduces the bandwidth by one half. Before trying to define or justify interlace, it is probably best simply to say what it is. Imagine a picture containing numbered scanning lines. In progressive scanning, the lines are sent one after the other until the whole picture has been scanned. Using interlace, the odd-numbered lines are scanned first, so that every other line is sent. Then the even-numbered lines are sent.
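
A minimal sketch of the idea, using a toy nine-line frame with the lines numbered as above:

```python
# A toy frame with numbered lines: interlace sends the odd-numbered lines
# as one field and the even-numbered lines as the next.
frame_lines = list(range(1, 10))                      # lines 1..9

odd_field = [n for n in frame_lines if n % 2 == 1]    # sent first
even_field = [n for n in frame_lines if n % 2 == 0]   # sent second

print("odd field :", odd_field)     # [1, 3, 5, 7, 9]
print("even field:", even_field)    # [2, 4, 6, 8]
```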

Some new terminology is needed. The whole picture is a frame and it is scanned into two fields, odd and even. Fig.1 shows how it was done in practice. Scanning a CRT required two saw-tooth waveforms, one at line rate and one at frame rate. In interlaced scan, there must be an odd number of lines in the frame and the slower of the saw-tooth waveforms is now at the field rate. Here we see the main advantage of interlace, which is that it is trivially easy to implement. All that is necessary is to change the time constants of the saw-tooth generators. The cost was negligible at both camera and display, which was important when everything had to be done with vacuum tubes.

Fig.1- Interlace is simply achieved by messing with the scanning saw-tooth waveforms so that a frame having an odd number of lines is broken into two fields.

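A small sketch, taking the familiar 525-line raster as an example, shows why the odd line count matters: with the vertical saw-tooth running at field rate and the line saw-tooth free-running, the lines of the second field automatically start half a line further down and so fall midway between the lines of the first field.

```python
# Toy model of Fig.1: a vertical saw-tooth at field rate and a free-running
# line saw-tooth. With an odd line count (525), each field holds 262.5 lines,
# so successive fields start their lines half a line apart vertically.
lines_per_frame = 525
lines_per_field = lines_per_frame / 2        # 262.5 - the half line does the work

def vertical_start(line_index):
    """Fraction of picture height at which a given line begins."""
    return (line_index % lines_per_field) / lines_per_field

for n in range(3):
    print(f"field 1, line {n}: {vertical_start(n):.4f}")
for n in range(263, 266):
    print(f"field 2, line {n}: {vertical_start(n):.4f}")
# Field 2's lines land at 0.0019, 0.0057, ... - midway between field 1's
# lines at 0.0000, 0.0038, ...
```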

The electron beam in the display now very nearly revisits the same point on the screen once per field. The intended effect of interlace is that the flicker rate should be set by the field rate, which means that the frame rate can be half of that, either 25 or 30Hz, thereby halving the bandwidth needed for a given resolution. One definition of what interlace is supposed to do is to double the flicker rate for a picture with a given number of lines. Another way of looking at it is that for a given flicker visibility, it is supposed to double the vertical resolution.

Those are the things that interlace is supposed to do. But did it ever do them? Well, of course it didn’t. Interlace is too much like the mythical free lunch. For an expenditure of practically nothing, interlace is going to halve the bandwidth needed with no downside?

Of course it can’t, and one of the difficulties was the introduction of a whole slew of visible problems. Sadly, in the course of resolving the worst of those problems, the advantages of interlace got lost along the way and all that survived were the claims.

However, let’s be fair and see if there is a benefit somewhere along the line. Consider the vertical/temporal spectrum of progressive scan. The base band is, of course, rectangular, just the same as the scanning lines appear when viewed edge-on with time advancing to the right. In most TV cameras, the image is integrated for a significant part of the picture period, which means that moving detail is lost. The top right-hand corner of the spectrum doesn’t contain much information.

Fig.2 shows the situation with interlace. The view of the lines edge-on now reveals a quincunx pattern (like the five spots on a die) and, not surprisingly, the theoretical vertical/temporal base band is now triangular, because interlace cuts the information capacity in half. However, the part of the base band that is cut off is the part that contains little information, so we might consider interlace to be a kind of source coder that filters the useful from the less so.

That would be true if we properly implemented triangular filters at the camera and display, but with 1930s vacuum tube technology that was impossible, so the benefit was only ever academic.

Fig.2 - The vertical/temporal spectrum of interlace is triangular, but the necessary filters were never implemented.

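A short sketch can print that sampling lattice viewed edge-on: vertical position runs down the page, successive fields run across it, and 'o' marks a line sent in that field. Which parity goes in which field is an arbitrary choice here.

```python
# The interlaced sampling lattice viewed edge-on: rows are line positions,
# columns are successive fields, 'o' marks a line transmitted in that field.
lines = 8
fields = 6

for line in range(lines):
    row = "".join(" o" if line % 2 == field % 2 else " ." for field in range(fields))
    print(f"line {line}:{row}")
# Adjacent rows are offset by one field, giving the quincunx (die-face) pattern.
```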

Another way of looking at interlace is to consider it as a compression technique. Considering it cuts bandwidth, it’s a possibility. However, with interlace there is no decoder. The information sent is one half of the original and there is no way of getting the other half back. The only way in which interlace behaves like a compression technique is that it compromises the performance of other compression techniques such as MPEG.

Fig.3 shows the sub-optimal zig-zag scan required in MPEG to order DCT coefficients for an interlaced field. This was announced at about the same time as Boris Yeltsin became the Russian leader. As he was quite fond of the vodka, I named it the Yeltsin walk.
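
For reference, the conventional zig-zag used for progressive material can be generated in a few lines of Python. The alternate scan that MPEG-2 defines for interlaced material, the Yeltsin walk of Fig.3, follows a different, more vertically biased path and is not reproduced here.

```python
def zigzag_order(n=8):
    """(row, col) pairs of an n x n block in conventional zig-zag order."""
    def key(p):
        d = p[0] + p[1]                      # which anti-diagonal
        # odd diagonals run top-right to bottom-left, even ones the reverse
        return (d, p[0] if d % 2 else p[1])
    return sorted(((r, c) for r in range(n) for c in range(n)), key=key)

print(zigzag_order()[:10])
# [(0, 0), (0, 1), (1, 0), (2, 0), (1, 1), (0, 2), (0, 3), (1, 2), (2, 1), (3, 0)]
```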

One of the assumptions made about interlace is that the flicker frequency is the field rate. Unfortunately, it’s not true. The electron beam doesn’t quite revisit the same place on the screen in each field; it is one line out. In the presence of vertical detail in the picture, adjacent lines in the two fields can have different brightness, and that difference is reproduced at frame rate, not field rate: 30Hz in the USA.

Fig.3 - The zig-zag scan used in MPEG to sequence coefficients in interlaced fields is a big mess.


The late William Schreiber of MIT was no big fan of interlace and he had a perfectly legal NTSC test signal that generated maximum vertical resolution, with alternate lines in the frame being black and white. The result with interlace was that alternate fields were totally black and totally white, flickering at 30Hz. It was impossible to watch it without getting a headache or worse.
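
A toy simulation of that kind of signal (not Schreiber’s actual test material) makes the point:

```python
# A frame whose lines alternate black (0) and white (1), viewed with 2:1 interlace.
frame = [n % 2 for n in range(480)]        # line 0 black, line 1 white, ...

even_lines = frame[0::2]                   # one field: all black
odd_lines = frame[1::2]                    # the other field: all white

print("mean of one field  :", sum(even_lines) / len(even_lines))   # 0.0
print("mean of other field:", sum(odd_lines) / len(odd_lines))     # 1.0
# The display alternates between an all-black and an all-white field,
# i.e. full-screen flicker at the 30Hz frame rate.
```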

Another striking finding I remember was when a comparison was made between progressive scan and interlaced scan on two identical 24-inch CRT monitors with an identical line count and frame rate. This was in a large conference hall and I must have been around 100 feet away, a distance at which I could say nothing about resolution. At that distance it was obvious which monitor was showing the interlaced picture, because of the 30Hz flicker content. The progressively scanned monitor didn’t flicker.

Camera manufacturers were conscious that they would get the blame if their pictures flickered, so interlaced cameras almost all contained vertical filters to stop it. There were three results. Firstly, the vertical resolution promised by interlace was not obtained; it was much the same as if the same line rate had been used in a progressive system. Secondly, the users of such systems formed a completely incorrect view of the relationship between the number of lines in a TV standard and the results. Thirdly, flicker was effectively suppressed on still pictures, because the picture rate was essentially the field rate.
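
As an illustration of what such prefiltering does to the worst-case pattern simulated above, a crude two-line vertical average (real camera filters took more subtle forms than this; it is only an illustration) removes the twitter and the vertical resolution together:

```python
# Average each line of the alternating black/white frame with its neighbour.
frame = [n % 2 for n in range(480)]

filtered = [(frame[n] + frame[min(n + 1, len(frame) - 1)]) / 2
            for n in range(len(frame))]

print(set(filtered[:-1]))    # {0.5}: uniform grey - no twitter, but no detail either
```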

Unfortunately, motion before the camera causes differences to appear between successive fields and that opened up another can of worms, because the differences were indistinguishable from those due to vertical detail and appeared at frame rate. The damage to motion portrayal was considerable and had to be dealt with by a fundamental change which will be considered in Part 2.
