Interlace: Part 2 - Vertical Resolution

The human eye is not fixed: it can track moving objects, both in real life and on screens. That tracking action changes everything. Without an understanding of tracking, everything seems peculiar. With an understanding, it becomes obvious why certain things don’t work. Interlace is one of them.

When the eye tracks a moving object, it tries to keep the object stationary on the retina so it can be seen without temporal smear. In real life the idea works very well. When watching displays it works less well, because the associated camera and display may be adding temporal smear to the moving images.

Once the eye moves to track something, it is no longer working along the time axis. Instead it is working along an axis of optic flow and in general this will be inclined to the time axis.

Fig.1a) shows the scanning lines of an interlaced format as if seen from the edge of the screen, with time advancing to the right. If nothing moves, the odd and even fields of interlace combine to create a frame containing all of the picture lines and everything works fine. However, Fig.1b) shows the situation with slow motion of one line per field period.

Now there is a problem because the interlacing is no longer working. Instead the sampling rate, the Nyquist frequency and the resolution have been halved and any frequency above the lowered Nyquist frequency will alias. The Kell factor will have been significantly reduced.

Double the motion speed and the lines will interlace again. At three times the original speed, interlace fails once more. The system acts like a comb filter, repetitively halving the vertical resolution.
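For readers who want the geometry spelled out, the following short Python sketch (my own construction, not part of the original article) models the vertical sample positions contributed by the two fields as a function of vertical velocity, measured in frame-lines per field period. The comb-like alternation between interleaving and coincidence is the behaviour described above.

```python
# Minimal model of interlaced vertical sampling under vertical motion.
# Positions are measured in frame-line pitches: field 1 samples the even
# lines, and field 2, one field period later, samples the odd lines. If the
# image moves vertically by v frame-lines per field period, field 2's
# samples land at an offset of (1 - v) relative to field 1's grid in image
# coordinates.

def field_offset(v):
    """Offset between the two fields' sampling grids for vertical velocity v
    (frame-lines per field period), wrapped onto the 2-line field pitch."""
    return (1 - v) % 2.0   # 1.0 -> perfectly interleaved, 0.0 -> coincident

for v in [0, 1, 2, 3, 0.5]:
    offset = field_offset(v)
    if offset == 1.0:
        verdict = "fields interleave: full vertical resolution"
    elif offset == 0.0:
        verdict = "fields coincide: sampling rate, Nyquist and resolution halved"
    else:
        verdict = "partial offset: irregular vertical sampling"
    print(f"v = {v} line/field -> offset {offset}: {verdict}")
```

On this simple model interlace works at even velocities (0, 2, 4 lines per field) and fails at odd ones (1, 3, 5), which is the repetitive, comb-filter-like behaviour referred to above.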

The difficulty is worst for motion in the direction in which the lines interleave. Early experimental “high definition” television systems between the wars used an aspect ratio of about 5:3, which meant that the vertical scanning amplitude was the smaller of the two. It made sense to use the smaller amplitude for the higher scanning frequency, so the picture was scanned in vertical columns and the slower, picture-rate scan proceeded horizontally.

Once interlace was tried with such a system the results were terrible because in real pictures, most of the motion most of the time is horizontal. Things move across the surface of the Earth more readily than they move up and down. One of the longest lasting legacies of interlace is that it forced scanned television formats to use horizontal scanning in lines. This puts the majority of motion in real pictures at right angles to the direction in which interlace fails.

Fig.1 a) In the absence of motion, lines from successive fields create a complete frame. b) With certain vertical motion, the lines in successive fields no longer interlace and resolution is halved. c) Progressive scan without motion achieves design resolution. d) Progressive scan with certain motion becomes interlaced and resolution rises!


A notable exception to the predominance of horizontal motion is the rolling credits at the end of a movie. Those of us who are old enough to remember renting movies on VHS tape will recall that the credits were practically illegible.

As has been seen, another problem of interlace is that high vertical resolution is accompanied by frame rate (25 or 30Hz) flicker. The vertical filters that manufacturers put in to ameliorate the flicker also served to ameliorate the vertical aliasing when interlace failed due to vertical motion.

However, the result of using vertical filtering to prevent flicker and aliasing was that the vertical resolution of an interlaced picture in real life was determined by the number of lines in a field, not the number of lines in a frame. The unfortunate truth of interlace was that in practice it damaged the picture in direct proportion to the amount of bandwidth reduction. To be blunt, it didn’t work: no banana, no free lunch. If you throw away half of the information with no means to recover it, what else do you expect?
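As a hedged illustration (my own, assuming a simple two-line averaging filter rather than any particular manufacturer's design), the sketch below shows why that vertical filtering discards the frame-based detail. The finest vertical detail a frame can carry alternates on every frame line, and averaging adjacent lines, which is what suppresses the 25/30Hz flicker, removes it completely.

```python
# Frame lines alternately bright and dark: the highest vertical frequency a
# full interlaced frame can represent, and also the pattern that flickers at
# the frame rate when displayed as two fields.
finest_frame_detail = [1.0, 0.0] * 4

# Assumed two-tap vertical average across adjacent frame lines, i.e. across
# lines that come from the two different fields.
filtered = [(a + b) / 2 for a, b in zip(finest_frame_detail, finest_frame_detail[1:])]

print(filtered)   # -> all 0.5: the line-alternating detail (and its flicker)
                  #    is gone, leaving roughly what one field alone could carry
```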

The eye seems to judge a picture by its worst aspect, so it doesn’t matter how good the horizontal resolution is; the quality will be set by the poor vertical resolution. This was unfortunate, as most of the tests applied to television signals measured horizontal resolution only, and those that did address vertical resolution used static test cards and were therefore meaningless.

A moment’s thought will suggest that motion must have an effect on progressively scanned pictures too and to make a fair comparison we ought to look at that.

Fig.1c) shows a static progressively scanned picture observed along the time axis and all is well. Fig.1d) shows that slow vertical motion has effectively doubled the number of picture lines. Instead of making things worse as interlace does, progressive scan makes things better. Once more, motion acts like a comb filter, this time repetitively doubling the vertical resolution.
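Under the same simplified model used earlier (again my own sketch, not from the article), the progressive case can be checked too. Here every picture samples every line, so motion can only add distinct sample positions, never remove them; at certain velocities the samples from successive pictures interleave and the effective vertical density doubles.

```python
# Progressive pictures sample every frame line; motion of v lines per
# picture period shifts the next picture's samples by -v in image
# coordinates.

def picture_offset(v):
    """Fractional offset between successive progressive pictures' line grids."""
    return (-v) % 1.0   # 0.0 -> samples repeat, 0.5 -> samples interleave

for v in [0.0, 0.5, 1.0, 1.5]:
    offset = picture_offset(v)
    if offset == 0.5:
        verdict = "successive pictures interleave: vertical resolution doubles"
    elif offset == 0.0:
        verdict = "successive pictures coincide: design resolution is preserved"
    else:
        verdict = "partial offset"
    print(f"v = {v} line/picture -> offset {offset}: {verdict}")
```

The worst case for progressive scan is simply its design resolution; the best case is double it, which is the point Fig.1d) makes.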

Fig.1d) also allows us to look askance at the whole subject. There appears to be no difference between the results of a static interlaced picture in Fig.1a) and a progressively scanned picture with a certain constant vertical motion. In both, lines from two pictures interlace together.

That being so, what, then, is the difference between progressive and interlaced scan? The answer is very simple. Instead of counting the number of lines in the frame of an interlaced standard and being disappointed that they practically halve in the presence of vertical motion, we should count the number of lines in a field and give thanks that they double at certain vertical velocities, including zero. In other words, the greatest problem with interlace is that the way the scanning standard is described causes the performance to be over-specified by a factor of two.

The line count specified in an interlaced scanning standard should be the number of lines in the field, not in the frame.

The other major difference is that no progressive scan format would use a picture rate as low as 25 or 30 Hz, so progressive scan pictures don’t flicker.

The real problem is common to all technologies, which is that progress is made by trial and error in the early stages because the mechanisms involved are not understood. The Wright brothers had no aerodynamics textbooks to read and had to resort to guesswork a lot of the time. The Wright Flyer was just this side of a death trap, as those who tried to fly replicas found out.

So it was with television, where concepts such as eye tracking, optic flow axes and dynamic resolution were simply unknown in the beginning. My own awareness of these things began in 1980, when I visited the United States for the first time. I remember to this day turning on an American TV set that had “only” 525 lines, expecting the pictures to be softer than the “superior” 625 lines of British TV. I stared in disbelief, because they weren’t. If anything the pictures were better, and it took me quite a while to figure out why.

The first book I saw on the subject was William Schreiber’s work of 1991, which refers to “the disappearance of half the scan lines during vertical motion” when using interlace and concludes that “its use in new TV systems does not appear to be advisable”. This was based on an SMPTE paper he gave in 1984. All of the formats we now call standard definition (NTSC, PAL, SECAM and so on) were designed well before that and were inevitably sub-optimal.

NTSC was sub-optimal because the actual vertical resolution achieved was about half what it should have been and was therefore significantly inferior to the horizontal resolution, which was consequently wasted. In the light of modern understanding, a better result would have been obtained within the same bandwidth using a 60Hz frame rate with progressive scanning and about 350 lines per frame. The horizontal resolution would be reduced slightly to match the increased vertical resolution.
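As a rough back-of-envelope check (my own arithmetic, using nominal figures rather than exact NTSC timings and ignoring blanking), holding the total bandwidth constant while raising the line rate means each line must carry proportionally less horizontal detail:

```python
# Nominal comparison of total scanned lines per second (my assumption of
# ~30 frames/s rather than the exact NTSC rate).
ntsc_lines_per_second = 525 * 30          # interlaced: about 15,750 lines/s
progressive_lines_per_second = 350 * 60   # hypothetical 350-line, 60Hz progressive

# With the channel bandwidth fixed, the bandwidth available to each line
# scales inversely with the total number of lines scanned per second.
relative_horizontal_detail = ntsc_lines_per_second / progressive_lines_per_second
print(f"{relative_horizontal_detail:.2f}")   # ~0.75
```

On these nominal numbers each line would carry roughly three quarters of the horizontal detail; the exact trade depends on the blanking intervals and the frame rate actually chosen.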

Designers of interlaced formats after that time have some explaining to do. Of these, the worst culprit was the standard put forward by the so-called Advanced Television Systems Committee with the goal of bringing high-definition TV to American broadcasting. Schreiber’s views from MIT were well known by that time, and the IT industry had studied interlace at some length and given it the thumbs down. The Department of Defense had also considered interlace in the light of military applications of video and had concluded that it had no future in military thinking. The DoD wrote formally to the ATSC to say as much.

Given that they included interlaced scanning standards and given what was known and what they had been told about interlace at that time, I think it is fair to say that the use of the term “advanced” by the committee was questionable.

Part of the trouble was that certain manufacturers outside the USA had heavy investment in interlaced equipment and wanted to sell it. The reader will have to imagine how they went about that. The other part of the trouble was that broadcasters could see the writing on the wall which said that digital video was just another form of data and that the IT industry would make inroads into broadcasting. Some of the older characters, who had broadcast the arrival of the Mayflower in monochrome, thought that interlace somehow distinguished television from computer-based images. In a sense they were right: interlace made images worse.
