Timing: Part 9 - Time Base Correction

Time base correction is an enabling technology that crops up everywhere, not just in broadcasting.

Whereas generic data, such as a text file or a bank statement, do not have to be delivered at any particular speed, most audiovisual information has an implicit time base that determines the rate at which the data should be presented. Special effects excepted, that rate needs to be constant and equal to the original rate.

Most transmission and recording media cannot meet that requirement and that is where time base correction comes in. Time base correction takes place just before presentation, so that any timing errors introduced before that, whether accidental or deliberate, can be removed.

One of the first applications of time base correction was in analog video recording. The first successful video recorders used rotary heads, and the impact of the spinning heads on the flexible tape set up vibrations that varied the effective speed of the heads with respect to the tape. Such time base correction was originally done in the analog domain.

The off-tape signal was passed through a delay line consisting of inductors and capacitors. The capacitors were varactor diodes. A semiconductor diode that is reverse biased acts like a capacitor, as it builds up charge across the non-conducting junction. The bias voltage changes the charge and with it the capacitance. All diodes do this, but varactor diodes are optimized for use in that way.
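To put some rough numbers on the idea, the sketch below uses the standard abrupt-junction relation for varactor capacitance and the square-root-of-LC delay of one lumped delay line section. The component values are purely illustrative, not taken from any particular device or the article.

```python
# Hedged sketch: how a varactor's capacitance, and hence an LC delay line's
# delay, varies with reverse bias. C_j0, V_bi and n are illustrative values
# for a generic abrupt-junction diode.

def varactor_capacitance(v_reverse, c_j0=30e-12, v_bi=0.7, n=0.5):
    """Junction capacitance in farads for a given reverse bias in volts."""
    return c_j0 / (1.0 + v_reverse / v_bi) ** n

def lc_section_delay(l_henries, c_farads):
    """Delay of one LC section of a lumped delay line: sqrt(L*C)."""
    return (l_henries * c_farads) ** 0.5

for bias in (1.0, 2.0, 4.0, 8.0):
    c = varactor_capacitance(bias)
    print(f"{bias:4.1f} V  ->  {c*1e12:5.1f} pF,  "
          f"{lc_section_delay(10e-6, c)*1e9:5.1f} ns per section")
```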

Analog video signals have the advantage that they contain timing information in the shape of sync pulses occurring on every TV line. By comparing the off-tape sync pulses with those from an accurate reference, it was possible to measure the time base error and derive the waveform to feed the varactor diodes to cancel it. Analog audio waveforms do not contain any embedded timing references, and time base correction of audio is only possible when it is manually controlled for restoration purposes.
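As a minimal sketch of the measurement, assuming a 625-line system with a nominal 64 microsecond line period, the off-tape sync arrival times can be compared with an ideal raster to give the time base error on each line:

```python
# Illustrative only: measuring time base error by comparing the arrival times
# of off-tape line syncs with an ideal reference raster.

LINE_PERIOD = 64e-6  # nominal line period in seconds (625-line assumption)

def time_base_error(off_tape_sync_times, reference_start=0.0):
    """Return the timing error of each off-tape sync against an ideal raster."""
    errors = []
    for n, t in enumerate(off_tape_sync_times):
        ideal = reference_start + n * LINE_PERIOD
        errors.append(t - ideal)   # positive = late, negative = early
    return errors

# Example: syncs arriving progressively late by 100 ns per line
measured = [n * (LINE_PERIOD + 100e-9) for n in range(5)]
print(time_base_error(measured))
```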

As it is not possible to advance a signal, the time base corrector always causes delay. The recording to be corrected is replayed slightly ahead of when it is needed. A fixed delay will cancel that out. A shorter delay will have the effect of advancing the signal; a longer delay will make it emerge later.

Varactor time base correctors were adequate for quadruplex video recorders, but when helical scanning was developed, the longer tracks, holding an entire field, suffered greater time base error and the digital TBC was developed. The digital TBC converted the off-tape signal to binary data that could be delayed by storing it in memory. Unlike analog delay, which is difficult to achieve, delaying data is trivially easy.

Fig.1 - When addressed by an overflowing counter, a memory takes the form of a ring. The read and write addresses are nominally one half turn apart and advance at the same average speed.

Helical scan VTRs could also display a picture over a moderate range of speeds. Professional VTRs used heads that could move along the scanner axis in order to follow the tracks when the tape ran at the wrong speed. The head-to-tape speed also became incorrect, and the time base corrector would put that right. The output field rate was kept the same as the station reference because a field would be repeated if the tape was too slow, or omitted if the tape was too fast.
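A toy model of that repeat/omit behaviour is sketched below; the timings are invented and the logic is only meant to show the principle of locking the output field rate to the reference while the tape runs at the wrong speed.

```python
# Illustrative only: fields arriving off tape go into a store; at every
# reference tick the newest complete field is output, which repeats a field
# when the tape is slow and skips one when it is fast.

def output_fields(off_tape_fields, reference_ticks):
    """off_tape_fields: list of (arrival_time, field_id); reference_ticks: times."""
    out, last, i = [], None, 0
    for t in reference_ticks:
        while i < len(off_tape_fields) and off_tape_fields[i][0] <= t:
            last = off_tape_fields[i][1]        # newest field available; older ones skipped
            i += 1
        out.append(last)                        # repeat the previous field if nothing new
    return out

# Tape running slow: fields arrive every 25 ms, the reference wants one every 20 ms
tape = [(n * 0.025, f"F{n}") for n in range(6)]
ref = [n * 0.020 for n in range(6)]
print(output_fields(tape, ref))                 # some fields repeat
```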

In composite formats, such as the C-format, variable speed also destroyed the field sequences of the composite signal, which had to be decoded and re-encoded with respect to station reference.

Fig. 1 shows one approach to the time base corrector, which is to address a memory using a counter. As the counter has a finite word length, it will overflow back to zero after the maximum count is reached. This has the effect of making the memory into a ring. Data written into the memory will be available for one revolution of the ring and will then be overwritten.
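A minimal sketch of Fig.1's ring, assuming a deliberately tiny memory, shows how the overflow of the write counter turns a linear memory into a ring in which data survive for exactly one revolution:

```python
# Sketch only: a memory addressed by a counter of finite word length.
# The counter wraps (overflows) so the memory behaves as a ring.

ADDRESS_BITS = 4                     # tiny memory: 16 locations, for illustration
SIZE = 1 << ADDRESS_BITS

memory = [None] * SIZE
write_address = 0

def write_sample(sample):
    """Write one sample and advance the counter; overflow wraps back to zero."""
    global write_address
    memory[write_address] = sample
    write_address = (write_address + 1) & (SIZE - 1)   # counter overflow

def read_sample(read_address):
    """Read from any address; data persist for one revolution of the ring."""
    return memory[read_address & (SIZE - 1)]

for s in range(20):                  # write 20 samples into 16 locations
    write_sample(s)
print(read_sample(3))                # sample 19 has overwritten sample 3: prints 19
```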

Typically the write clock varies in speed with the instability of the data, whereas the read clock runs at constant speed derived from an accurate reference.

The greatest range of correction will exist when the average write address is opposite the read address, as then the data can be advanced and delayed by an equal amount. As data will be lost if the delay range is exceeded, and the read data clock must have a constant rate, it is the writing of the data that must be controlled to center the delay. The difference between the two addresses reveals the state of memory centering.
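Continuing the sketch above, the centering measurement might look like this; the half-revolution target and the sign convention are assumptions for illustration:

```python
# Sketch of memory centering: the difference between the write and read
# addresses (modulo the ring size) shows how full the ring is. Ideally it
# sits at half a revolution, giving equal room to advance or delay.

SIZE = 16                            # must match the ring size in use

def buffer_fill(write_address, read_address):
    """Samples currently held: distance from read to write around the ring."""
    return (write_address - read_address) % SIZE

def centering_error(write_address, read_address):
    """Positive: too full (slow the writing side); negative: too empty."""
    return buffer_fill(write_address, read_address) - SIZE // 2

print(centering_error(12, 4))        # exactly centred: 0
print(centering_error(14, 4))        # 2 samples too full
```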

A slightly different situation exists in computers where the data from a hard drive reaches the disk controller at a steady speed, but the demands on the computer's bus due to other processes mean that the data have to be buffered. Some systems use a silo, which is a kind of shift register where data are put in at the top and ripple down to rest on any previous data. Data are read from the bottom and as a word is removed all the data ripple down.
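A silo of that kind might be modelled in software roughly as follows; Python's deque simply stands in for the ripple-down hardware:

```python
# Illustrative silo (FIFO): words dropped in at the top settle on top of
# earlier data; reads take the oldest word from the bottom.

from collections import deque

class Silo:
    def __init__(self):
        self._words = deque()

    def push(self, word):
        """Word enters at the top and rests on any previous data."""
        self._words.append(word)

    def pop(self):
        """Oldest word leaves from the bottom; the rest 'ripple down'."""
        return self._words.popleft()

    def fill(self):
        return len(self._words)

silo = Silo()
for block in ("A", "B", "C"):
    silo.push(block)
print(silo.pop(), silo.pop())        # A B  (first in, first out)
```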

In the case of a storage medium such as an optical disk or a tape, the physical speed of the medium can be controlled to center the memory. In the case of a CD player, the spindle may be speeded up or slowed down as necessary. Another possibility is to run the disk slightly too fast and to slow down the data by skipping tracks backwards so some data are repeated and discarded instead of new data being played.
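As a hedged illustration, a spindle servo driven from the buffer fill could be sketched like this; the gain and capacity figures are invented:

```python
# Illustrative only: if the memory drifts past half full, slow the disc;
# if it drains, speed it up.

def spindle_correction(fill, capacity, gain=0.1):
    """Return a fractional speed correction from the buffer fill state."""
    centering_error = fill - capacity / 2
    return -gain * centering_error / (capacity / 2)   # positive means speed up

for fill in (100, 128, 180):
    print(f"fill {fill}/256 -> speed trim {spindle_correction(fill, 256):+0.3f}")
```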

In portable and automotive CD players, mechanical shocks could knock the pickup off track and there would be an interruption to the data until the player put the pickup back on the correct track again. With a big enough memory, the interruption due to a shock could be considered a time base error and the audio would continue to play from memory as the transport re-positioned the pickup.

When working in conjunction with a data network or a file server, the state of TBC or silo centering would affect the rate at which new data blocks were requested from the system.
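A rough sketch of that behaviour, with invented watermark values and no real network calls, might look like this:

```python
# Hedged sketch: the silo/TBC fill state decides when to ask the server for
# more data blocks. The watermarks and hysteresis are assumptions.

LOW_WATER = 64          # request more data below this fill (illustrative)
HIGH_WATER = 192        # stop requesting above this fill (illustrative)

def service_buffer(fill, requesting):
    """Return whether block requests should be active, with hysteresis."""
    if fill < LOW_WATER:
        return True
    if fill > HIGH_WATER:
        return False
    return requesting           # between the marks: keep doing what we were doing

requesting = False
for fill in (200, 150, 60, 100, 195):
    requesting = service_buffer(fill, requesting)
    print(fill, "request" if requesting else "idle")
```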

When there is only one user, the user can determine the rate at which data are supplied. However, when there are multiple users, as with digital television broadcasts, this approach cannot work. Instead the broadcaster supplies data at a fixed rate, and the users each have to synthesize a TBC read clock running at that rate. Any error in the frequency of that clock or poor centering will result in the memory over- or under-flowing.
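Some illustrative arithmetic, with assumed figures rather than anything from a broadcast standard, shows how quickly even a small read clock error walks the memory towards underflow:

```python
# Illustrative arithmetic only: how a small frequency error in the receiver's
# synthesized read clock drifts the TBC memory.

packet_rate = 1_000.0          # packets per second arriving from the broadcaster
clock_error_ppm = 50.0         # receiver read clock is 50 ppm fast
buffer_capacity = 500          # packets the memory can hold
fill = buffer_capacity / 2     # start centred

drain_rate = packet_rate * (1 + clock_error_ppm * 1e-6)
net_drift = packet_rate - drain_rate      # packets per second lost from the buffer

seconds_to_underflow = fill / -net_drift
print(f"buffer underflows after about {seconds_to_underflow:.0f} s")
```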

The re-creation of these clocks in digital television and the means for obtaining lip-sync is a topic in itself that will have to wait for another time.

Fig.2 - One approach to time compression requires two memories that are written alternately. When full they are read at a higher clock rate, which opens spaces between the data blocks.

In a digital system, the timing accuracy is determined by the precision of the clock controlling the ADC. If any subsequent time base corrector outputs data with the same accuracy, the timing error of the system, or what in audio used to be called wow and flutter, is substantially zero, and lower than could be achieved with any analog system.

Once time base correction became available, it allowed systems that deliberately introduced time base error. A continuous, unbroken stream of data, such as from an audio ADC, is difficult to record in that form. Practical recordings need entry points at various places to allow editing and random access. Error correction requires the addition of check data.

All real digital recorders carry out time compression on the data to be recorded. Fig.2 shows one way in which it can be done. A memory is written at the natural rate of the incoming data. When it is full, the incoming data are routed to a second memory, while the first is read at a higher rate. This has the effect of telescoping the time needed to read and opens up gaps between the data blocks, into which can be inserted synchronizing patterns, addresses and redundancy.
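A software sketch of Fig.2's scheme, with an illustratively small block size, might look like this; the housekeeping labels stand in for the synchronizing patterns, addresses and redundancy:

```python
# Sketch only: samples are written into two memories alternately at the input
# rate; a full memory is read out at a higher clock rate, leaving gaps between
# the resulting blocks for housekeeping data.

BLOCK = 8                              # samples per memory (illustrative)

def time_compress(samples, housekeeping=("SYNC", "ADDR", "CHECK")):
    """Yield compressed blocks: each data block preceded by housekeeping."""
    memories = [[], []]
    current = 0
    for s in samples:
        memories[current].append(s)
        if len(memories[current]) == BLOCK:      # this memory is full:
            block = memories[current]            # read it at the higher rate
            memories[current] = []
            current ^= 1                         # route writing to the other memory
            yield list(housekeeping) + block     # the gap now carries housekeeping
    # (any partly filled memory is ignored in this sketch)

for block in time_compress(range(16)):
    print(block)
```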

The Betacam format produced by Sony for professional use had an interesting application of time compression. Betacam was an analog component format, and the bandwidth of the two color difference signals is much lower than the luma bandwidth. Recording two different bandwidths on the same tape is not easy to do in an efficient manner. Sony's solution was to use time compression.

The color difference signals were time compressed by a factor of two so that they would both fit in a single track, allowing Betacam to record three signals on only two tracks. A suitable time base corrector sorted it out on replay.
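Purely as an illustration of the idea (real Betacam did this with analog line memories, not software), packing and unpacking one chroma track line might be sketched like this:

```python
# Hedged illustration: two colour difference signals, each time compressed
# 2:1, share one track line. Sample names and counts are invented.

def pack_chroma_line(cb_line, cr_line):
    """One track line carries a compressed Cb half followed by a compressed Cr half."""
    return list(cb_line) + list(cr_line)

def unpack_chroma_line(track_line):
    """The replay TBC splits the line and stretches each half back to full duration."""
    half = len(track_line) // 2
    return track_line[:half], track_line[half:]

cb = ["Cb0", "Cb1", "Cb2"]
cr = ["Cr0", "Cr1", "Cr2"]
packed = pack_chroma_line(cb, cr)
print(packed)                        # one line of the chroma track
print(unpack_chroma_line(packed))    # recovered Cb and Cr
```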

A substantial time compression factor allows many different data streams to be multiplexed into the same channel. A given stream of data is time compressed into packets that carry the same identifying code so they will all be sent to the same destination. The packets also carry a count that increments with each new packet, so a de-multiplexer can re-assemble the original data stream. Telephony has used the idea for years, and MPEG transport streams use the same approach.
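A minimal sketch of such a multiplex, loosely modelled on an MPEG transport stream but with invented field names and sizes, might look like this:

```python
# Sketch only: each packet carries an identifying code (pid) and an
# incrementing count so the demultiplexer can reassemble each stream
# and spot missing packets.

PAYLOAD = 4                                     # bytes per packet (illustrative)

def packetize(pid, data, start_count=0):
    """Cut a stream into packets tagged with its code and a continuity count."""
    packets, count = [], start_count
    for i in range(0, len(data), PAYLOAD):
        packets.append({"pid": pid, "count": count % 16, "payload": data[i:i + PAYLOAD]})
        count += 1                              # 4-bit count wraps, as in MPEG
    return packets

def demultiplex(packets, wanted_pid):
    """Reassemble one stream, checking the continuity count as we go."""
    out, expected = b"", 0
    for p in packets:
        if p["pid"] != wanted_pid:
            continue
        if p["count"] != expected:
            raise ValueError("continuity error: packet lost?")
        out += p["payload"]
        expected = (expected + 1) % 16
    return out

muxed = packetize(0x20, b"AUDIO STREAM ONE") + packetize(0x30, b"VIDEO STREAM TWO")
print(demultiplex(muxed, 0x20))                 # b'AUDIO STREAM ONE'
```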
