Here we dip a toe into spectrum analysis. The water’s warm.
Spectrum analyzers are useful devices found in many walks of life. Originally built with analog technology, today they are almost all digital in operation. That makes no real difference, because a correctly sampled digital waveform contains all the information of the original.
The purpose of a spectrum analyzer is to produce a graph of energy with respect to frequency. There are various ways of doing that. If the spectrum to be considered has stationary statistics, in other words if the signal is sustained, there is plenty of time to look at it and that allows more options than if it is transient.
The obvious way of looking at a spectrum is to use a narrow band filter whose center frequency can be swept across the range of frequencies of interest. The level coming out of the filter is used to draw the graph. However obvious the method, making such a filter is unfortunately somewhere between very difficult and impossible and various alternatives have had to be found.
A similar problem is faced by a radio receiver, which has to select one narrow band of frequencies from the wide range of frequencies used for transmission. The most ingenious solution, which dates back to the work of Fessenden around 1901, was to modify the frequencies to be analyzed so that a fixed filter could be used.
Fessenden found that if two sine waves were multiplied together, the result would be a pair of frequencies, one the sum and one the difference. He called the process heterodyning. Before vacuum tubes, way before transistors, before rock 'n roll, radio transmission consisted of pulsing a carrier wave with Morse code. The problem was how to detect that radio frequency wave. The received signal was heterodyned with a locally generated signal that was sufficiently close in frequency that the lower sideband fell in the audible range. The transmitted carrier could then be heard as a tone.
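Fessenden's observation is easy to verify numerically. The following minimal sketch (Python with NumPy; the sample rate and the two frequencies are arbitrary illustrative choices, not anything from the original apparatus) multiplies two sine waves and inspects the spectrum of the product:

```python
import numpy as np

fs = 1000                      # sample rate in Hz (illustrative choice)
t = np.arange(fs) / fs         # exactly one second of samples
f1, f2 = 100.0, 80.0           # the two frequencies to heterodyne

# Multiply the two sine waves together.
product = np.sin(2 * np.pi * f1 * t) * np.sin(2 * np.pi * f2 * t)

# With one second of samples, FFT bin k corresponds to k Hz.
spectrum = np.abs(np.fft.rfft(product))
peaks = np.flatnonzero(spectrum > 0.1 * spectrum.max())

# Only the difference (100 - 80 = 20 Hz) and the sum
# (100 + 80 = 180 Hz) survive the multiplication.
print(peaks)
```

Neither original frequency appears in the product; heterodyning replaces them entirely with the sum and difference.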
In the next step, as is so often the case, the same invention was made at about the same time in two places. Lucien Levy in France and Edwin Armstrong in the USA described a receiver in which there was a fixed filter, now called an intermediate frequency filter. The signal to be received was heterodyned with a local oscillator whose frequency was adjusted until one of the sidebands fell within the pass band of the fixed filter.
Before Chuck Yeager's dramatic rocket-powered flight, the word "supersonic" meant beyond the range of human hearing, so the receivers of Levy and Armstrong were called supersonic heterodyne sets. This was later contracted to superhet. The superhet technique became universal for radio receivers and is still in use today. Frequencies beyond human hearing became known as ultrasonic, but the term superhet continued in use.
Fig.1 shows how the principle can be used to make a spectrum analyzer. A variable frequency oscillator is provided, and the output of this is heterodyned, or mixed, with the signal to be analyzed before being passed to the fixed filter.
Fig.1 - Instead of using a complex filter whose frequency can be swept, the input signal is heterodyned with the output of an oscillator so that its own spectrum is swept with respect to a fixed filter.
It is important to realize that in broadcasting, mixing has two completely different meanings. In an audio mixer, the process of mixing is actually addition and when two signals are mixed, no new frequencies are intended and indeed every effort is made to prevent new frequencies, because they would be heard as distortion. The other type of mixing is multiplication, which is used in heterodyning deliberately to create new frequencies.
In practice it is not always necessary formally to multiply the two signals. It is often sufficient to feed them both into a circuit that has an element of non-linearity. That non-linearity then causes the two signals to inter-modulate and the sum and difference signals are created in that way.
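This can be checked with a square-law characteristic, a crude stand-in for the "element of non-linearity" described above (a sketch only; the frequencies are the same illustrative values, not a real mixer design):

```python
import numpy as np

fs = 1000
t = np.arange(fs) / fs
f1, f2 = 100.0, 80.0

# Add the two signals, then pass the sum through a square-law
# device -- a crude model of any non-linear circuit element.
mixed = (np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)) ** 2

# The intermodulation products include the difference (20 Hz) and
# sum (180 Hz), alongside a DC term and the second harmonics
# (160 Hz and 200 Hz) that a real mixer would filter away.
spectrum = np.abs(np.fft.rfft(mixed))
peaks = np.flatnonzero(spectrum > 0.1 * spectrum.max())
print(peaks)
```

The sum and difference terms appear without any explicit multiplication; the non-linearity does the multiplying.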
Most electronic devices are fundamentally non-linear and for many purposes have to be linearized by the use of techniques such as negative feedback. Straightforward omission of linearizing techniques will usually produce an acceptable mixer.
If the frequency of the local oscillator is changed, the part of the spectrum of the input that can pass through the fixed filter will also change. Sweeping the oscillator frequency is every bit as good as sweeping the filter frequency, and a whole lot easier to implement.
Multiplication is simply another word for amplitude modulation, where the amplitude of one waveform is controlled by the voltage of another. Sampling, as used in digital audio and video, is also an amplitude modulation process. In that case the analog audio or video signal is used to amplitude modulate a pulse train at the sampling frequency in order to create a pulse amplitude modulated (PAM) signal.
In digital broadcast signals the sampling process is there to allow a time-continuous waveform to be made time discrete so that each sample can be quantized. The resultant sampling spectrum is not so much a goal as a consequence that has to be accepted.
The flashing light of a stroboscope and the shutter of a movie camera also perform sampling processes and can act as a kind of optical spectrum analyzer, either deliberately or inadvertently. Sometimes we construct a stroboscope accidentally by using optical sampling equipment such as film and video cameras, which have fixed picture rates. When something appears before such a camera that contains changes at a similar rate, odd things happen.
Probably the most well known of these is the spoked wagon wheel that rolled by in countless cowboy movies. At an appropriate speed, the spoke passing frequency would equal the frame rate and the wheel would appear to stop. Similar things would happen to the airscrews on light planes and to helicopter blades.
Fig.2 shows what is happening in the frequency domain. The input is the spoke passing frequency and it is being sampled, or heterodyned, by the camera frame rate. The upper side band is created at twice that frequency and the lower side band is created at or near zero Hertz.
Fig.2 - In the stroboscope, the lower sideband of the sampling spectrum is brought to zero Hertz, so that it can pass the low-pass filtering characteristic of human vision and give the illusion of arrested motion.
In this case the analysis filter is the human visual system which cannot pass the upper sideband and which sees only the lower sideband. The human visual system forms a temporal low pass filter, and therefore forms a better analogy to the spectrum analyzer than the superhet receiver, which uses a band pass filter. Another consideration is that the low pass filter is easier to make, since the filtering action is mathematically much the same as averaging.
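The wagon-wheel effect can be demonstrated numerically. In this sketch (the 24 fps frame rate is a real movie standard; the 25 spokes per second is an invented illustrative figure) the spoke motion is sampled only at the instants the shutter opens:

```python
import numpy as np

frame_rate = 24.0   # camera pictures per second
spoke_rate = 25.0   # spokes passing per second (illustrative)

# Sample the spoke motion at the instants the shutter opens.
t = np.arange(48) / frame_rate
sampled = np.cos(2 * np.pi * spoke_rate * t)

# The lower sideband lands at 25 - 24 = 1 Hz: frame by frame, the
# wheel is indistinguishable from one passing 1 spoke per second.
apparent = np.cos(2 * np.pi * 1.0 * t)
print(np.allclose(sampled, apparent))  # True
```

The eye's temporal low-pass filter cannot see the upper sideband, so the slow lower sideband is all that remains, and the wheel appears almost stopped.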
Having looked at the principles of spectral manipulation and various examples, we can now look at the Fourier Transform that operates on these principles.
The Fourier Transform is specifically looking for signal energy at a series of spot frequencies that are integer multiples of the lowest. In other words the frequencies are the lowest multiplied by 1, 2, 3, 4 and so on. One coefficient will be created for each frequency.
Fig.3a) shows how the Fourier Transform works. A waveform having a specific frequency, known as a basis function, is generated and it is multiplied together with the signal to be analyzed. This produces sidebands. If the signal being analyzed contains energy at the specific frequency, the lower sideband will have a frequency of zero Hertz and so will pass through a low-pass filter or averager. Any other components present will be modulated to some non-zero frequency that averages to zero as Fig.3b) shows. The level of the output signal represents the amount of energy at the specific frequency and creates a coefficient for that frequency.
Fig.3 - At a) the product of two equal frequencies will contain a DC component that can pass through an averager. At b) if the frequencies are different there is no DC component.
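The multiply-and-average scheme of Fig.3 can be written directly (a minimal sketch in Python/NumPy; the 50 Hz and 60 Hz values are arbitrary illustrative frequencies):

```python
import numpy as np

fs = 1000
t = np.arange(fs) / fs
signal = np.cos(2 * np.pi * 50 * t)   # energy at 50 Hz only

def coefficient(sig, f):
    # Multiply by a basis function at frequency f and average.
    # A matching component leaves a DC term; any other component
    # averages to zero over a whole number of cycles.
    return np.mean(sig * np.cos(2 * np.pi * f * t))

print(coefficient(signal, 50))   # about 0.5: energy found at 50 Hz
print(coefficient(signal, 60))   # about 0: nothing at 60 Hz
```

The averager here is a plain arithmetic mean, which is exactly the easy-to-make low-pass filter mentioned above.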
In practice this simple scheme will only work if all of the signals concerned are in phase. In real life this doesn't happen. If a multiplication is attempted between two identical frequencies that happen to have a 90 degree phase relationship, the result will be zero and the transform will not see the frequency it is looking for.
Accordingly, the Fourier Transform has to be arranged so that it can see signals of any phase. This is done by analyzing every frequency twice: once with a sine wave and once with a cosine wave, resulting in two coefficients for each frequency. Fig.4a) shows an input signal of some arbitrary phase. It can be resolved vectorially as the sum of a sine wave and a cosine wave.
Fig.4 - At a) a signal of any phase can be created by adding together sine and cosine waves in varying proportions. At b) the mechanical analog computer used in three cylinder steam locomotives, which added the motion of the two valve gears to produce motion at 60 degrees and inverted it to obtain 120 degrees phase.
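The two-coefficient trick can be checked with a deliberately phase-shifted input (a sketch with invented values; the 1.2 radian phase is arbitrary):

```python
import numpy as np

fs = 1000
t = np.arange(fs) / fs
phase = 1.2   # arbitrary phase a single basis function would mis-read
signal = np.cos(2 * np.pi * 50 * t + phase)

# Analyze 50 Hz twice: once against a cosine, once against a sine.
c = np.mean(signal * np.cos(2 * np.pi * 50 * t))
s = np.mean(signal * np.sin(2 * np.pi * 50 * t))

# Either coefficient alone varies with the input phase, but taken
# together they always recover the true amplitude.
amplitude = 2 * np.hypot(c, s)
print(round(amplitude, 6))  # 1.0, whatever the phase
```

Change the phase to any value, including the troublesome 90 degrees, and the recovered amplitude is unchanged.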
Vectorial summation is hardly new, and was used extensively in composite video. Long before that, Fig.4b) shows that it was used in steam locomotives. Nigel Gresley's Mallard, which still holds the world speed record for steam, had three cylinders phased around the crank every 120 degrees, but it only had two sets of valve gear. The valves for the inside cylinder were controlled by vector summation of the outside valve motion.
In an ingenious mechanical analog computer devised by Harold Holcroft, the +/- 60 degree valve drives for the outside cylinders were summed and divided by two to produce valve drive at 0 degrees phase, and then inverted, to produce valve drive at 180 degrees. The system was adopted world wide.
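The vector arithmetic behind Holcroft's linkage can be checked numerically. This sketch treats each valve drive as an idealized cosine of crank angle (a simplification; real valve events are not pure sinusoids):

```python
import numpy as np

theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)

# Idealized valve drives for the outside cylinders,
# at +60 and -60 degrees of crank angle.
a = np.cos(theta + np.pi / 3)
b = np.cos(theta - np.pi / 3)

# Their vector sum is exactly in phase with 0 degrees...
assert np.allclose(a + b, np.cos(theta))
# ...and inverting it yields the 180-degree drive.
assert np.allclose(-(a + b), np.cos(theta + np.pi))
```

This is the same vector summation of sine and cosine components that the Fourier Transform relies on, executed in steel rather than silicon.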