# Loudspeaker Technology Part 5: The Aperture Effect

Leedh E2 Glass speaker system, marketed as “The First Truly Holographic Loudspeaker”. Price: €16,000 per pair.

John Watkinson continues his multi-part tutorial series on loudspeaker design. In this article, he looks at the aperture effect and why it represents a challenge to loudspeaker designers.

The *Aperture Effect* is one of those phenomena that crop up in practically every aspect of technology, and loudspeakers are no exception. In most technologies there is the concept of an ideal device, and in many cases the reason such an ideal device cannot be realised is the existence of aperture effects. In most technologies the results of the aperture effect are understood and adequate compensation is applied, so the overall result is satisfactory. There is much evidence that in loudspeakers the aperture effect is not sufficiently recognised or understood, and architectures are used that are incapable of compensation. The problem is swept under the carpet, with the result that the psycho-acoustic criteria for accurate reproduction simply are not met. This is one of the key reasons that very few loudspeakers sound at all like the original sound, or indeed like one another.

In short, an aperture effect will occur any time a mathematically ideal system requiring an infinitely small point has to be realised with actual hardware. On analog audio disks, the ideal stylus has zero length, so it can follow the finest detail of the groove. Unfortunately the contact pressure would be infinite, so a stylus of finite size must be used instead, which results in an aperture effect.

Genelec 1234 SAM. Makers of professional monitors are experts at the physics behind audio reproduction.

The gap in an analog tape replay head ought to be zero but it isn’t. Sampling theory, as explained by Shannon and others, relies upon the samples being taken and reproduced over an infinitely short time or space, in which case the output and input waveforms become identical. In image sampling, this requires a photosite on the sensor to be infinitely small, which contradicts the requirement to gather light efficiently. The latter requirement has to win out, so the sampling becomes non-ideal. But in what way?

The answer can be approached in the frequency domain or in the time/spatial domain and the result must be the same. An aperture effect is a kind of filter, and it is well known that the spectrum at the output of a filter is the product of the input spectrum and the filter response. When infinitely short samples are replaced by finite rectangular pulses, the spectrum is modified by the spectrum of the pulses. A rectangular impulse response is the characteristic of the moving-average filter, used in graphics to simulate loss of focus. Rectangular pulses have a sin(x)/x spectrum, which has periodic nulls; the same mathematics explains why a square wave has no even harmonics. In the case of a CCD camera chip, the light-gathering area must be maximal, which means the rectangle practically fills the space between the samples. This puts the first null at the sampling frequency.
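As a rough sketch of that frequency-domain view: the magnitude response of a rectangular aperture of width w seconds is |sin(πfw)/(πfw)|, with its first null at f = 1/w. The short Python below (my own illustration, not from the article, assuming a 48 kHz sample rate and a 100% aperture) evaluates it:

```python
import math

def aperture_loss(width_s, f_hz):
    """Magnitude response of a rectangular aperture of width_s seconds
    at f_hz: |sin(pi*f*w)/(pi*f*w)|, i.e. |sinc(f*w)|. Null at f = 1/w."""
    x = math.pi * f_hz * width_s
    if x == 0.0:
        return 1.0  # the limit as f -> 0
    return abs(math.sin(x) / x)

fs = 48_000.0
w = 1.0 / fs  # a 100% aperture: the pulse fills the whole sample period
for f in (0.0, 12_000.0, 24_000.0, 48_000.0):
    print(f"{f:8.0f} Hz -> {aperture_loss(w, f):.3f}")
# ->      0 Hz -> 1.000
#     12000 Hz -> 0.900
#     24000 Hz -> 0.637
#     48000 Hz -> 0.000
```

Note the classic full-aperture figures: about 0.64 (roughly -4 dB) at half the sampling rate and a complete null at the sampling frequency itself.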

In the time/spatial domain, the interaction of the impulse response of a filter and the input signal is obtained by *convolution*.

In convolution, the input signal is swept through the filter impulse response and the instantaneous output is proportional to the area of overlap. Figure 1a shows a low-frequency sine wave convolved with a rectangular impulse. The area of overlap is practically 100% of the signal, so there is no reduction of output. However, consider Figure 1b, in which the frequency has risen. Now the area of overlap is smaller and the response falls. Self-evidently, the broader the impulse, the steeper the filter curve.

The principle of convolution. At a, the area of overlap (shaded) is essentially proportional to the input waveform and there is no loss. At b, the wavelength is shorter and the area of overlap is reduced. Click to enlarge.
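The falling overlap can be demonstrated numerically. This sketch (my own illustration, with an arbitrary 1 ms aperture on a fine simulation grid) convolves sine waves with a unit-area rectangle and reports the surviving amplitude:

```python
import numpy as np

fs = 1_000_000                  # fine simulation grid, 1 MHz
aperture_s = 0.001              # 1 ms rectangular impulse response
n_ap = int(aperture_s * fs)
h = np.ones(n_ap) / n_ap        # unit-area rectangle: a moving average

def output_amplitude(f_hz):
    """Peak steady-state output when a unit sine of f_hz is convolved
    with the rectangular impulse -- the 'area of overlap' of Figure 1."""
    t = np.arange(int(0.02 * fs)) / fs              # 20 ms of signal
    y = np.convolve(np.sin(2 * np.pi * f_hz * t), h, mode="valid")
    return np.abs(y).max()

for f in (100, 500, 1000):
    print(f"{f:5d} Hz -> {output_amplitude(f):.3f}")
```

The 100 Hz tone passes almost unchanged, 500 Hz loses about a third of its amplitude (2/π ≈ 0.637), and 1 kHz, whose period exactly matches the aperture, is nulled.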

**Speaker diaphragm size**

What has this to do with loudspeakers? Well, the effects described above are to do with wave length, so it should be unsurprising to find the same thing happening with sound. Figure 2a shows a conceptual loudspeaker that has a diaphragm of zero diameter. It will radiate the same sound in all directions and so is ideal. Unfortunately, a zero area diaphragm cannot generate any volume velocity, so although the theory is good, practical loudspeakers have to use a diaphragm of sufficient area to obtain the required sound level.

a) The zero diameter diaphragm radiates ideally, but cannot produce any sound pressure. b) The finite diameter diaphragm works fine straight ahead. c) Off axis, the finite diaphragm causes an aperture effect, where the wavelets arrive over a finite span of time. Click to enlarge.

It was known to Huygens and Newton that all radiating bodies can be modelled by an array of point sources. In Figure 2b, the diaphragm is radiating and the radiation is analysed directly ahead. Because radiation from all of the points on the diaphragm has travelled substantially the same distance, the waveforms can be added and the result is substantially the same as the waveform feeding the drive unit.

However, consider Figure 2c. Here the off-axis result is considered. As the distances from the points on the diaphragm to the test point are now all different, the waveforms from each point arrive spread over a rectangular time window. In other words, they have been convolved with a rectangular impulse response, which implements a moving-average filter.

What happens depends totally on the wavelength. At the lowest frequency we can hear, the wavelength is far larger than any practical diaphragm or speaker enclosure, whereas at the highest frequency we can hear, the reverse is true. At low frequencies the speaker is acoustically almost not there, because the size of the diaphragm compared to the wavelength, which is essentially the aperture ratio, is so small. The situation of Figure 1a is obtained: the response doesn’t change with direction.
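Putting the pieces together: the off-axis time window of Figure 2c has a width of d·sin(θ)/c, where d is the diaphragm span, θ the off-axis angle and c the speed of sound, so the relative level follows the same sin(x)/x law as before. A sketch under those assumptions, using a line-source model of a hypothetical 200 mm cone:

```python
import math

C = 343.0  # speed of sound in air, m/s (approximate)

def off_axis_loss(d_m, f_hz, angle_deg):
    """Relative level from a diaphragm of span d_m metres, modelled as a
    line of point sources whose arrivals at an off-axis listener are
    spread over a rectangular window of d*sin(angle)/c seconds -- the
    aperture effect of Figure 2c.  Returns the |sinc| of that window."""
    window_s = d_m * math.sin(math.radians(angle_deg)) / C
    x = math.pi * f_hz * window_s
    return 1.0 if x == 0.0 else abs(math.sin(x) / x)

# A 200 mm diaphragm, heard 60 degrees off axis:
for f in (100.0, 1_000.0, 10_000.0):
    print(f"{f:7.0f} Hz: {off_axis_loss(0.2, f, 60.0):.3f}")
```

At 100 Hz the loss is negligible; by 10 kHz the window spans several wavelengths and almost nothing survives, which is exactly the wavelength dependence the paragraph above describes.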

All woofers are essentially omnidirectional at low frequencies and it doesn’t matter if we put the drive unit on the front, the sides or underneath. Nor does it matter if we use more than one woofer, as long as the diaphragms move in and out together. But as the wavelength gets shorter and the analysis point moves further off axis, the rectangular impulse approaches and then exceeds the wavelength, and there is serious loss.

All speakers are directional to some extent. Typical speaker system frequency response versus axis. Top curve: 45 degrees off-axis response. Middle curve: 60 degrees off-axis response. Bottom curve: 75 degrees off-axis response. Click to enlarge.

The frequency response on-axis may be ideal, but the further off-axis we go, the more high frequencies are lost and the response becomes nothing like the original sound. This is why legacy loudspeakers have a sweet spot and why listening to a loudspeaker from the next room is such a powerful test. Clearly the sound coming through the doorway is reverberant, excited mostly by off-axis energy from the speaker, but in the case of a legacy speaker, the off-axis frequency response will be dreadful. A good friend of mine uses the test to save time at audio shows. He listens through the open door and if the legacy sonic footprint is there, he doesn’t bother going in.

**Speaker directivity**

On a CCD chip, the aperture ratio is constant and can be equalised. In a loudspeaker, the aperture ratio is proportional to frequency multiplied by the sine of the off-axis angle. This directivity problem cannot be equalised, because nothing done to the waveform feeding a drive unit can be directionally selective. If you think all loudspeaker problems can be fixed with DSP, think again.

Directivity pattern, shown here for a Soundlabs Visaton R-10-S 4 Ohm speaker. Click to enlarge.
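Why equalisation fails can be seen by reusing the sin(x)/x aperture model sketched earlier (again my own illustration, with an assumed 100 mm diaphragm): an EQ boost chosen to flatten one angle is necessarily applied to every other angle as well.

```python
import math

C = 343.0  # speed of sound in air, m/s (approximate)

def loss(d_m, f_hz, angle_deg):
    """Aperture loss of a diaphragm of span d_m at f_hz, angle_deg off axis."""
    x = math.pi * f_hz * d_m * math.sin(math.radians(angle_deg)) / C
    return 1.0 if x == 0.0 else abs(math.sin(x) / x)

# One EQ curve feeds the driver for every direction at once.
# Boost 8 kHz by exactly the amount needed to flatten the 30-degree response:
d, f = 0.1, 8_000.0
eq_gain = 1.0 / loss(d, f, 30.0)
for angle in (0.0, 30.0, 60.0):
    print(f"{angle:4.0f} deg: {eq_gain * loss(d, f, angle):.2f}")
```

The 30-degree response is now flat, but on axis the same boost produces a gross peak (over 17 dB in this example) and 60 degrees remains heavily attenuated: the sin(θ) term is outside the reach of any processing applied to the drive waveform.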

We have a fundamental problem because our diaphragm has to be large to have enough volume velocity at LF and small to have wide dispersion at the same time. The traditional solution is to employ a series of drive units whose physical size diminishes as the frequency rises, hence woofers, mid-range units and tweeters. What then happens is that the directivity begins at LF being omnidirectional and as frequency rises the radiation pattern narrows up to a crossover frequency, where it typically widens again. In many directions, there will be an upward step in the frequency response at the crossover frequency.

In a typical two-way bookcase speaker, the tweeter will be a small dome driver, which is incapable of sufficient volume velocity to allow a low crossover frequency and which has problems of its own. The woofer is forced to operate at frequencies where it suffers beaming. Most of these speakers have a characteristic sound as if the input spectrum was cut in half and never put back together again. Some of the problem is due to directivity and some is due to the use of crossovers that simply don’t work well enough. The defects of crossovers and domes will have to wait for another time.

The physics of directivity explains why claims made in the hi-fi world that extraordinary bandwidth is needed are fallacious. Real sound sources, microphones and loudspeakers all have directivity functions determined in the same way. If we accept that 20kHz is an adequate audio bandwidth, there is some chance of sound radiated by a real source heading toward the microphone, some chance of the microphone picking it up within its directivity function, and some chance of our ears being inside the directivity function of the speaker. Let us take that combined probability as unity.

If instead we believe 40kHz bandwidth is needed, then the directivity function of all three elements will be halved in angle and the area of the function will be one quarter of the former value. Thus the probability of hearing 40kHz with the same system is ¼ x ¼ x ¼ as high, or 1/64.
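The arithmetic of that halving is trivial but worth making explicit (the area scaling with the square of the angle is the assumption here):

```python
# Moving from 20 kHz to 40 kHz halves the directivity angle of each
# element, and area scales as the square of the angle, so each of the
# three elements (source, microphone, loudspeaker) contributes a
# quartering of the probability.
per_element = 0.5 ** 2          # one element alone: 1/4
chain = per_element ** 3        # all three in series: 1/64
print(per_element, chain)       # -> 0.25 0.015625
```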

So even if the HAS (Human Audio System) could hear 40kHz, the chances of it being received at the ear are not good. In practice no one can hear 40kHz, and the best way of dealing with such suggestions is omnidirectional laughter.
