How Latency Affects Computer Audio Editing Systems

Latency is the millisecond delay between the time an audio signal enters an electronic system and the time it emerges. Though a simple concept to understand, latency can play havoc with a range of audio applications, including well-known issues in live broadcasting.

Latency can be caused by many factors, including both analog-to-digital and digital-to-analog conversion, buffering, digital signal processing, transmission time and the audio speed in the transmission medium.

This delay can be a critical performance consideration in several pro audio applications, including recording interfaces connected to personal computers, sound reinforcement, in-ear monitoring systems and live broadcasting. For example, that annoying delay between a questioner in the studio and a reporter in the field on a live TV news remote is one kind of latency.

In this article, however, we will focus on latency issues in basic audio editing systems using a personal computer. Latency often arises between an audio interface and the computer itself. The goal is to reduce the delay to times too short for the human ear to detect. Since delays of around 6 ms become audible, latency needs to stay below this threshold.

We can never get rid of all latency. Despite very fast computer processing, users can still experience a few milliseconds of latency as the audio travels from the interface, through the computer, and back to the interface for playback. The sound must pass through multiple processors and cables before reaching the listener's ears.

The good news is that with the latest editing systems, most latency is unnoticeable to users. If it remains consistent, it can be corrected in post-production. Ever faster computers and speedy connectivity have helped here.

Thunderbolt Connection Cable

There is a grab bag of tricks to tackle latency. Super-fast Thunderbolt connectivity between devices has helped, and adjusting buffer sizes can also reduce latency. Direct monitoring can help as well: the input signal is routed straight to the headphones, bypassing the computer and greatly reducing the delay.

Other causes of latency can be found within the processing of the audio — either in the converters or in plugins. One way to reduce latency is to decrease the number of processors and plugins running simultaneously in the system. This can be done either by waiting until mixdown to apply them or by rendering effect-laden tracks to audio so the real-time processing load is reduced.

As to buffer sizes, computers at times process large amounts of simultaneous data. This can cause a problem for audio, which requires a constant stream of samples due to sound being a continuous waveform.

Buffer size is dependent on factors such as how many plugins are loaded on a track, and the computer’s processing power. If the buffer size is too low, users may encounter errors during playback or may hear clicks and pops. If the buffer size is set too high, a lot of latency can be heard, making the process frustrating.
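The buffer-size trade-off described above comes down to simple arithmetic: one buffer's worth of samples must be collected before it can be processed, so the delay it adds is the buffer size divided by the sample rate. The sketch below illustrates this relationship; the function name is ours, and real-world round-trip latency also includes converter and driver overhead, so treat these figures as a lower bound.

```python
def buffer_latency_ms(buffer_size: int, sample_rate: int) -> float:
    """Time (in ms) needed to fill one buffer of audio samples."""
    return buffer_size / sample_rate * 1000.0

# Typical buffer sizes at a 48 kHz session sample rate.
for buf in (64, 128, 256, 512, 1024):
    print(f"{buf:5d} samples @ 48 kHz -> {buffer_latency_ms(buf, 48_000):6.2f} ms")
```

This is why dropping the buffer from 1024 to 128 samples takes the added delay from roughly 21 ms down to under 3 ms — but leaves the computer far less time to keep up, which is where the clicks and pops come from.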

UA Apollo X4 Audio Interface

When introducing additional audio to an editing session, a larger buffer size might be needed to accurately record the signal. This increase in buffer size will allow more time for the audio to be captured without distortion.

It is important for users to find the appropriate buffer size for the session, as this can vary with the number of tracks, plugins or audio files. Sample rate — the number of audio samples captured per second — can also be used to increase or decrease latency. Most audio pros set the sample rate at 48 kHz, with the bit depth at 24.
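Sample rate influences latency because a buffer of a fixed number of samples empties faster at higher rates. A short sketch (our own illustration, not a specific DAW's behavior) shows the same 256-sample buffer at common session rates — doubling the rate halves the buffer delay, at the cost of more processing work per second of audio.

```python
# Delay added by one 256-sample buffer at common session sample rates.
BUFFER_SAMPLES = 256

for rate in (44_100, 48_000, 96_000):
    delay_ms = BUFFER_SAMPLES / rate * 1000.0
    print(f"{BUFFER_SAMPLES} samples @ {rate:6d} Hz -> {delay_ms:.2f} ms")
```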

Experimenting with both buffer size and sample rate is an effective way to deal with latency. When recording and editing audio, these settings also affect the sonic quality of your audio, so striking a balance between low latency and sound quality is important.

The good news is that audio interfaces have improved dramatically in recent years. Most now come with plugins and drivers optimized for lower buffer sizes with greater stability in larger sessions. A good, brand-name audio interface can dramatically reduce latency problems.
