The Truth About Latency In Digital Audio

If you have ever worked with digital audio, it is very likely that you have heard the word latency more than a couple of times. But there is so much mystery around it that it seems important to tell the actual truth about what it is and how it may affect your work as an audio professional or musician.

What Is Latency?

Latency is the delay between the moment a sound is produced and the moment it is actually heard by the listener.


In digital audio, this delay accumulates at each stage of the signal chain and can cause all sorts of issues.

Where Is Latency Produced?

Latency is everywhere. It is first produced in the acoustic world by the speed of sound in the air: if you are far from a sound source, the latency may be noticeable, especially if you have visual cues (light travels much faster than sound).

In digital audio, any Analog/Digital or Digital/Analog conversion adds latency. That’s one of the reasons why you usually avoid daisy-chaining too many digital devices through analog connections. Once the signal is digital, keep it digital!

Another common source of latency is the plug-ins (digital effects) that you may use to process the sound. Some algorithms are inherently latent (look-ahead compressors, linear-phase EQs…), so they add extra delay to the incoming signal.
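To see why look-ahead implies latency, here is a minimal sketch (Python with NumPy; the function name and window size are purely illustrative, not any real plug-in's code): an algorithm that needs to inspect future samples cannot emit its output until those samples have arrived, so it is late by exactly its look-ahead window.

```python
import numpy as np

def lookahead_peak(x, lookahead=64):
    """Toy look-ahead peak detector (a sketch, not a real limiter).

    Output sample n depends on input samples n .. n + lookahead, so in a
    real-time stream sample n can only be emitted once sample
    n + lookahead has arrived: the algorithm is `lookahead` samples late
    by construction.
    """
    # Pad the end so every position has a full look-ahead window.
    padded = np.concatenate([np.abs(x), np.zeros(lookahead)])
    return np.array([padded[n:n + lookahead + 1].max()
                     for n in range(len(x))])
```

This is why such plug-ins report their latency to the host: the delay cannot be avoided, only compensated for (more on that below).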

Can Latency Be A Problem?

Yes. Although latency is everywhere and our brain is used to handling it, there are a couple of situations where it can be a real problem.

When Tracking/Recording

If you are monitoring yourself through digital equipment and effects while playing an instrument, hearing yourself with latency can make it difficult to play, especially on instruments with lots of transients such as drums.

Speaking of drums, you can check this article on how to use plug-ins with drums.

When Playing with Other Musicians

It is hard to keep the beat and play together if there is too much latency between the musicians, as they will all keep adapting their playing to what they hear from others.

With too much latency, in the best case you will usually end up gradually slowing down the tempo, and in the worst case it is just not possible to play at all!

When Mixing Sound Sources Together

Mixing a sound with a delayed version of itself produces comb filtering (completely removing some frequencies), so you usually want to avoid doing that:

Comb Filter produced by summing the same signal with a delayed version of itself
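A quick way to see the effect numerically (a Python sketch, independent of any particular DAW): summing a signal with a copy of itself delayed by d samples is the filter y[n] = x[n] + x[n-d], whose nulls fall at odd multiples of fs/(2d). With d = 1 the single null lands at the Nyquist frequency, which is the high-frequency damping mentioned in the next section.

```python
import numpy as np

fs = 48000   # sample rate (Hz)
d = 48       # delay in samples (1 ms at 48 kHz)

# Magnitude response of y[n] = x[n] + x[n - d]:
# |H(f)| = |1 + exp(-2j*pi*f*d/fs)| = 2*|cos(pi*f*d/fs)|
f = np.linspace(0.0, fs / 2.0, 1000)
mag = np.abs(1.0 + np.exp(-2j * np.pi * f * d / fs))

# Fully cancelled frequencies (the comb's "teeth") sit at odd
# multiples of fs / (2 * d):
nulls = [(2 * k + 1) * fs / (2 * d) for k in range(4)]
print(nulls)  # [500.0, 1500.0, 2500.0, 3500.0] Hz for this delay
```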

That’s why multi-microphone takes are quite difficult to set up: you want to make sure that all mics capture the main sources you are recording with the same latency, or those sources will get comb filtered.

What Is Latency Compensation?

When applying plug-ins that produce latency, you want to avoid mixing the delayed signal with a non-delayed copy of itself, to avoid the comb filtering explained above.

So in most (if not all) Digital Audio Workstations (DAWs), the latency produced by plug-ins can be compensated for by the software (a feature known as PDC, “Plug-in Delay Compensation”). When a plug-in adds latency on a track, the DAW delays all the other tracks by the same amount, so that they all stay in sync. It is not magical though: the latency is not removed, it is actually added to ALL signals to keep them in sync.
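Here is a minimal sketch of that mechanism (Python with NumPy; the track contents and latency figures are hypothetical, and real DAWs do this far more efficiently): the host finds the worst plug-in latency among the tracks and pads every other track so all of them end up with the same total delay.

```python
import numpy as np

def compensate(tracks, latencies):
    """Naive plug-in delay compensation, all values in samples.

    Each track is assumed to already be `latencies[i]` samples late
    because of its plug-ins; we pad the others so that every track
    ends up delayed by the same (worst) amount.
    """
    worst = max(latencies)
    return [np.concatenate([np.zeros(worst - lat), sig])
            for sig, lat in zip(tracks, latencies)]

# Hypothetical session: track B's plug-in reports 64 samples of latency.
a, b = np.random.randn(256), np.random.randn(256)
aligned_a, aligned_b = compensate([a, b], latencies=[0, 64])
```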

Latency compensation is definitely necessary, even with the shortest latencies, as a single-sample delay between two identical tracks will dampen the higher frequencies. You could however cope without compensation if the latency is short enough and there is no chance that coherent audio signals get mixed together with some latency anywhere in the signal chain.

The Physical Truth

So what’s a “good” or “bad” latency? Let’s see what physics says!

Sound travels in air at a speed of around 340 meters per second, which means that every meter between you and a sound source adds around 3 milliseconds of latency. This makes it pretty easy to compute the equivalent distance for any latency:

distance (m) ~= latency (ms) / 3

In other terms, here are a few references:

3 ms <-> 1 meter
10 ms <-> 3 meters
20 ms <-> 7 meters
30 ms <-> 10 meters
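If you want to compute this for arbitrary values, here is a one-liner sketch (Python; the function name is just for illustration, with the speed of sound taken as 340 m/s as above):

```python
def latency_to_distance(latency_ms, speed_of_sound=340.0):
    """Equivalent acoustic distance (meters) for a given latency (ms)."""
    return speed_of_sound * latency_ms / 1000.0

print(latency_to_distance(10))  # ~3.4 meters
```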

So this means that with a roundtrip latency of 10 ms, it is as if you were playing 3 meters away from your speakers, which usually does not matter much!

When playing with headphones though, your brain expects the sound source to be very close, so the latency may be more noticeable than with speakers (and there is no room reverberation to hide it, either).

And of course, a drummer does not expect to hear cymbals from 10 meters away, so the truth is not the same for everyone!

The Musical Truth

And what about musical time? How does latency affect a performance?

At 120 BPM, a quarter note is 500 ms long (two beats/quarter notes per second). So we have the following equivalences:

120 BPM
1/4 note <-> 500 ms
1/8 note <-> 250 ms
1/16 note <-> 125 ms
1/32 note <-> 62.5 ms
1/64 note <-> 31.25 ms
1/128 note <-> 15.625 ms
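Here is a small helper to reproduce these numbers for any tempo (Python; the function name is just for illustration, and it assumes the quarter note gets the beat):

```python
def note_ms(bpm, division):
    """Duration in ms of a 1/`division` note, assuming the quarter
    note gets the beat (so division=4 is exactly one beat)."""
    beat_ms = 60000.0 / bpm        # one quarter note, in ms
    return beat_ms * 4.0 / division

for div in (4, 8, 16, 32, 64, 128):
    print(f"1/{div} note at 120 BPM: {note_ms(120, div):g} ms")
```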

As you can see, on a musical time scale, latencies can be fairly large before they disturb the performance too much.

Conclusion

Latency is everywhere, even in the analog world, and our brain is used to dealing with it. Within a DAW, even the shortest latency has to be compensated to avoid artefacts and bad interactions between tracks.

For a musician monitoring themselves through digital equipment and effects, the latency can be greater without any problem, as in most cases they are used to having their ears some distance away from the instrument or amplifier. But it may vary: every human being is more or less sensitive, and some instruments (such as percussion) will also be more difficult to play with higher latencies.

When playing with other musicians (for example in a very large room or through a remote network connection), larger latencies (up to 50-60 ms) are still bearable, unless you play sixteenth notes at 400 BPM!

Tell us what you think on the forum!
