Why do digital systems get audio latency?

Discussion in 'Off-Topic' started by mooseboi, Jul 2, 2014.

  1. mooseboi

    Thread Starter New Member

    Jul 2, 2014
    0
    0
Why do digital systems produce audio latency while analogue systems like valve guitar amps don't?

    What electrical components cause this latency?
     
  2. Alec_t

    AAC Fanatic!

    Sep 17, 2013
    5,773
    1,103
    What digital systems are you talking about?
     
  3. alfacliff

    Well-Known Member

    Dec 13, 2013
    2,449
    428
    Because the analog-to-digital and digital-to-analog conversion processes both take time.
     
  4. Papabravo

    Expert

    Feb 24, 2006
    10,136
    1,786
    What makes you think that analog audio systems like "valve guitar amps" have no latency? They most certainly do have latency.
     
  5. alfacliff

    Well-Known Member

    Dec 13, 2013
    2,449
    428
    Analog audio systems have latency that's much shorter than digital. In "valve" audio systems, the latency is just the electron transit time and the time signals take to propagate through the circuit. In digital systems there is the A-to-D conversion time, the delay through each chip, the processor cycle time, the time the program takes to execute, and then the digital-to-analog conversion time. In digital two-way radio systems, if you get two radios too close together, you can hear the echo effect of the latency when one is transmitting, like the old echo effects made with a reel-to-reel tape recorder with separate record and playback heads.
     
  6. alfacliff

    Well-Known Member

    Dec 13, 2013
    2,449
    428
    One way that quarter-eating arcade video games got around their form of latency was to use multiple CPUs for different parts of the program, such as one for generating sounds and one for reading switch inputs. That way interrupts would not slow down the gameplay.
     
  7. NorthGuy

    Active Member

    Jun 28, 2014
    603
    121
    A software filter (such as an FIR) must receive the input, process it, then output the result. It cannot produce any output until it has received enough input samples, which introduces a small delay, but it is hardly noticeable.
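    A quick sketch of that FIR delay (the tap count and sample rate here are illustrative values, not from the post): a symmetric N-tap FIR can't respond until its delay line fills, and its group delay works out to (N − 1)/2 samples.

```python
# Sketch: the inherent delay of a linear-phase FIR filter.
# A symmetric N-tap FIR cannot respond until its taps are filled;
# its group delay is (N - 1) / 2 samples.

def fir_group_delay_ms(num_taps, sample_rate_hz):
    """Group delay of a symmetric (linear-phase) FIR filter, in milliseconds."""
    delay_samples = (num_taps - 1) / 2
    return 1000.0 * delay_samples / sample_rate_hz

# Example: a 101-tap filter at CD rate delays the audio by ~1.13 ms.
print(round(fir_group_delay_ms(101, 44100), 2))  # 1.13
```

    So even a fairly long software filter adds only low single-digit milliseconds at audio rates, which matches the "hardly noticeable" claim above.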
     
  8. Papabravo

    Expert

    Feb 24, 2006
    10,136
    1,786
    "Much shorter than digital" -- but at what clock rate? It is my contention that digital systems at sufficiently high clock rates will be indistinguishable from analog systems.
     
  9. alfacliff

    Well-Known Member

    Dec 13, 2013
    2,449
    428
    Current state-of-the-art Motorola radios have a definitely noticeable latency. I am not sure what the clock rate is in them, but they work.
    I have never noticed any latency in an analog audio system; anything measurable should be in the millisecond range or less. Electrons move pretty quickly through systems, but converting analog to digital, processing, and converting back to analog does take more time.
     
  10. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    Isn't that caused by packet transmission, repeating packets, etc.?

    That's a different thing from an analogue/digital comparison, and re-transmitting packets can get very slow depending on comms quality.
     
  11. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,907
    2,163
    I would qualify that to say 'indistinguishable' by the human ear. Jitter, quantization errors, and aliasing can all be reduced to levels undetectable by the human ear.
    I have a collection of the now-obsolete DVD-Audio discs that used 96kHz/24-bit encoding for 5.1 surround for the home and car. In some ways they were too good, as you could hear every little misstep in the music.

    http://forum.benchmarkmedia.com/dis...nique-evils-digital-audio-and-how-defeat-them
    https://www.meridian-audio.com/w_paper/mlp_jap_aes9_1.PDF
     
    Last edited: Jul 3, 2014
  12. Papabravo

    Expert

    Feb 24, 2006
    10,136
    1,786
    Construct an experiment to measure latency or delay so that we have an input signal to a black box, and an output signal from a black box. If we look at the result of that experiment in terms of time delay measured in seconds and cannot identify the contents of the black box then we have removed the human ear from the question entirely. It is still my contention that you would not be able to distinguish an analog system from a digital system.

    Let the OP specify an objective standard where such an identification would be possible.
     
  13. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,907
    2,163
    In isolation, the latency of a single analog channel due to digital processing might not be a problem. But when you have several sources that all contain common signals at some level -- like when mixing several microphones in a studio with different digital effects on each microphone -- accounting for DSP delays is important, because mismatched delays cause phase shift and comb filtering between channels that the ear is really sensitive to. Most pro-level digital systems automatically adjust and equalize delays to prevent this when effect converters/plugins/mixers are switched in and out.
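    The comb-filtering effect is easy to show numerically (sample rate and delay below are example values, not from the post): summing a signal with a copy of itself delayed by d samples puts a complete null at fs / (2·d) and its odd multiples.

```python
# Sketch: comb filtering from mixing a signal with a delayed copy of itself.
# The mix y[n] = x[n] + x[n - d] has frequency response H(f) = 1 + e^{-j*2*pi*f*d/fs},
# with nulls at odd multiples of fs / (2*d).
import math

fs = 48000          # sample rate (example value)
delay_samples = 24  # a 0.5 ms DSP delay on one of two mixed channels

def comb_gain(f):
    """Magnitude of H(f) for the two-channel mix above."""
    phase = 2 * math.pi * f * delay_samples / fs
    return abs(complex(1 + math.cos(phase), -math.sin(phase)))

first_null = fs / (2 * delay_samples)      # 1000 Hz for these values
print(round(comb_gain(first_null), 6))     # 0.0 -> complete cancellation
print(round(comb_gain(0), 6))              # 2.0 -> in-phase doubling
```

    A mere half-millisecond mismatch between channels carrying the same source carves deep notches right through the midrange, which is why pro gear equalizes plugin delays automatically.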

    Another thing that can identify digital systems is how they clip on overload. A pure digital system has a clipping signature that is unmistakable.

    http://productionadvice.co.uk/clipping/
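    A minimal sketch of that clipping signature (the tanh soft-clip here is just a generic stand-in for gradual analog saturation, not any particular circuit): digital full-scale clipping flat-tops the waveform abruptly, while analog stages tend to round the shoulder.

```python
# Sketch: hard digital clipping vs. an analog-style soft limit.
import math

def hard_clip(x, limit=1.0):
    """Digital full-scale clip: values beyond +/-limit are flat-topped."""
    return max(-limit, min(limit, x))

def soft_clip(x):
    """Rough stand-in for gradual analog saturation (assumed tanh curve)."""
    return math.tanh(x)

x = 1.5  # a sample 50% over full scale
print(hard_clip(x))            # 1.0   -> abrupt flat top
print(round(soft_clip(x), 3))  # 0.905 -> rounded shoulder
```

    The abrupt flat top is what produces the harsh harmonic signature that makes hard digital clipping so easy to identify by ear.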
     
  14. alfacliff

    Well-Known Member

    Dec 13, 2013
    2,449
    428
    Since latency is defined as delay through a system, the latency of a CD or tape would be the time between when it was recorded and when it was played back -- or even the time it sat in the store before being sold and played.
    Latency in a digital system from microphone or instrument to speaker is determined mostly by the ADC and DAC conversion times plus the processing time. Latency in radio systems comes mostly from delay in the codec.
     
  15. MrChips

    Moderator

    Oct 2, 2009
    12,421
    3,356
    Suppose you digitized hi-fi music at 44.1kHz and you needed to attenuate frequencies above 20kHz using a low-pass filter. You would need to acquire at least 50μs of data in order to capture one full cycle at 20kHz.

    Now suppose you wanted to design a notch filter to remove 60Hz hum. You would need to acquire at least 16.667ms of data in order to capture one full cycle at 60Hz.

    If you needed a high-pass filter to reject frequencies below 10Hz, you would need 100ms of data, and 100ms is a relatively long time.
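    The "one full cycle" windows in the post above all come from the same arithmetic, which can be sketched in one line:

```python
# Sketch: the acquisition window needed for one full cycle at a given frequency.
def one_cycle_ms(freq_hz):
    """Duration of one full cycle at freq_hz, in milliseconds."""
    return 1000.0 / freq_hz

print(round(one_cycle_ms(20000), 3))  # 0.05 ms  (the 50 us at 20 kHz)
print(round(one_cycle_ms(60), 3))     # 16.667 ms at 60 Hz
print(round(one_cycle_ms(10), 3))     # 100.0 ms at 10 Hz
```

    The pattern is the point: the lower the frequency a filter must resolve, the longer the data window -- and hence the latency -- it needs.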
     
  16. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    I'm going to argue against that. DSP doesn't need one full cycle to have a filter effect on the sound; it can do it the same way analogue does: by acting on the rate of change of the signal, etc.

    With a standard 44.1kHz digital system the ADC sampling "delay" is 1 sec / 44100 (about 22.7µs), and the DAC delay can be a bit more, or even less, depending on the lag caused by the output filter. Total delay is about 45µs, totally undetectable by a human.

    How much "delay" is caused by charge times in series caps in a typical analogue preamp/amplifier signal chain?
     
  17. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,907
    2,163
    Don't underestimate what we can hear. Our brain's sound processing evolved as a critical part of the species' survival.

    Humans can decode sound direction using phase, with a pair of transducers (ears) separated by a delay of only 300-400µs (the time sound takes to cross our head) and a timing precision of about 30µs. Just by turning our heads, we can locate a person-sized object to within a few degrees at 10 meters from sound alone.

    http://www.analog.com/en/processors...nt/content/scientist_engineers_guide/fca.html
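    The 300-400µs figure falls straight out of the geometry (speed of sound and head width below are assumed nominal values, not measurements from the post):

```python
# Sketch: maximum interaural time difference (ITD) from head geometry.
speed_of_sound = 343.0   # m/s in air at roughly room temperature (assumed)
head_width = 0.15        # m, nominal ear-to-ear distance (assumed)

# Worst case: sound arriving directly from one side travels the full
# head width farther to reach the far ear.
max_itd_us = 1e6 * head_width / speed_of_sound
print(round(max_itd_us))  # ~437 us, in the 300-400 us ballpark quoted above
```

    That the brain resolves timing differences an order of magnitude smaller than this maximum is what makes inter-channel DSP delays audible long before absolute latency is.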
     
    Last edited: Jul 4, 2014
  18. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    5,435
    1,305
    Agreed! Human ears can do some impressive stuff.

    But you can't "hear" the 45µs caused by the digital process unless you are listening to the source sound and the digitally reproduced sound at the same time.

    And if that's the case, the delay for sound getting from speakers to your ears is about 1ms per foot of distance, so the analogue delays from speakers to ears are massive compared to any delays caused by the digital processes.

    And I would guess there are very few humans who could tell that one speaker is 12" further from their ears than the other by detecting the extra 1ms delay from the more distant speaker.
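    The "1ms per foot" rule of thumb is easy to check (the speed of sound below is an assumed room-temperature value):

```python
# Sketch: acoustic path delay from speaker to ear, per the rule of thumb above.
speed_of_sound_fps = 1125.0  # feet per second in air, approximate (assumed)

def path_delay_ms(distance_ft):
    """Time for sound to travel distance_ft feet, in milliseconds."""
    return 1000.0 * distance_ft / speed_of_sound_fps

print(round(path_delay_ms(1), 2))   # ~0.89 ms per foot -- close to 1 ms
print(round(path_delay_ms(10), 1))  # ~8.9 ms for a 10 ft listening distance
```

    So a listener a few feet off-center already experiences inter-speaker delays hundreds of times larger than the ~45µs conversion delay discussed above.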
     
  19. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,907
    2,163
    For mono-source systems that's true, provided the signal chain affects all frequencies equally. But most surround/multi-channel systems have time-delay settings for each speaker or channel to compensate for the analog delays of speaker distance, because while we can't judge absolute delay well enough to range a sound, we can easily detect delays between the same sound -- like a vocal -- arriving from different directions, e.g. from the right and left speakers. We normally say the sound-stage image is distorted when this happens: the voice seems to move in space rather than staying stationary.
     
    Last edited: Jul 5, 2014
  20. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,907
    2,163
    I can show the change in source sound-image quality by changing only the speaker delays on my car's 5.1 system (though only poorly, because of the limited range and channels of the video).

    https://flic.kr/p/od3YtR
    http://www.kenwood.eu/feature.aspx?id=1040
     
    Last edited: Jul 5, 2014