I guess this is a question about digital audio techniques. Any idea, even if you're not sure, please shout; I bet you know more than me, that's for sure.
Say you have a digital audio processing circuit. It doesn't matter which make the DSP is or which make the ADC/DAC chip is (commonly called a CODEC if both are on the same chip), but if the sampling rate is 192 kHz, that means the period between samples is about 5.2 microseconds.
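Just to sanity-check that arithmetic, the period is simply one over the sample rate (plain arithmetic, no particular DSP or codec assumed):

```python
# Sample period in microseconds is 1e6 / sample_rate.
for rate_hz in (44_100, 48_000, 96_000, 192_000):
    period_us = 1_000_000 / rate_hz
    print(f"{rate_hz} Hz -> {period_us:.2f} microseconds per sample")
```

So 192 kHz gives about 5.21 µs per sample and 96 kHz gives about 10.42 µs.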
From what I've read, the DSP usually takes data from the ADC in batches, capturing at least one wave cycle of audio (probably several cycles), before transforming it with some algorithm and passing the data to the DAC to be turned back into an analog approximation.
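Here is my mental model of that batch (block) processing as a toy sketch. All the names here (`BLOCK_SIZE`, `process`, `run`) are invented placeholders for illustration, not any real codec or DSP API:

```python
# Toy model of block-based processing: the ADC side delivers samples in
# fixed-size blocks, the DSP transforms each block, and the result goes
# on to the DAC. Block size and the "effect" are arbitrary choices.
BLOCK_SIZE = 64  # samples per block

def process(block):
    # Stand-in for the actual effect algorithm: here just a gain of 0.5.
    return [0.5 * s for s in block]

def run(samples):
    out = []
    for i in range(0, len(samples), BLOCK_SIZE):
        block = samples[i:i + BLOCK_SIZE]  # "read a batch from the ADC"
        out.extend(process(block))         # DSP computation on the batch
    return out                             # "write to the DAC"
```

The point being that the DSP never touches single samples in isolation; it works a block at a time, so the output is inherently at least one block behind the input.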
Say the effect is not a delay-type effect. Does that mean the DSP has to process all that data and pass it on to the DAC in less than 5.2 microseconds? I'm getting a bit confused, because if the DSP takes longer than one sample period, the reconstructed analog output will be delayed. 5.2 microseconds is not that long; even at the more common 96 kHz rate, where the period is about 10.4 microseconds, that's still not much time to do a lot of computation, I think. Maybe I'm wrong and DSPs are pretty quick?
And if the DSP only processes after every few cycles of audio, but takes longer than one sample period to do its computation, does the delay at the reconstructed analog output get progressively larger?
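Here is the arithmetic behind my worry, as a toy calculation with made-up numbers (192 kHz, 64-sample blocks, an assumed 400 µs of DSP time per block): if computing one block takes longer than that block's real-time duration, the backlog grows on every block:

```python
SAMPLE_RATE = 192_000  # Hz (assumed)
BLOCK = 64             # samples per processing block (assumed)

# Real-time duration of one block, in microseconds.
block_duration_us = BLOCK / SAMPLE_RATE * 1e6

# Assumed DSP computation time per block (made-up figure).
proc_time_us = 400.0

# Extra delay accumulated per block when processing can't keep up.
deficit_us = proc_time_us - block_duration_us
blocks_per_second = SAMPLE_RATE / BLOCK
backlog_after_1s = deficit_us * blocks_per_second  # microseconds of lag

print(f"block = {block_duration_us:.1f} us, deficit = {deficit_us:.1f} us/block")
print(f"after one second the output lags by {backlog_after_1s / 1e6:.3f} s")
```

With these numbers a block lasts about 333 µs but takes 400 µs to compute, so the lag grows by roughly 67 µs per block, about 0.2 s of accumulated delay after one second. That's the runaway behaviour I'm asking about.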
I think my understanding is a bit screwed up. Please help me out, thanks.
My next problem will be how to solder a 64-pin low-profile quad flat pack SMD with pin spacing too small to see without an electron microscope. I'll post about that later.