Does anyone have ideas on how to build an adjustable analog delay (up to 150 µs) for analog signals in the DC–20 kHz, 0–5 V range, delaying the signal accurately with no amplitude degradation and linear phase?

I'm trying to simultaneously sample the current and voltage of a signal, and the Allegro ACSxxx-series current-sensor IC has about a 6 µs input-to-output delay. I need to delay the voltage-sensing op-amp signal so I can multiply the V and I samples together to calculate power; obviously the samples must be exactly in phase for the result to be accurate.

The signals are sampled by an Adafruit M4 microcontroller using its internal ADCs, which I want to keep using, and there is an additional delay of about 90 µs between the two ADC reads that I also have to compensate for, so about 100 µs total. I could do it digitally with external ADCs triggered 6 µs apart, but that would add cost; I was hoping there was some way to do this in the analog domain so I could use the on-board ADCs, but I'm not finding any solutions. I can't time-align the signals digitally in software because my sample rate will only be about 5 kHz, so I could only shift in 200 µs steps, which is far too coarse.

The largest tolerable delay-adjustment step size is about 400 ns (preferably lower), which would give me about 5% error at 20 kHz (if my calcs aren't wrong), so I need delay adjustment at that resolution or better. Ideas?
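As a sanity check on the 5% figure: a residual time misalignment Δt between the V and I samples shows up as a phase error φ = 2πfΔt, and for a sinusoid at power factor cos(θ) the measured average power scales as cos(θ + φ)/cos(θ); near zero power factor the relative error approaches sin(φ), which is the worst case. A minimal sketch of that arithmetic (the function name and the worst-case assumption are mine, not from the post):

```python
import math

def worst_case_power_error(delay_s: float, freq_hz: float) -> float:
    """Worst-case relative error in average power from a residual time
    misalignment between the voltage and current samples of a sinusoid.
    Phase error phi = 2*pi*f*dt; near zero power factor the relative
    power error approaches sin(phi), taken here as the worst case."""
    phi = 2 * math.pi * freq_hz * delay_s  # phase error in radians
    return math.sin(phi)

# A 400 ns residual misalignment at 20 kHz:
err = worst_case_power_error(400e-9, 20e3)
print(f"worst-case power error ~ {err:.1%}")  # prints "worst-case power error ~ 5.0%"
```

So 400 ns at 20 kHz does come out to roughly 0.05 rad of phase, i.e. about 5% worst-case power error, which matches the step-size requirement stated above. At unity power factor the error is only 1 − cos(φ) ≈ 0.13%, so the 5% is the pessimistic low-power-factor bound.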