Phase delay vs changing time

Thread Starter

Xingpeng Chen

Joined Nov 2, 2015
New to the forum. For my school project I have two 40 kHz sine-wave inputs; both signals are repeatedly on for 3 ms and then off for 3 seconds, and there is a very small time delay between them (on the order of 1 µs). My objective is to find this time difference.
I have been looking at a few different approaches:
1. Use a rectifier to convert the inputs to square waves, deliver the square waves to an Arduino, and find the time difference by comparing the timestamps of the rising edges.
2. Someone told me to use a phase detector, whose output contains the phase information of the two inputs. I have never used one before; how should I handle the phase detector's output in my microcontroller?
3. Which approach will give the more precise result?


Joined Aug 1, 2013
There are several ways to do this. Here are a few:

1. As you said, use diodes to convert the sine waves to square waves, feed them into an Arduino, and let it count the time between the two corresponding positive edges.
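The edge-timing idea boils down to subtracting two edge timestamps. A minimal sketch, assuming the two already-squared signals have been sampled at some rate (1 MHz here, a made-up figure that sets 1 µs resolution):

```python
# Sketch of approach 1: find the first rising edge of each square wave and
# subtract the timestamps. The sample data and rate below are hypothetical.

def first_rising_edge(samples):
    """Index of the first 0 -> 1 transition in a sampled square wave."""
    for i in range(1, len(samples)):
        if samples[i - 1] == 0 and samples[i] == 1:
            return i
    return None

fs = 1_000_000                           # assumed 1 MHz sample rate
a = [0]*10 + [1]*12 + [0]*13 + [1]*12    # reference square wave
b = [0]*12 + [1]*12 + [0]*13 + [1]*12    # same wave delayed by 2 samples

delay_samples = first_rising_edge(b) - first_rising_edge(a)
delay_us = delay_samples / fs * 1e6
print(delay_us)  # -> 2.0
```

One practical caveat for the Arduino route: on a 16 MHz board, `micros()` only has 4 µs granularity, so a ~1 µs delay is below its resolution; the hardware timer's input-capture feature (62.5 ns per tick at 16 MHz) is the better tool for timestamping the edges.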

2. Sum the two sine waves. When they are perfectly in phase (0 degrees shift) the sum will be twice the amplitude of either input. When they are 180 degrees out of phase the sum will be 0 V. From intermediate output voltages you can calculate the phase shift and hence the time lag.
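The math behind technique 2 is the identity sin(ωt) + sin(ωt + φ) = 2·cos(φ/2)·sin(ωt + φ/2), so the peak of the sum is 2A·cos(φ/2). A sketch of the inversion, assuming equal 1 V amplitudes:

```python
import math

F_SIGNAL = 40_000.0  # 40 kHz, from the original post
A = 1.0              # per-signal amplitude in volts (assumed equal)

def delay_from_sum_peak(v_sum_peak, amplitude=A, f=F_SIGNAL):
    """Recover the time lag from the measured peak of the summed signal.

    sin(wt) + sin(wt + phi) = 2*cos(phi/2)*sin(wt + phi/2),
    so the summed peak is 2*A*cos(phi/2).
    """
    phi = 2.0 * math.acos(v_sum_peak / (2.0 * amplitude))  # phase shift, rad
    return phi / (2.0 * math.pi * f)                       # time lag, s

# Endpoints from the post: in phase -> peak 2A -> zero lag.
print(delay_from_sum_peak(2.0 * A))  # -> 0.0

# A 1 us lag at 40 kHz is phi = 2*pi*40e3*1e-6 ~ 0.251 rad; peak ~ 1.984*A.
peak = 2.0 * A * math.cos(math.pi * F_SIGNAL * 1e-6)
print(round(delay_from_sum_peak(peak) * 1e6, 3))  # -> 1.0 (microseconds)
```

Note the catch for small delays: near φ = 0 the cosine is flat, so a 1 µs lag only moves the peak from 2.000 V to about 1.984 V, which is hard to measure accurately.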

3. Rather than count the time interval between the two edges, use an integrator to produce a voltage that is proportional to the time lag.
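One common way to realize technique 3 (an assumption on my part, not something specified above) is a gated integrator: a constant current source charges a capacitor only during the interval between edge A and edge B, giving V = I·Δt/C, which the microcontroller's ADC then reads. The component values below are hypothetical:

```python
# Sketch of approach 3, assuming a gated constant-current integrator:
# V = I * dt / C, so the time lag is recovered as dt = V * C / I.

I_CHARGE = 1e-3   # 1 mA charging current (hypothetical)
C_INT = 1e-9      # 1 nF integration capacitor (hypothetical)

def lag_from_integrator(v_out, i=I_CHARGE, c=C_INT):
    """Invert V = I*dt/C to recover the time lag from the ADC reading."""
    return v_out * c / i

# With these values a 1 us lag charges the cap to 1e-3 * 1e-6 / 1e-9 = 1.0 V.
print(round(lag_from_integrator(1.0) * 1e6, 3))  # -> 1.0 (microseconds)
```

The appeal of this approach is that it converts a sub-microsecond timing problem into a DC voltage measurement, which even a slow ADC handles well.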



Joined Mar 14, 2008
If you want to measure the time difference between the two inputs, I recommend using two high-speed comparators (the common LM339 is too slow) to convert the sine-waves to square-waves and accurately detect the zero crossings of the two.

For AK's Technique No. 2, it would be most accurate if you invert one of the signals, so that you are measuring a small voltage when the two signals have a small phase difference.
Note that the two signal amplitudes must be identical for minimum error with this technique.
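With one input inverted, the sum becomes a difference: sin(ωt) − sin(ωt + φ) peaks at 2A·sin(φ/2) ≈ A·φ for small φ, so the output is near-linear exactly where the delay is small. A sketch, again assuming equal 1 V amplitudes:

```python
import math

F_SIGNAL = 40_000.0  # 40 kHz carrier from the original post

def delay_from_diff_peak(v_diff_peak, amplitude=1.0, f=F_SIGNAL):
    """Time lag from the peak of (signal A - signal B), equal amplitudes.

    sin(wt) - sin(wt + phi) peaks at 2*A*sin(phi/2) ~ A*phi for small phi.
    """
    phi = 2.0 * math.asin(v_diff_peak / (2.0 * amplitude))  # rad
    return phi / (2.0 * math.pi * f)                        # seconds

# 1 us at 40 kHz: peak = 2*sin(pi*40e3*1e-6) ~ 0.251 V for 1 V inputs --
# a large, easy-to-measure signal, unlike the flat-topped summed case.
peak = 2.0 * math.sin(math.pi * F_SIGNAL * 1e-6)
print(round(delay_from_diff_peak(peak) * 1e6, 3))  # -> 1.0 (microseconds)
```

The equal-amplitude requirement also drops out of this: if the amplitudes differ by some ε, the difference retains a residual of about ε volts even at zero delay, which this calculation would misread as a phase shift.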