RF Distance Measurement Techniques (ToF+Interferometry)

Thread Starter

¡MR.AWESOME!

Joined Aug 15, 2010
33
I am trying to design a system to measure the distance between two RF transceivers with as fine a resolution as possible. Time-of-flight measurement is nice and easy to do, but it has poor resolution with low-frequency timers. I am thinking of using a low-cost MCU that will probably have a clock rate of ~30 MHz. So the best possible resolution I could get would be (1/(3x10^7)) * c = ~10 m. I would like to get much better resolution than that.
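To put numbers on that, here is a rough Python sketch of the clock-tick-to-distance conversion (the ~30 MHz clock is the figure from the post above; note that a round-trip echo measurement halves the distance per tick):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_resolution(clock_hz: float, round_trip: bool = True) -> float:
    """Distance corresponding to one timer tick.

    One tick of a clock at clock_hz spans 1/clock_hz seconds, and light
    covers C/clock_hz meters in that time. For a round-trip (echo)
    measurement the signal travels the distance twice, so one tick
    resolves half that.
    """
    per_tick = C / clock_hz
    return per_tick / 2 if round_trip else per_tick

print(tof_resolution(30e6, round_trip=False))  # ~10 m one-way per tick
print(tof_resolution(30e6))                    # ~5 m for an echo measurement
```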

I did some searching around and found some info on RF interferometry. Basically, you create a wave and split it in two. The first wave goes out and does its thing and then comes back. The second one stays where it is. You then compare the phases of the two. Depending on the wavelength, you can determine the distance that the first wave traveled by the amount the phase shifted compared to the second wave.

Obviously, the phase alone is unambiguous only if the distance traveled is less than one wavelength. By combining time-of-flight measurements with phase-difference measurements, could you not get a much higher resolution? It seems to me that distance measurement with interferometry is almost exclusively done with lasers or other forms of light, which is only good for very small distances. Why is this? I must be missing something.
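As a sketch of that combined scheme (the function names and the 100 MHz example carrier are my own choices, not from the thread): the phase fixes the distance modulo one wavelength, and the coarse ToF estimate resolves the integer cycle count. The catch is that the coarse estimate must be good to within half a wavelength, which at 100 MHz is about 1.5 m, far tighter than the ~10 m timer resolution mentioned above.

```python
import math

C = 299_792_458.0  # m/s

def combined_distance(coarse_m: float, phase_rad: float, carrier_hz: float) -> float:
    """Refine a coarse ToF distance with a measured carrier phase shift.

    phase_rad gives the distance modulo one wavelength; coarse_m picks
    the integer number of whole wavelengths. coarse_m must be accurate
    to better than half a wavelength or the cycle count will be wrong.
    """
    wavelength = C / carrier_hz
    fine = (phase_rad / (2 * math.pi)) * wavelength  # within-cycle distance
    n = round((coarse_m - fine) / wavelength)        # whole-cycle count
    return n * wavelength + fine

# Example: true distance 47.3 m, carrier 100 MHz (wavelength ~3 m).
wl = C / 100e6
true_d = 47.3
phase = 2 * math.pi * ((true_d % wl) / wl)    # what the interferometer sees
print(combined_distance(46.5, phase, 100e6))  # recovers 47.3 m
```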

Some of the posters in this thread touched on using this technique with RF, but didn't give much info.
 

crutschow

Joined Mar 14, 2008
34,412
It would seem that your scheme might work, but how to do the measurement is a good question. Here's a reference that discusses a common technique that does this at microwave frequencies.

Certainly, in principle you could do a high-speed A/D conversion of the two signals and compare both the flight time and the phase of the returned signal against the reference oscillator signal. The number of samples per second would be one of the main factors determining the accuracy of the phase comparison and thus of the calculated distance. Since the speed of light is close to 1 ft/ns, you can see that the required A/D speed can get very high, depending upon the resolution you need. Another factor affecting the accuracy is the signal-to-noise ratio of the returned signal.
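Once both signals are digitized, one simple way to compare them in software is to project each sample record onto cos/sin at the known carrier frequency (a single DFT bin) and take the angle; the difference of the two angles is the phase shift. A minimal sketch with a synthetic delayed tone (all the numbers here are made up for the example, and it assumes a whole number of cycles in the record):

```python
import math

def tone_phase(samples, freq_hz, fs_hz):
    """Phase of a tone at freq_hz within the sampled record (one DFT bin)."""
    i = sum(v * math.cos(2 * math.pi * freq_hz * n / fs_hz)
            for n, v in enumerate(samples))
    q = sum(v * math.sin(2 * math.pi * freq_hz * n / fs_hz)
            for n, v in enumerate(samples))
    return math.atan2(q, i)

def phase_difference(ref, ret, freq_hz, fs_hz):
    """Phase of the returned signal relative to the reference, in [0, 2*pi)."""
    return (tone_phase(ret, freq_hz, fs_hz)
            - tone_phase(ref, freq_hz, fs_hz)) % (2 * math.pi)

# Synthetic test: 8 Hz tone sampled at 64 Hz, return delayed by 1.0 rad.
N, f, fs = 64, 8.0, 64.0
ref = [math.cos(2 * math.pi * f * n / fs) for n in range(N)]
ret = [math.cos(2 * math.pi * f * n / fs - 1.0) for n in range(N)]
print(phase_difference(ref, ret, f, fs))  # ~1.0 rad
```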
 

Thread Starter

¡MR.AWESOME!

Joined Aug 15, 2010
33
Thanks for the responses guys.

I have done some reading on the Tellurometer, but I wasn't able to find any technical text on the principles behind its operation.

As usual, I wasn't finding much until I started using the right terms. I came across the term "Time to Digital Converter" and I have found much more since then. A good overview of different methods used in TDCs is found here.

After looking at those systems, I think that the interferometry method is much more complex and probably not worth the added complexity.
 

JMW

Joined Nov 21, 2011
137
I realize this is a month late. Early WWII radar operated at 6 meters or so.
Over-the-horizon radar operates near 18 meters. Granted, the resolution isn't that good, but it works. Your problem will be getting authorization to use these frequencies.
 

John P

Joined Oct 14, 2008
2,026
Suppose you had an outgoing wave with a frequency that swept over a certain range and then repeated. You'd have a certain number of cycles "in transit" (at 1 GHz, roughly one wave per foot), but you could tell how many by the frequency difference between the outgoing and returning signals. Would that work?
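That's essentially FMCW (frequency-modulated continuous-wave) ranging: the sweep slope converts the outgoing/returning frequency difference into a round-trip delay. A rough sketch, with a 150 MHz sweep over 1 ms chosen arbitrarily for the example:

```python
C = 299_792_458.0  # m/s

def fmcw_range(beat_hz: float, sweep_bw_hz: float, sweep_time_s: float) -> float:
    """Range from the beat between outgoing and returning chirps.

    The transmit frequency ramps sweep_bw_hz over sweep_time_s, so a
    signal delayed by t seconds comes back offset by slope * t hertz.
    """
    slope = sweep_bw_hz / sweep_time_s  # Hz of ramp per second
    round_trip_s = beat_hz / slope
    return C * round_trip_s / 2         # halve for one-way range

# A target at 100 m produces this beat with a 150 MHz / 1 ms sweep:
beat = (150e6 / 1e-3) * (2 * 100.0 / C)  # ~100 kHz
print(fmcw_range(beat, 150e6, 1e-3))     # recovers 100.0 m
```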
 

Thread Starter

¡MR.AWESOME!

Joined Aug 15, 2010
33
Yeah, I guess you could take the frequency difference and convert it to an analog voltage and then to a digital number. The biggest immediate problem I see with that method is the bandwidth required; interference from some other source could easily wreak havoc on it.

I haven't been working on this problem lately, but when I have some time to get back to it, I plan on using some sort of TDC circuit.

In another forum where I asked the same question, a guy told me of a way he did it many years ago. From what I got out of it, he had two sine waves 90 degrees out of phase with each other and an ADC to measure each one, plus some mechanism to count how many cycles had elapsed. So when you hit 'start,' the cycles would begin to be counted. When you hit 'stop,' the ADCs would capture their values and store them. A program would then compare the cycle counts to determine which ADC value it should use, based on which sine wave was at its most linear point when the capture happened. If you look at a quadrature sine like so, you should be able to figure it out.
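A rough guess at how that capture scheme reduces to numbers (the function names and start/stop framing are my own, and this glosses over the boundary case where the counter and the phase disagree right at a cycle edge): with simultaneous captures of two quadrature sines you can take atan2 of the pair, which effectively always leans on the channel nearest its linear zero-crossing region, then add the whole-cycle count from the counter.

```python
import math

def fractional_cycle(sample_cos: float, sample_sin: float) -> float:
    """Fraction of a cycle elapsed at the capture instant, from
    simultaneous samples of two unit sines 90 degrees apart."""
    return (math.atan2(sample_sin, sample_cos) % (2 * math.pi)) / (2 * math.pi)

def elapsed_time(cycle_count: int, sample_cos: float, sample_sin: float,
                 ref_hz: float) -> float:
    """Whole cycles from the counter plus the fractional cycle from the
    quadrature capture, converted to seconds."""
    return (cycle_count + fractional_cycle(sample_cos, sample_sin)) / ref_hz

# Example: a 1 MHz reference stopped a quarter-cycle past 12 full cycles.
theta = 2 * math.pi * 0.25
print(elapsed_time(12, math.cos(theta), math.sin(theta), 1e6))  # ~12.25 us
```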
 