# RF Distance Measurement Techniques (ToF+Interferometry)

#### ¡MR.AWESOME!

Joined Aug 15, 2010
33
I am trying to design a system to measure the distance between two RF transceivers with as fine a resolution as possible. Time-of-flight measurement is nice and easy to do, but it has poor resolution with low-frequency timers. I am thinking of using a low-cost MCU that will probably have a clock rate of ~30 MHz, so the best possible resolution I could get would be (1/(3×10^7)) × c ≈ 10 m. I would like to get much better resolution than that.
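A quick sanity check of that arithmetic (the function name is just my own illustration):

```python
# Back-of-envelope check of the timer-limited ToF resolution:
# one timer tick of uncertainty maps to c / f_clock meters.
C = 299_792_458.0  # speed of light, m/s

def tof_resolution_m(clock_hz: float) -> float:
    """Distance light travels in one timer tick (one-way figure)."""
    return C / clock_hz

print(tof_resolution_m(30e6))  # ≈ 10 m per tick at a 30 MHz clock
```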

I did some searching around and found some info on RF interferometry. Basically, you create a wave and split it in two. The first wave goes out, does its thing, and comes back; the second stays where it is. You then compare the phases of the two. Depending on the wavelength, you can determine the distance the first wave traveled from how much its phase shifted relative to the second.

Obviously, this is only unambiguous if the distance traveled is less than one wavelength. By combining time-of-flight measurements with phase-difference measurements, could you not get much higher resolution? It seems to me that distance measurement with interferometry is almost exclusively done with lasers or other forms of light, which is only good for very small distances. Why is this? I must be missing something.
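The combination described above can be sketched as follows: the phase gives position within one wavelength, and the coarse ToF estimate picks which wavelength. This is only a sketch under the assumption that the coarse error is under half a wavelength; the function and example values are my own.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def combined_distance(coarse_d_m: float, phase_rad: float, freq_hz: float) -> float:
    """Fuse a coarse ToF distance with a fine phase measurement.

    The phase locates the target within one wavelength; the coarse
    estimate resolves the integer wavelength count. This only works
    when the coarse error is under half a wavelength, which is why a
    long-wavelength tone (or a synthesized difference tone) is needed.
    """
    lam = C / freq_hz
    fine = (phase_rad / (2 * math.pi)) * lam  # position within a wavelength
    n = round((coarse_d_m - fine) / lam)      # integer wavelength count
    return n * lam + fine

# Example: true range 123.4 m, 10 MHz tone (λ ≈ 30 m), coarse ToF off by 2.6 m
lam = C / 10e6
phase = (123.4 % lam) / lam * 2 * math.pi
print(combined_distance(126.0, phase, 10e6))  # ≈ 123.4
```

Note the catch this makes explicit: with a ~10 m coarse error, the fine tone's wavelength must exceed ~20 m, so the fine measurement can't be at the carrier frequency directly; multi-tone schemes exist for exactly this reason.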

Some of the posters in this thread touched on using this technique with RF, but didn't give much info.

#### xylon89del

Joined Dec 28, 2011
17
Talking about distance measurement reminds me of radar, which usually uses microwaves. It is certainly not easy to design.

#### crutschow

Joined Mar 14, 2008
27,016
It would seem that your scheme might work, but how to do the measurement is a good problem. Here's a reference that discusses a common technique that does this at microwave frequencies.

Certainly, in principle you could do a high-speed A/D conversion of the two signals and compare both the flight time and the phase of the returned signal against the reference oscillator signal. The number of samples per second would be one of the main factors limiting the accuracy of the phase comparison and thus the calculated distance. Since light travels roughly one foot per nanosecond, you can see that the required A/D speed can get very high, depending upon the resolution you need. Another factor affecting the accuracy is the signal-to-noise ratio of the returned signal.
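One common way to extract the phase from two digitized channels is a single-bin DFT at the tone frequency. A minimal sketch, assuming a noiseless capture spanning an integer number of cycles (function name and parameters are my own illustration):

```python
import math

def phase_lag(ref, sig, tone_hz, fs_hz):
    """Phase lag of sig behind ref, from sampled data, via a single-bin DFT.

    Clean when the capture spans an integer number of tone cycles;
    a real system would window and average to fight noise.
    """
    w = 2 * math.pi * tone_hz / fs_hz
    def bin_angle(x):
        re = sum(v * math.cos(w * n) for n, v in enumerate(x))
        im = -sum(v * math.sin(w * n) for n, v in enumerate(x))
        return math.atan2(im, re)
    return (bin_angle(ref) - bin_angle(sig)) % (2 * math.pi)

# Example: 1 MHz tone sampled at 16 MHz, signal delayed by 0.7 rad
w = 2 * math.pi * 1e6 / 16e6
ref = [math.cos(w * n) for n in range(160)]
sig = [math.cos(w * n - 0.7) for n in range(160)]
print(phase_lag(ref, sig, 1e6, 16e6))  # ≈ 0.7
```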

#### studiot

Joined Nov 9, 2007
4,998

The Tellurometer was a South African system that worked on the principles you describe.

#### ¡MR.AWESOME!

Joined Aug 15, 2010
33
Thanks for the responses guys.

I have done some reading on the Tellurometer, but I wasn't able to find any technical text on the principles behind its operation.

As usual, I wasn't finding much until I started using the right terms. I came across the term "Time to Digital Converter" and I have found much more since then. A good overview of different methods used in TDCs is found here.

After looking at those systems, I think that the interferometry method is much more complex and probably not worth the added complexity.

#### JMW

Joined Nov 21, 2011
134
I realize this is a month late. Early WWII radar operated at wavelengths of 6 meters or so.
Over-the-horizon radar operates near 18 meters. Granted, the resolution isn't that good, but it works. Your problem will be getting authorization to use these frequencies.

#### John P

Joined Oct 14, 2008
1,892
Suppose you had an outgoing wave whose frequency swept over a certain range and then repeated. You'd have a certain number of cycles "in transit" (at 1 GHz, roughly one wavelength per foot), but you could tell how many from the frequency difference between the outgoing and returning signals. Would that work?
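This is the idea behind frequency-modulated continuous-wave (FMCW) ranging: the beat frequency between the outgoing and returning sweeps is proportional to range. A minimal sketch of the arithmetic (parameter values are my own illustration):

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(beat_hz: float, sweep_bw_hz: float, sweep_time_s: float) -> float:
    """Range from the beat between outgoing and returning swept signals.

    Sweep slope S = bandwidth / sweep time. The echo is delayed by the
    round-trip time tau, so it lags the transmitter in frequency by
    f_beat = S * tau, giving range = c * f_beat / (2 * S).
    """
    slope = sweep_bw_hz / sweep_time_s
    return C * beat_hz / (2 * slope)

# Example: 150 MHz sweep over 1 ms; a 300 kHz beat corresponds to ~300 m
print(fmcw_range_m(300e3, 150e6, 1e-3))
```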