Find distance using laser?

Thread Starter

CVMichael

Joined Aug 3, 2007
419
Isn't that how a 3D hologram is "recorded" from the 3D target it is trying to recreate? With beam splitting, you may be able to accurately measure distance by refocusing the blurred reflection pattern generated by the dual-beam interaction. The focal parameter would then be a function of the distance. Time and material type should then become irrelevant, because the beams are referencing each other, both traveling the same distance through the same material, thus cancelling out any density fluctuations or movement of the target.
Do you know where I could find more info about this? Because it sounds interesting...
 

thingmaker3

Joined May 16, 2005
5,083
Lasers are indeed used in very accurate commercial measuring units. If they're not running ultra-fast counters or measuring phase difference, then how do they do it? Their literature claims they rely on the speed of light being constant. The method can't be too awfully complex, as the price of the gizmos falls within the budget of many contractors.
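In case they are measuring phase difference after all, here is a rough sketch of how that turns into distance: modulate the beam's intensity at a known frequency and compare the phase of the outgoing and returning signals. The numbers below are illustrative, not from any product's literature.

# Sketch of phase-shift rangefinding (illustrative values only).
import math

C = 299_792_458.0   # speed of light in vacuum, m/s

def distance_from_phase(phase_rad, mod_freq_hz):
    # The round-trip path is phase/(2*pi) modulation wavelengths;
    # the distance to the target is half of that. Unambiguous only
    # within half a modulation wavelength.
    mod_wavelength = C / mod_freq_hz
    round_trip = (phase_rad / (2 * math.pi)) * mod_wavelength
    return round_trip / 2.0

# A 10 MHz modulation has a ~30 m wavelength, so a 90-degree phase
# shift corresponds to ~7.5 m of round trip, i.e. ~3.75 m of range.
print(distance_from_phase(math.pi / 2, 10e6))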
 

Mike M.

Joined Oct 9, 2007
104
Do you know where I could find more info about this? Because it sounds interesting...
Here is a start that has some diagrams associated. The setup would need a much wider focal distance and range than the holographic setup, achieved by putting the beams nearly parallel with fine-tuning motors attached, like the new multi-mirror telescopes that adjust each mirror many times each second to counter atmospheric distortion. It would be roughly analogous to the same focal concept, except the focusing would be done by moving the source to obtain a focused reflection instead of moving the return detector geometry, and it wouldn't have to be a trillionth as accurate as long as these are Earth-based distance measurements. Seems like it would still be out of the budget of most people, though.
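For what it's worth, the nearly-parallel-beam idea is loosely related to plain triangulation, which is how many low-cost optical rangefinders actually work. A minimal sketch of that geometry (variable names and values are mine):

# Triangulation sketch: laser and detector sit a known baseline apart;
# the angle of the returning spot encodes the distance. Illustrative.
import math

def distance_from_angle(baseline_m, return_angle_rad):
    # Angle is measured between the baseline and the returning ray.
    return baseline_m * math.tan(return_angle_rad)

# A 5 cm baseline with the spot returning at 89.5 degrees gives
# roughly 5.7 m of range; accuracy degrades quickly with distance,
# which is why triangulation units are short-range devices.
print(distance_from_angle(0.05, math.radians(89.5)))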
 

bloguetronica

Joined Apr 27, 2007
1,541
Lasers are indeed used in very accurate commercial measuring units. If they're not running ultra-fast counters or measuring phase difference, then how do they do it? Their literature claims they rely on the speed of light being constant. The method can't be too awfully complex, as the price of the gizmos falls within the budget of many contractors.
Speed of light (in vacuum): approximately 300,000,000 m/s. The speed of light in air is very close to that, as the refractive index of air is only slightly greater than that of vacuum. Light takes about 1.667 us to travel 500 m, so measuring a target 500 m away means resolving a round trip of roughly 3.33 us. I am not sure if digital devices can respond that quickly.
Also, the speed of light depends on the material. In water, light is slower; in diamond, it is slower still, which is why diamond has such a high refractive index.
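To put rough numbers on that (refractive indices are rounded textbook values):

# Quick time-of-flight arithmetic for a few media. Rounded values.
C_VACUUM = 299_792_458.0   # m/s

media = {"vacuum": 1.0, "air": 1.000293, "water": 1.333, "diamond": 2.417}

distance_m = 500.0
for name, n in media.items():
    v = C_VACUUM / n                    # light is slower in denser media
    one_way_s = distance_m / v
    print(f"{name:8s}: one-way {one_way_s * 1e6:.4f} us, "
          f"round trip {2 * one_way_s * 1e6:.4f} us")
# In air: about 1.6683 us one way, 3.3366 us round trip for 500 m.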
 

thingmaker3

Joined May 16, 2005
5,083
Something responds quickly enough! A Hilti model PD28 laser distance finder is accurate to 1/32 inch (0.794 mm)! I understand we're playing in "Picosecond Park"... but somebody figured out a way to do this. And with a unit cost of less than $700 US.

Even if we assume "1/32 inch" is hype and the thing is only accurate to 1 mm, the beam travels out and back, so that's still discerning an interval of about 6.7 picoseconds reliably (light covers 1 mm in 3.34 picoseconds). Somebody has a counter (or something) effectively running at 150 GHz...
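Rough arithmetic behind that (mine, not Hilti's):

# Timing resolution required for a given distance resolution, assuming
# plain round-trip time-of-flight.
C = 299_792_458.0   # m/s

def timing_resolution_s(distance_resolution_m):
    # A distance error of d is a path error of 2*d (out and back).
    return 2 * distance_resolution_m / C

for d in (0.794e-3, 1e-3):   # 1/32 inch, and a pessimistic 1 mm
    t = timing_resolution_s(d)
    print(f"{d * 1e3:.3f} mm -> {t * 1e12:.2f} ps "
          f"(~{1 / t / 1e9:.0f} GHz equivalent counter)")
# 0.794 mm -> ~5.30 ps (~189 GHz); 1 mm -> ~6.67 ps (~150 GHz)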
 

Thread Starter

CVMichael

Joined Aug 3, 2007
419
I had an idea, but I don't know how plausible it is...

So here is my idea:
When I turn on the laser, start charging a capacitor through a series resistor, and stop charging the capacitor when the optical sensor sees the laser light coming back.
When measuring, I can think of 2 ways to do it.
1. Measure the voltage on the capacitor directly (but just by measuring it, you are discharging it).
2. Or measure the discharge time of the capacitor through a large resistor (like a few megaohms).

If going for option 2, then it's just a matter of picking the right resistors: a very low charging resistor (a few ohms) and a very large measuring resistor (a few hundred megaohms), so that the capacitor takes much longer to discharge than it took to charge.

Then I can have a table, for example: at a distance of 1 cm the capacitor takes 1 millisecond to discharge, at 10 cm it takes 2 ms, and from that I have a formula to calculate the rest of the distances. The formula should be a linear one (in theory).
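One caveat: a capacitor charging through a resistor follows V = Vs * (1 - e^(-t/RC)), so the time-to-voltage relation is only roughly linear while t is much less than RC (a constant-current charge would make it truly linear). Here is a rough numerical sketch with invented component values:

# Rough numbers for the charge/stretch idea. All component values are
# invented for illustration, not a tested design.
import math

C_LIGHT = 299_792_458.0   # speed of light, m/s

V_SUPPLY = 5.0            # charging supply, V
R_CHARGE = 5.0            # low-value charging resistor, ohms
CAP = 200e-9              # timing capacitor, farads (charge RC = 1 us)
R_DISCHARGE = 10e6        # measuring resistor, ohms (discharge RC = 2 s)

def charge_voltage(t_s):
    # Exponential charge: only roughly linear while t << R_CHARGE * CAP.
    return V_SUPPLY * (1.0 - math.exp(-t_s / (R_CHARGE * CAP)))

def discharge_time(v_start, v_threshold=0.01):
    # Time for the cap to decay to the threshold through R_DISCHARGE.
    # The R_DISCHARGE / R_CHARGE ratio stretches the event by ~2e6.
    return R_DISCHARGE * CAP * math.log(v_start / v_threshold)

for d in (1.0, 10.0, 100.0):
    t_flight = 2 * d / C_LIGHT            # out and back
    v = charge_voltage(t_flight)
    print(f"{d:6.1f} m: flight {t_flight * 1e9:7.2f} ns, "
          f"cap at {v:.3f} V, discharge ~{discharge_time(v):.2f} s")
# 1 m -> ~2.40 s, 10 m -> ~6.95 s, 100 m -> ~10.99 s: measurable, but
# the mapping is nowhere near linear, so a calibration table (or a
# constant-current charge) would be needed.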

Again, this is just theory; I don't know how plausible it is, and I don't even know how to start making a circuit for it.

So what do you guys think?
 

thingmaker3

Joined May 16, 2005
5,083
That might work!

It would require some means of calibration to compensate for component variance. Might also need some kind of temperature compensation or stabilization.

An instrumentation amplifier and a comparator might be in order - you'll be looking for very small differences.

Might also need to switch caps (or charging resistors) for different ranges.
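A sketch of the calibration idea (with made-up bench readings): measure the discharge time at two known distances and interpolate between them, which also absorbs fixed offsets in the electronics.

# Two-point calibration sketch: map measured discharge time to
# distance using readings taken at known distances. Numbers invented.

def make_calibration(t1_s, d1_m, t2_s, d2_m):
    # Assumes the time-vs-distance relation is close to linear
    # between the two calibration points (only locally true here).
    slope = (d2_m - d1_m) / (t2_s - t1_s)
    return lambda t_s: d1_m + slope * (t_s - t1_s)

# Hypothetical bench readings: 2.40 s at 1 m, 6.95 s at 10 m.
to_distance = make_calibration(2.40, 1.0, 6.95, 10.0)
print(to_distance(5.0))   # ~6.1 m for a 5.0 s reading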
 