I'm starting a project to measure short distances (less than one meter) by firing a laser pulse at a target and measuring how long it takes for the reflection to activate a photodiode placed just beside the laser source.
I found this diode that has a 2ns response time:
http://www.semicon.panasonic.co.jp/ds4/PNZ331CL_AED_discon.pdf
Now:
A) Light travels at 300,000,000,000 mm/sec. That means it takes 3.33 x 10^-12 sec to travel one mm.
B) The distance I'd like to measure is, say, 500 mm away, so the light has to travel one meter, back and forth. That round trip would take 3.33 x 10^-9 seconds.
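To sanity-check the numbers in A and B, here's a tiny C snippet (pure arithmetic, no hardware assumed) that prints both figures:

    #include <stdio.h>

    int main(void)
    {
        const double c_mm_per_s = 3.0e11;   /* 300,000,000,000 mm/s */
        const double target_mm  = 500.0;    /* one-way distance to the target */
        double round_trip_s = 2.0 * target_mm / c_mm_per_s;

        printf("per mm:     %.2e s\n", 1.0 / c_mm_per_s);    /* ~3.33e-12 s */
        printf("round trip: %.2f ns\n", round_trip_s * 1e9); /* ~3.33 ns */
        return 0;
    }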
So, if I wanted to measure distances with 500 mm resolution (one meter, back and forth), all I would need is a microcontroller running at about 300 MHz, counting the clock ticks between activating the laser signal and detecting the photodiode's response. That is, also taking into account the laser's ramp-up time, which should have fairly constant repeatability (yeah, right).
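Conceptually, the tick-counting idea would look something like the C sketch below. LASER_ON() and PHOTODIODE_HIGH() are made-up placeholders for whatever GPIO/comparator access a real MCU would provide (stubbed here so the sketch compiles), and note that a software loop like this burns several CPU cycles per iteration, which already hints at why this may be harder than it looks:

    #include <stdint.h>
    #include <stdio.h>

    /* Placeholder macros: on a real MCU these would be register/GPIO accesses. */
    #define LASER_ON()        (void)0
    #define PHOTODIODE_HIGH() (1)   /* stubbed so the sketch compiles */

    static uint32_t measure_ticks(void)
    {
        uint32_t ticks = 0;
        LASER_ON();                 /* start the pulse and the count together */
        while (!PHOTODIODE_HIGH())  /* spin until the reflection is detected */
            ticks++;                /* each pass costs several CPU cycles, not
                                       one, so this overstates the resolution */
        return ticks;
    }

    int main(void)
    {
        printf("ticks: %lu\n", (unsigned long)measure_ticks());
        return 0;
    }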
That would also mean that all I would need to do to improve resolution would be to increase the MCU's frequency, which would be fairly easy, right?
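If tick counting really were the only limit, resolution would scale as c / (2 * f_clock): one clock period of round-trip delay per tick. A quick C check of that scaling (my assumption of the relationship, not a measured result):

    #include <stdio.h>

    int main(void)
    {
        const double c = 3.0e8;                       /* speed of light, m/s */
        const double freqs[] = { 300e6, 600e6, 3e9 }; /* candidate clocks, Hz */

        for (int i = 0; i < 3; i++) {
            /* one clock period of round-trip delay maps to c/(2f) of range */
            double res_mm = c / (2.0 * freqs[i]) * 1000.0;
            printf("%7.0f MHz -> %6.1f mm per tick\n", freqs[i] / 1e6, res_mm);
        }
        return 0;   /* prints 500.0, 250.0 and 50.0 mm respectively */
    }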
BUT... something tells me that I might have made a very basic mistake in these assumptions, and that the problem might be far more complicated than it seems.
I'd like to hear anyone's opinion on this problem.