I'm currently working with a high-precision laser measuring device that works on the triangulation principle. A laser is emitted from the device, bounces off the target, and is detected by a CCD sensor. The sensor's output is then processed, and a distance is calculated and reported.
The thing is that the laser is emitted as a "line" projection (about 9 mm wide and 0.3 mm thick at the emitter, diverging to 12 mm wide and 0.4 mm thick at a distance of 300 m) instead of a single dot. That was a problem when I tried to measure distances to small objects, or to small surface features of large objects.
The sensor reports the average distance to the surface being probed. That is, if the surface has depressions and/or protrusions, these are taken into account during processing, and an average of the measured distances is reported.
After some experimenting, I discovered that the sensor reports the correct distance even if the surface being probed is narrower than the laser line. For instance, if I point the sensor perpendicularly at the edge of a steel plate 3 mm wide, the left and right ends of the laser line are lost to infinity, but the 3 mm of the line that lands on the plate is more than enough for the sensor to correctly calculate the distance to that surface.
So that gave me an idea. Since what I wanted was to measure distances to surface features no larger than 2 mm, I filtered the 9 mm beam through a 1 mm wide slit formed by the edges of a pair of razor blades, and thus obtained a projected beam between 1.5 and 2 mm wide, depending on distance. And it works beautifully. Except that the end result is not a perfect 1.5 mm wide by 0.3 mm thick projection, but rather a 1.5 mm wide projection plus "ghost" edges to its left and right (the 0.3 mm thickness is unaffected), which I attribute to natural diffraction. No mystery there...
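For a feel of the scale of those ghost edges, here is a rough back-of-the-envelope sketch. The wavelength (650 nm, a typical red diode laser) and the slit-to-target distances are my own assumptions, not values from the device. The Fresnel number tells you which diffraction regime applies (N near 1 means strong edge fringes rather than a clean geometric shadow), and √(λz) sets the characteristic width of knife-edge fringes:

```python
import math

def fresnel_number(slit_width_m, wavelength_m, distance_m):
    # Fresnel number N = a^2 / (lambda * z), with a = slit half-width.
    # N >> 1: nearly geometric shadow; N ~ 1: pronounced edge fringes.
    a = slit_width_m / 2.0
    return a * a / (wavelength_m * distance_m)

def edge_fringe_scale(wavelength_m, distance_m):
    # Characteristic spacing of knife-edge diffraction fringes: sqrt(lambda * z).
    return math.sqrt(wavelength_m * distance_m)

wavelength = 650e-9  # assumed red diode laser, in metres
slit = 1.0e-3        # 1 mm slit, as in the razor-blade setup

for z in (0.1, 0.3, 1.0):  # assumed slit-to-target distances, in metres
    n = fresnel_number(slit, wavelength, z)
    s = edge_fringe_scale(wavelength, z) * 1000.0  # convert to mm
    print(f"z = {z * 1000:.0f} mm: N = {n:.2f}, fringe scale ~ {s:.2f} mm")
```

At a few hundred millimetres the fringe scale comes out at a few tenths of a millimetre per edge, which is consistent with a 1 mm slit spreading into a 1.5–2 mm projection with fringed borders.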
Question: is there a technique to completely cancel or eliminate (or at least minimize) this diffraction pattern and obtain a clean 1.5 mm × 0.3 mm laser line projection?