This applies to imaging machines in general, like ultrasound, SONAR, RADAR, seismic imaging, etc.
I can imagine how the most basic ultrasound sensor might work: a pulse gets sent out, then a reflection comes back after some time, at some intensity. So after some circuitry, it must have been displayed on an oscilloscope with range on one axis and intensity on the other, like early RADAR.
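Here's a minimal sketch of that pulse-echo idea in Python, assuming a known speed of sound (roughly 1540 m/s in soft tissue; the function name and values are just illustrative). Each echo's round-trip time maps directly to a range, which is exactly what the one-axis oscilloscope trace (an "A-scan") shows:

```python
import numpy as np

SPEED_OF_SOUND = 1540.0  # m/s, typical for soft tissue (assumption)

def echoes_to_a_scan(echo_times_s, echo_amplitudes):
    """Convert pulse-echo measurements to (range, intensity) pairs.

    Range = (speed * round-trip time) / 2, since the pulse travels
    out to the reflector and back again.
    """
    ranges_m = SPEED_OF_SOUND * np.asarray(echo_times_s) / 2.0
    return list(zip(ranges_m, echo_amplitudes))

# Example: two echoes, 65 us and 130 us round trip
print(echoes_to_a_scan([65e-6, 130e-6], [1.0, 0.4]))
# -> reflectors at roughly 5 cm and 10 cm depth
```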
So how did imaging systems go from that to being able to image something like, say, a block of wood on a table, and display it basically as it looks? If you have an array of sensors, I guess a basic image could start to be resolved, like a compound eye (see the sketch below).
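That array intuition is basically right. One classic trick is delay-and-sum beamforming: instead of one compound-eye pixel per element, you steer a "listening direction" in software by time-shifting each element's signal so a wavefront from a chosen angle lines up, then summing. A toy sketch, assuming a linear array and a far-field plane wave (the function name and parameters are illustrative, not any real machine's API):

```python
import numpy as np

def delay_and_sum(signals, element_positions_m, angle_rad,
                  fs_hz, speed_m_s=1540.0):
    """Steer a linear array toward angle_rad by delaying and summing.

    signals: (n_elements, n_samples) array of received waveforms.
    element_positions_m: position of each element along the array.
    Far-field assumption: a plane wave from angle theta reaches an
    element at position x earlier/later by x*sin(theta)/c seconds.
    """
    n_elements, n_samples = signals.shape
    out = np.zeros(n_samples)
    for sig, x in zip(signals, element_positions_m):
        delay_samples = int(round(x * np.sin(angle_rad) / speed_m_s * fs_hz))
        out += np.roll(sig, -delay_samples)  # align the wavefront, then sum
    return out / n_elements

# Sweeping angle_rad over a fan of directions gives one scan line per
# angle; stacking those lines side by side is a crude sector image.
```

Signals arriving from the steered direction add coherently while everything else partially cancels, so each steering angle becomes one line of the image.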
I'm not considering at all how waves actually propagate in and out of, and through, different materials.
I guess miniaturization and data processing power are what allow a modern ultrasound machine to show images. I bet it takes a lot of signal processing classes, and then programming classes. Guess I'm answering the basics myself, but I'd like a full teardown video with a lot more explanation of SONAR imaging, for example.