Hi, I have read that some digital storage oscilloscopes (DSOs) decrease their sample rate depending on the time/div setting or the signal bandwidth. How can you tell when this happens?

Also, when calculating, for example, the chance of catching a random, intermittent glitch 2 ns wide, do you have to take into account the 10 divisions across the screen as well as the sampling rate of the DSO? And in this case, is it a good idea to use persistence mode?

I know that some DSOs have special functions for capturing glitches, such as waiting until the glitch happens, but how can this work if they are limited by their sample rate? Does this have something to do with the memory depth? I am confused about this. Thanks.
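To make my question more concrete, here is a small sketch of how I currently understand it (the model that the scope picks the lesser of its maximum rate and memory_depth / screen_window, the 10-division screen, and all the numbers are my assumptions, so please correct me if this is wrong):

```python
# My (possibly wrong) mental model of DSO sample rate and glitch capture.
# Assumptions: real-time sampling, 10 divisions across the screen, and
# that the scope uses sample_rate = min(max_rate, memory_depth / window).

def effective_sample_rate(max_rate_hz: float, memory_depth_samples: int,
                          time_per_div_s: float, divisions: int = 10) -> float:
    """Sample rate the DSO can sustain for one full-screen acquisition."""
    window_s = divisions * time_per_div_s           # total time on screen
    memory_limited_hz = memory_depth_samples / window_s
    return min(max_rate_hz, memory_limited_hz)

def glitch_capture_chance(glitch_width_s: float, sample_rate_hz: float) -> float:
    """Chance that at least one sample lands inside the glitch in a
    single acquisition (1.0 means the glitch is guaranteed to be hit)."""
    sample_interval_s = 1.0 / sample_rate_hz
    return min(1.0, glitch_width_s / sample_interval_s)

# Hypothetical example: 1 GSa/s max rate, 1 Mpt memory, at 1 ms/div.
sr = effective_sample_rate(1e9, 1_000_000, 1e-3)
print(f"effective sample rate: {sr / 1e6:.0f} MSa/s")      # 100 MSa/s (memory limited)
print(f"chance of catching a 2 ns glitch: {glitch_capture_chance(2e-9, sr):.0%}")
```

If this model is right, then at slow timebases the memory depth, not the ADC, sets the sample rate, which would explain why the rate changes with time/div, and why a 2 ns glitch can slip between samples. Is that the correct way to think about it?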