Oscilloscope sample ratings

Thread Starter

jody

Joined Nov 12, 2012
39
Hi all. I'm a little confused over oscilloscope sample ratings. One scope quotes in ms/s and another in MHz. So what would a 200 MHz sample rate be in ms/s?
 

DerStrom8

Joined Feb 20, 2011
2,390
I think Mike is right. It is most likely Ms/S, or "mega-samples per second". 200 mega-samples per second is the same rate as 200 megahertz.
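As a quick check on the unit arithmetic (an illustrative sketch, assuming both figures describe the sampling clock itself rather than bandwidth):

```python
# 200 MS/s means 200 million samples every second, i.e. a 200 MHz
# sample clock. The time between samples is the reciprocal.
sample_rate = 200e6                  # samples per second (200 MS/s)
sample_interval = 1 / sample_rate    # seconds per sample
print(f"one sample every {sample_interval * 1e9:.1f} ns")  # prints "one sample every 5.0 ns"
```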
 

hexreader

Joined Apr 16, 2011
581
I believe that scopes usually have a greater sampling rate (measured in megasamples per second) than the bandwidth, which is quoted in MHz.

A 100 MHz scope will have an input amplifier capable of handling a 100 MHz sine wave with minimal distortion.

If you want to view that 100 MHz sine wave, you will need a sampling rate much higher than the input bandwidth to get a reasonably accurate display. If the sampling rate were 100 MSa/s, you would only sample and display once per input cycle, and it would be pure luck where on the sine wave each sample lands.

My cheap scope has 70 MHz bandwidth, but 250 MSa/s.

These are two different specifications, and both are important.
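hexreader's one-sample-per-cycle point can be sketched numerically (illustrative values only; `sample_sine` is a hypothetical helper, not anything from a real scope):

```python
import math

def sample_sine(freq_hz, rate_hz, n=8, phase=0.3):
    """Return n successive samples of a unit sine at freq_hz, taken at rate_hz."""
    return [math.sin(2 * math.pi * freq_hz * k / rate_hz + phase)
            for k in range(n)]

# Sampling a 100 MHz sine at exactly 100 MSa/s: every sample lands on
# the same point of the waveform, so the trace is a flat line whose
# level depends purely on the (lucky or unlucky) phase.
flat = sample_sine(100e6, 100e6)
print(flat)

# The same sine sampled at 250 MSa/s (2.5 samples per cycle) at least
# moves around between samples.
moving = sample_sine(100e6, 250e6)
print(moving)
```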
 

DerStrom8

Joined Feb 20, 2011
2,390
This is correct; I'm not quite sure what I was thinking. I just bought a Rigol with a 50 MHz bandwidth but 1 Gs/S. They are completely different specifications and do not mean the same thing.

Ms/S still means "mega-samples per second", though.
 

MrChips

Joined Oct 2, 2009
30,810
No, you are not nit-picking.

The proper SI unit for seconds is s.

Accepted practice for 1 giga samples per second is 1Gs/s.
 

DerStrom8

Joined Feb 20, 2011
2,390
Sorry to nit-pick, but are you sure?

Shouldn't there be a little s for seconds?

I will not argue over case for "samples" though, since this is not exactly an SI abbreviation.
I was trying to remember which "s" was used for seconds vs. samples. I think you're right, samples should be "S" and seconds should be "s", in which case that would be MS/s. And actually, I'm beginning to think they often include a small "a" after the S, so it would be MSa/s. That looks more realistic to me....
 

MrChips

Joined Oct 2, 2009
30,810
S is the SI unit for siemens.

However, Tektronix uses 1GS/s, for example.

I think either 1Gs/s or 1GS/s is acceptable when used as sampling rate specification.

Obviously, this would be dimensionally incorrect in an equation.
 

crutschow

Joined Mar 14, 2008
34,452
The minimum sample rate per the Nyquist criterion is twice the highest frequency of interest. Any input frequency above half the sample rate will be "aliased" into the passband as a spurious signal. I have seen this in some early digital oscilloscopes, where a sine wave at a frequency above the scope's sample frequency appeared as a lower-frequency sine wave.

To avoid this aliasing, the sample rate of a good digital scope is much higher than two times the scope's frequency rating (often ten times or more). The input amplifier then low-pass filters the signal so that any signal frequencies above 1/2 the sample rate are so small that they produce a negligible aliased signal on the screen.
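The aliasing crutschow describes can be sketched with assumed round numbers (a 250 MSa/s clock, not any particular scope): a tone above half the sample rate produces the same sample stream as a lower-frequency tone, and that lower frequency is what the display shows.

```python
import math

RATE = 250e6  # assumed 250 MSa/s sample clock

def samples(freq_hz, n=6):
    """First n samples of a unit sine at freq_hz, rounded for display."""
    return [round(math.sin(2 * math.pi * freq_hz * k / RATE), 6)
            for k in range(n)]

# 300 MHz is above the 125 MHz Nyquist limit for this clock, so it
# aliases to |300 - 250| = 50 MHz: both inputs yield matching samples.
print(samples(300e6))
print(samples(50e6))
```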
 