Frequency and Timing measurements on an oscilloscope

Thread Starter

paul_alan

Joined Nov 5, 2011
43
Say I set a function generator to output 10 kHz (just as an example), run it through a breadboard, and connect it to an oscilloscope with a probe. I adjust the Time/Div on the scope to get an accurate reading, then switch either the probe or the scope to 10x and readjust the Time/Div. The only difference should be in the time measurement, correct?

We are currently working on the basic fundamentals of the oscilloscope, function generator, and probe. Trying different frequencies, we are supposed to change the Time/Div setting, use both the 1x and 10x settings of the probe, and then calculate the time (period) and frequency of the waveform. When the probe is set to 10x, the formula for frequency should be the time divided by 10, then 1 over that calculation, right?

I'm coming out with the same frequency from the 10x setting as I obtained from the 1x setting on the probe, which I assume is what we are supposed to be getting, but then we are asked to explain any differences in the time and/or frequency measurements. Again, the only thing that should be different is the time, and the frequency calculation should remain the same, right?
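A minimal numeric sketch of the period-to-frequency arithmetic in question (only the 10 kHz figure comes from the post; everything else is an assumed illustration):

```python
# Sanity check: frequency is the reciprocal of the period read off the screen.
# Only the 10 kHz example value from the post is used; the rest is illustrative.

period_s = 100e-6          # period measured at 1x, e.g. 100 us for a 10 kHz signal

# The 1x/10x probe switch attenuates amplitude, so the same waveform shows
# the same period at either setting, and the frequency works out the same.
freq_1x  = 1 / period_s    # 10000.0 Hz
freq_10x = 1 / period_s    # 10000.0 Hz, identical

print(freq_1x, freq_10x)
```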
 

Jotto

Joined Apr 1, 2011
151
1 over time = freq
1 over freq = time

Using a 10x probe just lets you see a signal on your scope when its amplitude is more than you can view with the probe at 1x. It doesn't change anything about the measurement.
 

Adjuster

Joined Dec 26, 2010
2,148
The 1X and 10X probes normally used with oscilloscopes scale amplitude (level) only. Timing is not altered by this, other than by differences in frequency response and capacitive loading on the circuit under test, effects which may be negligible at the frequencies you are using.
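A minimal sketch of that amplitude-only scaling (the volts/div, time/div, and division counts below are made-up illustration values, not from this thread):

```python
# Illustrative only: a 10x probe divides the signal by 10 before it reaches the
# scope, so if the scope isn't told about the probe factor you multiply the
# on-screen reading by 10 to recover the real amplitude. The horizontal (time)
# reading is untouched.

probe_factor = 10          # 10x probe (use 1 for a 1x probe)
volts_per_div = 0.5        # hypothetical vertical setting
vertical_divs = 4.0        # hypothetical peak-to-peak height on screen
actual_vpp = vertical_divs * volts_per_div * probe_factor   # 20 Vpp

time_per_div = 20e-6       # hypothetical horizontal setting, 20 us/div
horizontal_divs = 5.0      # one full cycle spans 5 divisions
period = horizontal_divs * time_per_div                     # 100 us
frequency = 1 / period                                      # 10 kHz either way

print(actual_vpp, period, frequency)
```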
 

Jotto

Joined Apr 1, 2011
151
Actually, it could if the probe is out of cal. Use the cal signal on the scope and you can tell if there is a difference when measured.
 

BillB3857

Joined Feb 28, 2009
2,570
Even if the probe is out of calibration, it would not change any time measurement. If the compensation were out of adjustment, leading and trailing edges of the calibration square wave would either be over-peaked or rounded. Again, time between cycles would not change due to probe calibration/compensation.
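A small sketch of what "over-peaked" versus "rounded" means on the cal square wave (the component values are typical assumed figures, not from this thread): right at the edge the probe acts as a capacitive divider, and long after the edge it acts as a resistive divider, so if the two ratios differ the edge overshoots or sags while the spacing between cycles stays the same.

```python
# Edge behavior of a 10x probe for different compensation settings.
# All component values are assumed, typical-textbook numbers.

r_probe, r_scope = 9e6, 1e6        # typical 10x probe tip and scope input
c_scope = 110e-12                  # assumed scope + cable capacitance

settled = r_scope / (r_probe + r_scope)          # resistive ratio, 0.100

for c_probe in (10e-12, 12.2e-12, 15e-12):       # under / correct / over
    edge = c_probe / (c_probe + c_scope)         # capacitive ratio right at the edge
    if abs(edge - settled) < 1e-3:
        symptom = "flat (compensated)"
    elif edge < settled:
        symptom = "rounded (under-compensated)"
    else:
        symptom = "over-peaked (over-compensated)"
    print(f"C_probe = {c_probe * 1e12:4.1f} pF: edge ratio {edge:.3f} -> {symptom}")
```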
 

Jotto

Joined Apr 1, 2011
151
Guess someone needs to read what was said; I did say that the only thing changed is amplitude.

I'm talking about the probe. I don't know the kind you use, but the kind we use has two adjustments, X1 and X10.

Yes, you are correct, time is not changed.
 

Yako

Joined Nov 24, 2011
245
Even if the probe is out of calibration, it would not change any time measurement. If the compensation were out of adjustment, leading and trailing edges of the calibration square wave would either be over-peaked or rounded. Again, time between cycles would not change due to probe calibration/compensation.
You are talking about a trimmer capacitor adjustment on the probe, correct?
 

Jotto

Joined Apr 1, 2011
151
My scope is a bit more than most have. I don't have to worry about freq or time because mine has a readout for them.

Mine has delta measurements, so you don't have to do much but read.

I do have to read elliptical waveforms; amplitude is important for some equipment and is an X-Y function. I check/calibrate my equipment at least once a week.
 

Jotto

Joined Apr 1, 2011
151
My first scope was a Hewlett Packard. I got it for free because it was broken, or should I say it was broken on purpose. I don't know if you remember the metal-case transistors that would have voltage riding on the case; someone shorted two of them together and took out a resistor.

The person who brought me the scope also gave me a bunch of scrap boards, and one of them had the resistor I needed.

It was a good scope; the only problem was that it was only a 10 meg (10 MHz) scope.
 

Yako

Joined Nov 24, 2011
245
I think a DSO would be nice for some reverse engineering. The protocols for some equipment would be so much easier to figure out.
 

thatoneguy

Joined Feb 19, 2009
6,359
Bit more than most have.
Correcting minor grammatical mistakes is a tad on the rude side.

Also, please multiquote posts using the "+" button at the lower right of each post so you can respond to all of them with one reply.

Posting separate replies to each part of a post, or to several posts, just to get a higher post count is a bit annoying to people trying to get useful information from the thread. This is something you've done in every thread, but I'm only mentioning it here out of politeness.
 

BillB3857

Joined Feb 28, 2009
2,570
You are talking about a trimmer capacitor adjustment on the probe, correct?
Yes. The trimmer cap compensates for any difference between the capacitance the probe presents and what the scope input needs to see, so the coupling stays flat across the bandwidth.
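A one-line version of that balancing condition (component values are typical assumed figures, not from this thread):

```python
# A 10x probe divider stays flat when the probe-side RC matches the scope-side RC:
#   r_probe * c_probe == r_scope * c_scope_plus_cable
# All numbers below are assumed, typical values.

r_probe, r_scope = 9e6, 1e6
c_scope_plus_cable = 110e-12

c_probe_needed = r_scope * c_scope_plus_cable / r_probe
print(f"trimmer should end up near {c_probe_needed * 1e12:.1f} pF")   # ~12.2 pF
```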
 

MrChips

Joined Oct 2, 2009
30,720
btw, the x10 scope probe does more than increase the input impedance from 1M to 10M.
The x10 probe presents a lower DC load and a lower AC load (with lower input capacitance). It also allows you to measure higher voltages.
With the extra signal to spare, it can extend the frequency response of the measuring system, for example from 15MHz to 60MHz.
Hence it is important to adjust the trimmer capacitor in order to provide a flat response over the extended range.

www.syscompdesign.com/AppNotes/probes.pdf
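A rough sketch of those loading numbers (values assumed from typical probe specs, not taken from the linked app note):

```python
# Why a 10x probe is a lighter load: the 9 Mohm tip resistor plus the scope's
# 1 Mohm input gives a 10 Mohm DC load and a 10:1 divider, and the series
# compensation cap cuts the capacitance seen at the tip by roughly the same
# factor. All values are typical assumed figures.

r_tip, r_scope = 9e6, 1e6
c_scope_cable = 110e-12                     # assumed scope + cable capacitance
c_comp = r_scope * c_scope_cable / r_tip    # ~12.2 pF compensation cap

r_input_dc = r_tip + r_scope                          # 10 Mohm DC load
divider = r_scope / (r_tip + r_scope)                 # 0.1 -> 10:1 attenuation
c_at_tip = (c_comp * c_scope_cable) / (c_comp + c_scope_cable)   # ~11 pF at the tip

print(r_input_dc, divider, round(c_at_tip * 1e12, 1))
```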
 

CraigHB

Joined Aug 12, 2011
127
I tend to run my probes in 10x unless I need the higher sensitivity. It reduces the distortion caused by probe loading. A lot of times you're measuring signals that drive high impedance CMOS stuff where probe impedance can cause a bogus measurement.
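A rough illustration of that loading effect (the source impedance and probe resistances below are assumed example numbers, not CraigHB's):

```python
# Probe loading: the probe's input resistance forms a divider with the source
# impedance of the node under test. All numbers are assumed illustration values.

r_source = 100e3         # hypothetical high-impedance CMOS node (100 kohm)
r_probe_1x = 1e6         # typical 1x probe / scope input resistance
r_probe_10x = 10e6       # typical 10x probe input resistance

def seen_fraction(r_src, r_probe):
    """Fraction of the true node voltage the scope actually reads."""
    return r_probe / (r_src + r_probe)

print(f"1x:  {seen_fraction(r_source, r_probe_1x):.1%}")    # ~90.9% of true value
print(f"10x: {seen_fraction(r_source, r_probe_10x):.1%}")   # ~99.0% of true value
```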
 