# Frequency and Timing measurements on an oscilloscope

Discussion in 'General Electronics Chat' started by paul_alan, Nov 25, 2011.

1. ### paul_alan Thread Starter Member

When setting a function generator up to output 10 kHz (just as an example), running it through a breadboard, and connecting to an oscilloscope with a probe: I adjust the Time/Div on the scope to obtain an accurate reading, then switch either the probe or the scope to 10x and readjust the Time/Div. The only difference should be in the time measurement, correct?

We are currently working on the basic fundamentals of the oscilloscope, function generator, and probe. Trying different frequencies, we are supposed to change the Time/Div adjustment and the 1x and 10x settings of the probe, then calculate the time and frequency of the waveform. When setting the probe to 10x, the formula for frequency should be: time divided by 10, then 1 over that calculation, right? I'm coming out with the same frequency calculation as what I obtained from the 1x setting on the probe, which is what I assume we are supposed to be getting, but then we are being asked to explain any differences in the time and/or frequency measurements. Again, the only thing that should be different is the time, and the frequency calculation should remain the same, right?

2. ### Jotto Member

1 over time = freq
1 over freq = time

A 10x probe only attenuates the signal, letting you display a waveform on your scope whose amplitude is larger than you could fit on screen at 1x. It doesn't change anything about the timing measurement.
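To make that arithmetic concrete, here's a minimal Python sketch. The division counts, time/div, and volts/div below are made-up example readings for a 10 kHz signal, not anyone's actual measurements:

```python
def period_from_screen(divisions, time_per_div):
    """Period = horizontal divisions spanned by one cycle x time/div."""
    return divisions * time_per_div

# Same waveform, same time base, probe at 1x and then at 10x
# (hypothetical readings):
t_1x  = period_from_screen(5.0, 20e-6)   # 5 div * 20 us/div = 100 us
t_10x = period_from_screen(5.0, 20e-6)   # identical: 10x scales voltage only

f_1x  = 1.0 / t_1x    # 10 kHz
f_10x = 1.0 / t_10x   # still 10 kHz; there is no "divide time by 10" step

# Only the vertical reading changes: multiply by 10 on a 10x probe.
v_divs        = 3.2                       # divisions peak-to-peak on screen
volts_per_div = 0.5
v_actual_1x   = v_divs * volts_per_div         # 1.6 V
v_actual_10x  = v_divs * volts_per_div * 10.0  # 16 V
```

If the calculated frequency at 10x came out a factor of 10 different from the 1x value, the extra divide-by-10 was mistakenly applied to the time base rather than the vertical scale.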

3. ### Adjuster Well-Known Member

The 1X and 10X probes normally used with oscilloscopes scale amplitude (level) only. Timing is not altered by this, other than by differences in frequency response and capacitive loading on the circuit under test, effects which may be negligible at the frequencies you are using.

4. ### Jotto Member

Actually it could if the probe is out of cal. Use the cal output on the scope and you can tell if there is a difference when measured.

5. ### BillB3857 Senior Member

Even if the probe is out of calibration, it would not change any time measurement. If the compensation were out of adjustment, leading and trailing edges of the calibration square wave would either be over-peaked or rounded. Again, time between cycles would not change due to probe calibration/compensation.

6. ### Jotto Member

Guess someone needs to read what was said; I did say that the only thing changed is the amplitude.

I am talking about the probe. I don't know the kind you use, but the kind we use has two settings, X1 and X10.

Yes, you are correct, time is not changed.

7. ### Yako New Member

You are talking about a trimmer capacitor adjustment on the probe, correct?

8. ### Yako New Member

• ###### cro review.pdf
9. ### Jotto Member

My scope is a bit more than most have. I don't have to worry about frequency or time because mine has a readout for them.

Mine has delta measurements, so you don't have to do much but read.

I do have to read elliptical waveforms; amplitude is important for some equipment and is an X and Y function. I check/calibrate my equipment at least once a week.

10. ### Yako New Member

Bit more than most have.

11. ### Jotto Member

My first scope was a Hewlett Packard. I got it for free because it was broken, or should I say broken on purpose. I don't know if you remember the metal-case transistors that would have voltage riding on the case. Someone shorted two of them together and took out a resistor.

The person who brought me the scope also had a bunch of scrap boards he gave me, and one of them had the resistor I needed.

It was a good scope; the only problem was it was a 10 MHz scope.

12. ### Yako New Member

I think a DSO for some reverse engineering would be nice. These protocols for some equipment would be so much easier to figure out.

13. ### thatoneguy AAC Fanatic!

Correcting minor grammatical mistakes is a tad on the rude side.

Also, please multiquote posts using the "+" button at the lower right of each post so you can respond to all of them with one reply.

Posting separate replies to each part of a post, or to several posts, just to get a higher post count is a bit annoying to people trying to get useful information from the thread. This is something you've done in every thread, but I'm only mentioning it here out of politeness.

14. ### BillB3857 Senior Member

Yes. The trimmer cap compensates for differences between what the probe presents and what the scope's input needs, so the divider ratio stays flat across the scope's bandwidth.

15. ### MrChips Moderator

btw, the x10 scope probe does more than increase the input impedance from 1M to 10M.
The x10 probe presents a lower DC load and a lower AC load (with lower input capacitance). It also allows you to measure higher voltages.
With the extra signal to spare, it can extend the frequency response of the measuring system, for example from 15MHz to 60MHz.
Hence it is important to adjust the trimmer capacitor in order to provide a flat response over the extended range.

www.syscompdesign.com/AppNotes/probes.pdf

• ###### probes.pdf
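That compensation can be sketched numerically: a 10x probe is an RC divider, and its ratio is only flat across frequency when the two RC products match. The component values below are typical assumed figures, not taken from any particular probe:

```python
# Typical 10x passive probe divider (all values assumed for illustration)
R1 = 9e6     # ohms: series resistance in the probe tip
R2 = 1e6     # ohms: scope input resistance
C2 = 90e-12  # farads: scope input + cable capacitance (assumed)

# DC attenuation is a plain resistive divider: 1M / (9M + 1M) = 1/10
atten_dc = R2 / (R1 + R2)

# The trimmer cap C1 across R1 is adjusted until the time constants
# match (R1*C1 == R2*C2), making the ratio frequency-independent:
C1 = R2 * C2 / R1    # 10 pF with these assumed values

# A mis-set C1 is what produces the over-peaked or rounded square-wave
# edges seen on the scope's cal output.
```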
16. ### CraigHB Member

I tend to run my probes at 10x unless I need the higher sensitivity. It reduces the distortion caused by probe loading. A lot of the time you're measuring signals that drive high-impedance CMOS inputs, where probe impedance can cause a bogus measurement.
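A rough DC-only sketch of that loading effect, with an assumed 100 k source impedance (the numbers are illustrative, not measurements):

```python
R_source = 100e3   # ohms: assumed impedance of the node being probed

def measured_fraction(R_probe):
    """Fraction of the unloaded voltage the scope actually sees (DC)."""
    return R_probe / (R_source + R_probe)

err_1x  = 1.0 - measured_fraction(1e6)    # ~9% reading error at 1x (1 Mohm)
err_10x = 1.0 - measured_fraction(10e6)   # ~1% reading error at 10x (10 Mohm)
```

At higher frequencies the probe's input capacitance dominates instead, which is another reason the 10x position (much lower tip capacitance) loads the circuit less.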

17. ### MrChips Moderator

I agree. I ALWAYS use my probes in x10.