When setting up a function generator to output 10 kHz (just as an example), running it through a breadboard, and connecting it to an oscilloscope with a probe: I adjust the Time/Div on the scope to obtain an accurate reading, then switch either the probe or the scope to 10x and readjust the Time/Div. The only difference should be in the time measurement, correct?

We are currently working on the basic fundamentals of the oscilloscope, function generator, and probe. Trying different frequencies, we are supposed to change the Time/Div adjustment and use both the 1x and 10x settings of the probe, then calculate the time (period) and frequency of the waveform. With the probe set to 10x, the formula for frequency should be the measured time divided by 10, then 1 over that calculation, right?

I'm coming out with the same frequency as I obtained from the 1x setting on the probe, which is what I assume we are supposed to get, but then we are asked to explain any differences in the time and/or frequency measurements. Again, the only thing that should be different is the time, and the frequency calculation should stay the same, right?
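To make the arithmetic concrete, here is a minimal worked sketch of the calculation described above. The numbers are illustrative, not actual bench readings; it just assumes an ideal 10 kHz signal and applies the divide-by-10 step from the lab procedure:

```python
# Frequency from an on-screen time reading, at 1x and at 10x.
# Illustrative values only -- not actual bench measurements.

time_1x = 100e-6          # one full cycle read off the screen at 1x: 100 us
freq_1x = 1.0 / time_1x   # f = 1/T -> 10000 Hz

# Per the lab procedure described above, the on-screen time reading
# at the 10x setting is divided by 10 before taking the reciprocal.
time_10x_reading = 1000e-6                   # on-screen reading at 10x: 1000 us
freq_10x = 1.0 / (time_10x_reading / 10.0)   # also 10000 Hz

print(freq_1x, freq_10x)  # 10000.0 10000.0 -- same frequency either way
```

So with those readings, the time measurement changes but the frequency comes out identical, which matches what I'm seeing on the bench.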