Influence of the oscilloscope's impedance on voltage measurements

Thread Starter

victoria_256

Joined Feb 22, 2025
6
Hi everyone! I am measuring a voltage divider with R₁ = 500 kΩ and R₂ = 1 MΩ, using a sinusoidal input signal Vₑ with an RMS voltage of 3V. According to calculations, the expected RMS voltage at the midpoint of the divider should be 2V, but when measuring with an oscilloscope, I only get 1 mV RMS.

Could this large discrepancy be caused by the oscilloscope's input impedance? In what cases does the measurement device's impedance significantly affect the measurement of high-impedance circuits?
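For reference, the expected midpoint voltage follows from the standard divider formula; a minimal sketch assuming ideal resistors and no load on the midpoint:

```python
# Unloaded voltage divider: Vout = Vin * R2 / (R1 + R2)
R1 = 500e3    # 500 kOhm (top resistor)
R2 = 1e6      # 1 MOhm (bottom resistor, measured across)
Vin_rms = 3.0  # source, V RMS

Vout_rms = Vin_rms * R2 / (R1 + R2)
print(f"Expected unloaded output: {Vout_rms:.2f} V RMS")  # 2.00 V RMS
```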
 

Ian0

Joined Aug 7, 2020
13,097
Are you using a x1 probe or a x10 probe?
Every oscilloscope I've ever used has a 1MΩ input impedance. So it is 10MΩ with the x10 probe.
If you are only getting 1mV then the input impedance is not the problem regardless of whether you have it set to AC or DC.
Does it measure the 3V correctly? If not, do you have the input set to ground?

 

Thread Starter

victoria_256

Joined Feb 22, 2025
6
Are you using a x1 probe or a x10 probe?
Every oscilloscope I've ever used has a 1MΩ input impedance. So it is 10MΩ with the x10 probe.
If you are only getting 1mV then the input impedance is not the problem regardless of whether you have it set to AC or DC.
Does it measure the 3V correctly? If not, do you have the input set to ground?
x1. When I connect the probe to the source, the measurement is correct.
 

Ian0

Joined Aug 7, 2020
13,097
x1. When I connect the probe to the source, the measurement is correct.
I'm sure you can work out what the signal voltage should be when the scope puts 1MΩ in parallel with your 1MΩ resistor.
I'm also assuming that this isn't at some high frequency where the capacitance of the scope and its cables are important.
Therefore, either your 1MΩ resistor isn't 1MΩ or something has become disconnected, or your earthing clip is connected in the wrong place.
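To make the loading point concrete: a sketch of what the x1 probe's 1 MΩ input resistance in parallel with R₂ would actually do to the reading (assuming a purely resistive scope input, as Ian0 notes):

```python
# Scope loading: the 1 MOhm input resistance appears in parallel with R2
def parallel(a, b):
    """Equivalent resistance of two resistors in parallel."""
    return a * b / (a + b)

R1, R2 = 500e3, 1e6
R_scope = 1e6              # typical x1 probe/scope input resistance
Vin_rms = 3.0

R2_loaded = parallel(R2, R_scope)                    # 500 kOhm
Vout_loaded = Vin_rms * R2_loaded / (R1 + R2_loaded)
print(f"With scope loading: {Vout_loaded:.2f} V RMS")  # 1.50 V RMS
```

So even with the scope's loading accounted for, the reading should drop from 2 V to about 1.5 V RMS, not to 1 mV, which is why the fault must lie elsewhere (wrong resistor value, a broken connection, or a misplaced ground clip).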
 