# Voltage Divider Voltage Not Tally

#### sdas86

Joined Mar 18, 2015
26
Hi All,
I have a problem with a voltage divider circuit. The voltages I measured across the two 1 MΩ resistors don't add up to the voltage measured across the battery. Please refer to the attached image.

I am doing this because I want to measure the voltage on a lower range, so my multimeter can show more decimal places. May I know what is wrong?

Thanks


#### ISB123

Joined May 21, 2014
1,236
Could be a problem with tolerances.

#### sdas86

Joined Mar 18, 2015
26
> Could be a problem with tolerances.
Hi,
Do you know how I can get accurate readings using voltage divider?

Thanks.

#### crutschow

Joined Mar 14, 2008
29,791
It's the input resistance of your voltmeter that's causing the error.
The equivalent resistance of your voltage divider is 500kΩ. That causes a voltage drop when you attach the resistance of the meter to ground. You are, in effect, adding another voltage divider.

One way to get a perfectly accurate measurement is to use a meter with infinite input resistance.
A low-offset FET-input op amp in a voltage-follower configuration comes close to that ideal.
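The loading effect described above can be sketched numerically. The specific values here are assumptions for illustration: a 9 V battery, two 1 MΩ divider resistors, and a 10 MΩ meter input resistance (a common DMM value).

```python
# Sketch of how a meter's input resistance loads a high-impedance divider.
# Assumed values: 9 V source, two 1 MΩ resistors, 10 MΩ meter input resistance.

def parallel(a, b):
    """Equivalent resistance of two resistors in parallel."""
    return a * b / (a + b)

v_batt = 9.0
r_top = 1e6
r_bot = 1e6
r_meter = 10e6

# Unloaded divider: exactly half the battery voltage.
v_ideal = v_batt * r_bot / (r_top + r_bot)

# With the meter attached, its input resistance appears in parallel
# with the bottom resistor, forming a second, unintended divider.
r_bot_loaded = parallel(r_bot, r_meter)
v_measured = v_batt * r_bot_loaded / (r_top + r_bot_loaded)

print(f"ideal: {v_ideal:.4f} V, measured: {v_measured:.4f} V, "
      f"error: {100 * (v_ideal - v_measured) / v_ideal:.2f} %")
# → ideal: 4.5000 V, measured: 4.2857 V, error: 4.76 %
```

With these assumed values the reading comes out almost 5 % low, which is far larger than the extra decimal place the divider was meant to buy.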

#### Alec_t

Joined Sep 17, 2013
12,808
If you used 100k resistors instead of 1meg resistors your meter would give a reading closer to what you are expecting, because the meter's own built-in resistance would have less effect.
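A quick comparison shows why the lower-value resistors help. The 10 MΩ meter input resistance is an assumption (a typical DMM figure, not stated in the thread):

```python
# Compare the meter-loading error for an equal-resistor divider built
# from 1 MΩ vs 100 kΩ resistors, assuming a 10 MΩ meter input resistance.

def divider_error(r, r_meter=10e6):
    """Fractional error of a two-equal-resistor divider loaded by r_meter."""
    r_loaded = r * r_meter / (r + r_meter)  # bottom resistor paralleled by meter
    measured = r_loaded / (r + r_loaded)
    return (0.5 - measured) / 0.5           # ideal ratio is exactly 0.5

for r in (1e6, 100e3):
    print(f"{r/1e3:.0f} kΩ resistors: reading {100 * divider_error(r):.3f} % low")
# → 1000 kΩ resistors: reading 4.762 % low
# → 100 kΩ resistors: reading 0.498 % low
```

Dropping the divider impedance by a factor of ten shrinks the loading error by roughly the same factor, at the cost of drawing ten times more current from the battery.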

#### Bordodynov

Joined May 20, 2015
2,993
See

#### Bordodynov

Joined May 20, 2015
2,993
And further


#### ISB123

Joined May 21, 2014
1,236
It's the tolerance.

#### Bordodynov

Joined May 20, 2015
2,993
It's both the scatter in the resistor values and the influence of the voltmeter's input impedance!

#### sdas86

Joined Mar 18, 2015
26
Thanks all for the information. I will try a low-offset FET-input op amp and 100 kΩ resistors as suggested.

#### WBahn

Joined Mar 31, 2012
26,398
> Hi All,
> I have a problem with a voltage divider circuit. The voltages I measured across the two 1 MΩ resistors don't add up to the voltage measured across the battery. Please refer to the attached image.
>
> I am doing this because I want to measure the voltage on a lower range, so my multimeter can show more decimal places. May I know what is wrong?
>
> Thanks
You are chasing a fool's errand.

You are doing this because you want to get another decimal digit with your meter set on a lower range. Let's assume for the moment that your meter is perfect and has infinite input resistance. What does the tolerance of those two resistors need to be in order for that additional digit to have any meaning? The answer is basically that the two resistors must match to better than 0.07% before you get ANY benefit and to get the full benefit they must match to better than about 0.007%. Good luck achieving that.

Similarly, your two resistors, even if perfectly matched, form a voltage divider with the internal resistance of the source. In order for that not to affect your measurement at the level you are trying to achieve the source resistance would have to be less than 144 Ω (which is not too hard to achieve in most cases, though many small batteries, such as coin cells, are well above this).
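One plausible reconstruction of the 144 Ω figure (my assumption, not spelled out in the post): the source resistance forms yet another divider with the 2 MΩ total divider resistance, and keeping that error below roughly 0.0072 % bounds the source resistance:

```python
# Reconstruction (an assumption): maximum source resistance such that the
# extra divider it forms with the 2 MΩ total resistance stays below ~0.0072 %.
r_divider = 2e6            # two 1 MΩ resistors in series
max_frac_error = 0.000072  # ~0.0072 %, assumed error budget

# error = r_s / (r_s + r_divider)  →  solve for r_s
r_source_max = max_frac_error * r_divider / (1 - max_frac_error)
print(f"max source resistance ≈ {r_source_max:.0f} Ω")
# → max source resistance ≈ 144 Ω
```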

Then there is the issue of the basic accuracy of the meter itself. If you look at the specs you will probably find something like it being 0.1% plus so many digits. That, by itself, likely swamps out any benefit.

And all of this before we even begin to consider the effect of the finite input resistance of the meter.

So I would recommend that you start with identifying your NEEDS (not your WANTS). Do you really NEED to know the battery voltage to better than 0.02%? Why? If you really do, then you are in the realm where you need to pay very close attention to even minor error sources and account for them. If you don't, then don't bother and, more importantly, don't fall into the trap of thinking that you know some piece of information to a greater level of accuracy than you really do.