Voltage Divider Voltages Do Not Tally

Thread Starter

sdas86

Joined Mar 18, 2015
26
Hi All,
I have a problem with a voltage divider circuit. The voltages I measured across the two 1 MΩ resistors, added together, do not equal the voltage measured directly across the battery. Please refer to the attached image.

I am doing this because I want to measure a lower voltage on a lower meter range, so my multimeter can show more decimal places. May I know what is wrong?

Thanks
 

Attachments

crutschow

Joined Mar 14, 2008
34,412
It's the input resistance of your voltmeter that's causing the error.
The Thevenin equivalent resistance of your voltage divider is 500 kΩ. When you connect the meter from the divider output to ground, the meter's input resistance forms another voltage divider with that 500 kΩ, which pulls the reading down.

One way to get a perfectly accurate measurement is to use a meter with infinite input resistance.
A low-offset FET-input op amp in a voltage-follower configuration comes close to that ideal.
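
As a rough check of that loading effect, here is a minimal Python sketch. The 9 V battery value and the 10 MΩ meter input resistance are assumptions for illustration (the thread does not give either); only the two 1 MΩ divider resistors come from the schematic.

# Loading of a 1 Mohm / 1 Mohm divider by a meter's finite input resistance.
V_bat = 9.0        # battery voltage (assumed for illustration)
R1 = R2 = 1e6      # the two divider resistors from the schematic
R_meter = 10e6     # typical DMM input resistance (assumed; check your meter's spec)

def parallel(a, b):
    return a * b / (a + b)

v_ideal = V_bat * R2 / (R1 + R2)            # 4.5000 V with an ideal meter
r_low = parallel(R2, R_meter)               # lower leg once the meter is attached
v_read = V_bat * r_low / (R1 + r_low)       # what the meter actually indicates

print(f"ideal {v_ideal:.4f} V, read {v_read:.4f} V, "
      f"error {100 * (v_ideal - v_read) / v_ideal:.2f} %")
# -> ideal 4.5000 V, read 4.2857 V, error 4.76 %

With these assumed numbers the reading comes out almost 5 % low, which is exactly the kind of discrepancy described in the original post.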
 

Alec_t

Joined Sep 17, 2013
14,313
If you used 100 kΩ resistors instead of 1 MΩ resistors, your meter would give a reading closer to what you are expecting, because the meter's own input resistance would have less effect.
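
To put a number on that, here is a small sketch comparing the two cases, again assuming a hypothetical 10 MΩ meter input resistance and a 9 V source (neither value is given in the thread):

# Loading error of an equal-pair divider vs. resistor value, 10 Mohm meter assumed.
V_bat = 9.0
R_meter = 10e6

for r in (1e6, 100e3):                       # 1 Mohm pair vs 100 kohm pair
    r_low = r * R_meter / (r + R_meter)      # lower resistor in parallel with meter
    v_read = V_bat * r_low / (r + r_low)
    err = 100 * (V_bat / 2 - v_read) / (V_bat / 2)
    print(f"{r / 1e3:6.0f} k pair: reads {v_read:.4f} V, error {err:.2f} %")
# 1 Mohm pair   -> about 4.76 % low
# 100 kohm pair -> about 0.50 % low

Dropping the resistor values by a factor of ten cuts the loading error by roughly the same factor, at the cost of drawing more current from the battery.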
 

WBahn

Joined Mar 31, 2012
30,045
sdas86 said:
Hi All,
I have a problem with a voltage divider circuit. The voltages I measured across the two 1 MΩ resistors, added together, do not equal the voltage measured directly across the battery. Please refer to the attached image.

I am doing this because I want to measure a lower voltage on a lower meter range, so my multimeter can show more decimal places. May I know what is wrong?

Thanks
You are chasing a fool's errand.

You are doing this because you want to get another decimal digit with your meter set on a lower range. Let's assume for the moment that your meter is perfect and has infinite input resistance. What does the tolerance of those two resistors need to be in order for that additional digit to have any meaning? The answer is basically that the two resistors must match to better than 0.07% before you get ANY benefit and to get the full benefit they must match to better than about 0.007%. Good luck achieving that.
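
To see why matching matters that much, here is an illustrative sketch of how a mismatch between the two nominally equal resistors shifts the divider ratio away from exactly 0.5 (the mismatch values are just examples, including the two thresholds quoted above):

# Divider ratio error caused by a mismatch between two "equal" resistors.
R_nom = 1e6
for mismatch_pct in (1.0, 0.1, 0.07, 0.007):
    r1 = R_nom * (1 + mismatch_pct / 100)    # one resistor high by the mismatch
    r2 = R_nom
    ratio = r2 / (r1 + r2)                   # ideally exactly 0.5
    err_pct = 100 * (0.5 - ratio) / 0.5
    print(f"mismatch {mismatch_pct:6.3f} % -> ratio error {err_pct:.4f} %")
# A mismatch of d % shifts the ratio by roughly d/2 %, so ordinary 1 % (or even
# 0.1 %) resistors can easily exceed the matching thresholds quoted above.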

Similarly, your two resistors, even if perfectly matched, form a voltage divider with the internal resistance of the source. For that not to affect your measurement at the level you are trying to achieve, the source resistance would have to be less than 144 Ω (not hard to achieve in most cases, though many small batteries, such as coin cells, are well above this).
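
The same kind of quick check works for the source-resistance term; the 2 MΩ figure is the divider total from the schematic, and the source resistances below are just example values:

# Error from the source's internal resistance driving a 2 Mohm divider.
R_div = 2e6                                  # 1 Mohm + 1 Mohm total divider resistance
for r_src in (10, 144, 1000):                # example internal resistances, in ohms
    err_pct = 100 * r_src / (r_src + R_div)  # fraction of the EMF lost across r_src
    print(f"source resistance {r_src:5d} ohm -> divider input low by {err_pct:.4f} %")
# 144 ohm works out to roughly 0.007 %, consistent with the figure quoted above.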

Then there is the issue of the basic accuracy of the meter itself. If you look at the specs, you will probably find something like 0.1 % of reading plus so many digits. That, by itself, likely swamps out any benefit.

And all of this before we even begin to consider the effect of the finite input resistance of the meter.

So I would recommend that you start with identifying your NEEDS (not your WANTS). Do you really NEED to know the battery voltage to better than 0.02%? Why? If you really do, then you are in the realm where you need to pay very close attention to even minor error sources and account for them. If you don't, then don't bother and, more importantly, don't fall into the trap of thinking that you know some piece of information to a greater level of accuracy than you really do.
 