How to correctly read amps on a Digital Multimeter

Thread Starter

jack63ss

Joined Dec 10, 2020
1
So I am trying to determine the amperage on a battery-powered circuit. I have 3 AA batteries in series, and when I check them with my new multimeter it shows 0.01-0.02 when set to read 10A, or 21.9 when set to read 200m. So in relation to A, what is it? Is it really 100-200mA? If it was 1 amp I would have expected 10A to show 0.10 (1/10 of 10) and 200m to read off the scale. I am trying to figure this out because I would like to replace the 3 batteries with a 5V USB power supply and need to figure out how much to step it down. This is my first digital meter and I want to make sure I am reading it correctly.
Thanks
 

paulktreg

Joined Jun 2, 2008
833
You won't get a very accurate reading on the 10A range because the current you're measuring is so small compared to full scale. When set to 200m, that's 200mA full scale, so you are reading 21.9mA.
 

WBahn

Joined Mar 31, 2012
29,932
Assuming you have it hooked up properly, then you just read it directly. On the 10 A scale it means that what you read is in amperes and it will measure up to 10 A. If you put it on the 200 mA scale, then what it reads is in milliamperes and it will read up to 200 mA. So your first reading is at the limits of the resolution on that range and is in the 10 mA to 20 mA range. On the 200 mA range you are reading 21.9 mA. The two readings are consistent with each other, all things considered. The series resistance of the meter is different on the two ranges, so that will affect the readings.
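To put numbers on that, here's a small Python sketch using the two readings from this thread; all it does is apply the unit of the selected range to the displayed value, nothing meter-specific:

```python
# The displayed number is in the unit of the selected range, so both readings
# from this thread describe roughly the same ~20 mA current.

def display_to_amps(display_value, range_unit):
    """Convert the number shown on the meter into amperes for the selected range."""
    scale = {"A": 1.0, "mA": 1e-3}
    return display_value * scale[range_unit]

print(f"{display_to_amps(0.02, 'A'):.4f} A")   # 10 A range reading   -> 0.0200 A (about 20 mA)
print(f"{display_to_amps(21.9, 'mA'):.4f} A")  # 200 mA range reading -> 0.0219 A (21.9 mA)
```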
 

twohats

Joined Oct 28, 2015
442
When you have finished measuring current, make sure you move the meter leads back to the correct jacks.
If you leave them plugged into the current jack and then take a voltage reading, you could kill your new meter.
Good luck...........
 

PhilTilson

Joined Nov 29, 2009
131
If it was 1 amp I would have expected 10A to show 0.10 (1/10 of 10) and 200m to read off the scale.
I think you have a fundamental misunderstanding here.

When you select the 10A range, it means your meter will be displaying the current in Amps, up to a MAXIMUM of 10 Amps. So if it reads 3.00 then that's a current of 3 Amps. If it reads 0.10, that's 0.1 Amps or 100 milliamps (mA). On the 200mA scale, you are displaying milliamps so, as you suggest, a current of 1A would be 'off the scale'. It would also be in danger of blowing the fuse that is normally present in the current-reading circuits in a multimeter, so always work down the scales (10A, 200mA, 20mA, 2mA, etc.) if you don't know roughly what current you are measuring!

So your readings are consistent. On the 10A scale, a reading fluctuating between 0.01 and 0.02 indicates a current of between 10 and 20mA. The meter fluctuates because there is an intrinsic error of at least one digit when you make a measurement (for example, a true voltage of 5.00 volts could actually show as anything between 4.99 and 5.01 volts). So trying to measure 20mA on the 10A scale will be very inaccurate. Your mA scale reading of 21.9 suggests the true current is between 21.8mA and 22.0 mA (assuming your meter is accurate to one digit - it's probably not quite as good as that - check the specifications!).
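Here's a quick Python sketch of that, assuming a 3-1/2 digit display (so 0.01 A per count on the 10A range and 0.1 mA per count on the 200mA range); the current value is the one from this thread:

```python
# Why the 10 A range can't resolve ~22 mA: each range quantizes to its
# least-significant count, and the displayed value can wander by roughly
# +/- 1 count on top of that.

def possible_display_band(true_amps, count_size_amps):
    """(low, high) band of values, in amps, the display might show for a
    given true current, allowing +/- 1 count of uncertainty."""
    counts = round(true_amps / count_size_amps)
    return (counts - 1) * count_size_amps, (counts + 1) * count_size_amps

true_current = 0.0219  # 21.9 mA, taken from the 200 mA range reading

for label, count_size in [("10 A range, 0.01 A per count  ", 0.01),
                          ("200 mA range, 0.1 mA per count", 0.0001)]:
    low, high = possible_display_band(true_current, count_size)
    print(f"{label}: {low * 1000:.1f} mA to {high * 1000:.1f} mA")
# 10 A range, 0.01 A per count  : 10.0 mA to 30.0 mA
# 200 mA range, 0.1 mA per count: 21.8 mA to 22.0 mA
```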
 

Deleted member 115935

Joined Dec 31, 1969
0
Can you show a sketch of how you're connecting the meter in the circuit?
Voltage you measure across a load; current you measure in series with the load.
 

OldTech

Joined Jul 24, 2009
8
So I am trying to determine the amperage on a battery-powered circuit. I have 3 AA batteries in series, and when I check them with my new multimeter it shows 0.01-0.02 when set to read 10A, or 21.9 when set to read 200m. So in relation to A, what is it? Is it really 100-200mA? If it was 1 amp I would have expected 10A to show 0.10 (1/10 of 10) and 200m to read off the scale. I am trying to figure this out because I would like to replace the 3 batteries with a 5V USB power supply and need to figure out how much to step it down. This is my first digital meter and I want to make sure I am reading it correctly.
Thanks
Jack,
Your new multimeter has a measurement accuracy specification for each function and range. For instance, on the 10A current range, the specification might be something like +/-1% of full scale plus 2 digits. In other words (assuming your meter is a 3-1/2 digit meter), on the 10A range, even while measuring exactly 10.00A, the meter could display a reading anywhere between 9.88 and 10.12, a measurement error of +/-0.12A, and still be within its specification. That same error applies to every current you measure on that range, which means the lower the reading on a particular range, the less accurate it becomes relative to the value you're measuring. It's not a flaw in your meter, it's just the way accuracy specifications work.
You have to look at your meter's specifications and do the math to use it effectively and get the most accurate results. In general, you want to use the range that displays the most digits to ensure that you're getting the most accurate reading.
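Here's a quick Python sketch of that arithmetic; the 1% of full scale and 2 digit figures are just the hypothetical spec from the example above, not any particular meter's datasheet numbers:

```python
# "% of full scale + digits" accuracy arithmetic for a 3-1/2 digit meter on
# its 10 A range (full scale 10.00 A, 0.01 A per count). Hypothetical spec.

def error_band(reading, full_scale, pct_of_full_scale, digits, count_size):
    """(low, high) bounds on the true value for a given displayed reading."""
    err = full_scale * pct_of_full_scale / 100 + digits * count_size
    return reading - err, reading + err

for shown in (10.00, 0.02):
    low, high = error_band(shown, full_scale=10.00, pct_of_full_scale=1,
                           digits=2, count_size=0.01)
    print(f"display {shown:.2f} A -> true value somewhere in {low:.2f} to {high:.2f} A")
# display 10.00 A -> true value somewhere in 9.88 to 10.12 A
# display 0.02 A -> true value somewhere in -0.10 to 0.14 A
```

The second line is why a ~20 mA current read on the 10A range tells you very little under a spec like this.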
Cheers,
DaveM
 

AA+

Joined Jan 4, 2014
5
Also note the burden voltage that the ammeter places across the circuit. A perfect ammeter would drop 0 V (0 ohms in series), but even some high-quality digital ammeters drop quite a significant voltage, which reduces the current actually drawn from low-voltage batteries.
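Here's a rough Python sketch of that effect with assumed round numbers (a 4.5 V pack, a 200 ohm load, and a 10 ohm meter shunt; none of these come from a real datasheet, so check your own meter's burden spec):

```python
# Burden effect on the mA range: the meter's shunt resistance adds in series
# with the load, so the circuit draws slightly less current than it would
# without the meter. All values below are illustrative assumptions.

V_BATT = 4.5      # three AA cells in series, volts
R_LOAD = 200.0    # hypothetical load drawing about 22 mA at 4.5 V, ohms
R_SHUNT = 10.0    # assumed shunt-plus-fuse resistance of the meter's mA range, ohms

true_current = V_BATT / R_LOAD                  # current with no meter in the loop
measured_current = V_BATT / (R_LOAD + R_SHUNT)  # current with the meter in series
burden_voltage = measured_current * R_SHUNT     # voltage the meter itself drops

print(f"without meter:  {true_current * 1000:.1f} mA")      # 22.5 mA
print(f"with meter:     {measured_current * 1000:.1f} mA")  # 21.4 mA
print(f"burden voltage: {burden_voltage:.2f} V")            # 0.21 V
```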
Best wishes --- Allen Anway
 