Really simple multimeter question

Thread Starter


Joined Mar 1, 2009
It's always kind of thrown me off, so just to be sure I'm measuring correctly...

Using a multimeter to measure current, with the dial set to 200mA, the digital readout says 06.7
Does that mean 67mA?


Joined Oct 24, 2009
I assume you're talking about using a current clamp with your multimeter? If so, that would depend on the output specs of your current probe. Most of the probes I've dealt with put out 1 mA for every 1A measured (1mA/A). I've also seen 100 millivolts/A, 10 mV/A, and others.

Assuming a 1000:1 ratio (1mA/A) current probe, if your meter is on any mA range, you simply read the number as amps. In your case, the 6.7 on the display would mean 6.7 A of actual current.
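A quick sketch of that conversion, assuming the 1 mA/A probe ratio mentioned above (check your probe's datasheet for the real figure):

```python
# Convert a clamp-probe meter reading to actual current.
# The 1 mA-per-A ratio is an assumption from the post above,
# not a universal spec - other probes use 100 mV/A, 10 mV/A, etc.

def clamp_current_amps(meter_reading_ma, probe_ma_per_a=1.0):
    """Actual current in amps, given the meter reading in mA."""
    return meter_reading_ma / probe_ma_per_a

print(clamp_current_amps(6.7))  # meter shows 6.7 mA -> 6.7 A measured
```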


Joined Jul 17, 2007
On 3-1/2 digit meters, the 200mA range setting will measure from 0.0 to 199.9mA.

The display is in mA.

When possible, you're better off measuring voltage across a known resistance than measuring current directly. When measuring current, if you accidentally touch the probes across a low-impedance source, you will instantly blow the meter's fuse. :(

I have a precision 1 Ohm 50 Watt resistor that I use for such measurements. Since I = E/R, with R = 1 Ohm the voltage reading across the resistor translates directly to current in Amperes.
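The shunt-resistor arithmetic is just Ohm's law. A minimal sketch, assuming the 1 Ohm / 50 W resistor described above:

```python
# Shunt measurement via Ohm's law: I = E / R.
# With a 1-ohm shunt, volts read directly as amps.

SHUNT_OHMS = 1.0   # precision shunt value (from the post above)
RATING_W = 50.0    # resistor's power rating (from the post above)

def shunt_current(volts, shunt_ohms=SHUNT_OHMS):
    """Current in amps from the voltage measured across the shunt."""
    return volts / shunt_ohms

def shunt_dissipation(current_a, shunt_ohms=SHUNT_OHMS):
    """Power burned in the shunt (P = I^2 * R); keep it under RATING_W."""
    return current_a ** 2 * shunt_ohms

print(shunt_current(0.0067))       # 6.7 mV across 1 ohm -> 0.0067 A (6.7 mA)
print(shunt_dissipation(6.7))      # even 6.7 A is well under the 50 W rating
```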


Joined Oct 24, 2009
On the other hand, if you're measuring current inline, then what you see is what you get. 6.7 on a 200mA range is 6.7mA. On a 20mA range, you get more resolution (you might instead read 6.73mA). On a 2A range, you can measure higher currents, but with less resolution - 6.7mA would probably show as 0.006 or 0.007.
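The range/resolution trade-off can be sketched like this, assuming a 3-1/2 digit meter (max count 1999) and the ranges mentioned above:

```python
# How a 3-1/2 digit meter (max count 1999) displays the same current
# on different ranges. The ranges here are assumptions for illustration.

def displayed_value(reading_ma, full_scale_ma):
    """Quantize a reading (mA) to what the meter can show on this range."""
    step = full_scale_ma / 2000      # smallest displayable step (one count)
    counts = round(reading_ma / step)
    return counts * step             # value in mA after quantization

print(displayed_value(6.7, 20))      # 20 mA range: one extra digit of resolution
print(displayed_value(6.7, 200))     # 200 mA range: reads 6.7
print(displayed_value(6.7, 2000))    # 2 A range: 1 mA steps, rounds to 7.0
```

Higher ranges widen the measurable span but coarsen the step size, which is why 6.7 mA lands on a whole milliamp on the 2 A range.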