Meter calibration

Discussion in 'General Electronics Chat' started by steveparrott, Feb 14, 2006.

1. steveparrott Thread Starter Active Member

I have some questions regarding the use of volt/ohm/amp meters for low voltage work (12V AC).

1. If my meter is calibrated at 120V and has an accuracy of a certain percentage, what is the accuracy of the meter at 12V?

For example, if I have a possible error of +/- 2 volts at 120V, do I still have a potential 2 volt error at 12V? Or is it 0.2V?

2. Does this error percentage (as the range varies from 120 V to 12 V) change according to the quality of the meter, or can I apply the same estimate to a $100 meter as to a $500 meter?

3. I see meter accuracy specified as a +/- percentage plus "x" digits. I understand the percentage, but what are digits? How do they relate to my question above? And how do I calculate error ranges when given % + digits?
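(For anyone else puzzling over "% + digits" specs, here's a rough sketch of how such a spec is usually evaluated. The meter spec and range resolutions below are made-up example numbers, not from any particular meter.)

```python
def uncertainty(reading, pct, digits, resolution):
    """Error band for a '+/-(% of reading + digits)' accuracy spec.

    reading    - measured value (V)
    pct        - percent-of-reading term, e.g. 0.5 for 0.5%
    digits     - number of least-significant-digit counts
    resolution - value of one count on the selected range (V)
    """
    return reading * pct / 100.0 + digits * resolution

# Hypothetical 3 1/2 digit meter spec: +/-(0.5% + 2 digits).
# On a 200 V range one count is 0.1 V; on a 20 V range it is 0.01 V.
err_120 = uncertainty(120.0, 0.5, 2, 0.1)    # 0.6 + 0.2  = 0.8 V
err_12  = uncertainty(12.0, 0.5, 2, 0.01)    # 0.06 + 0.02 = 0.08 V
```

Note how the "digits" term depends on the range in use, which is why the error doesn't simply scale with the reading.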

Thanks for your help.

2. khanh Member

If you are using the same meter (with the same error percentage), then that error percentage also applies to the smaller measurement. For example, if you have an error of +/-2 V at 120 V and then measure something at 12 V, its error would be +/-0.2 V.

Note: you can scale down, but not too far (the reading shouldn't become comparable to the percentage error itself).
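In other words (a sketch of the proportional-scaling claim above, using the numbers from the original question):

```python
# If ONLY a percent-of-reading term applied, the absolute error would
# scale in proportion to the reading.
pct_error = 2.0 / 120.0 * 100.0       # 2 V error at 120 V -> ~1.67 %
err_at_12 = 12.0 * pct_error / 100.0  # same percentage at 12 V -> 0.2 V
```

(Real digital meters also add a fixed "digits" term per range, so the absolute error doesn't shrink quite this cleanly; see the posts below.)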

3. steveparrott Thread Starter Active Member

Thanks for the reply, but this doesn't jibe with my experience. I know of a test with a moderately priced meter that found an acceptable error at 120 V (less than 1%), but at 12 V the reading was off by almost 7%.

I'm wondering if there are any published comparison tests of various meters showing results of readings across a range of voltages.

4. JoeJester AAC Fanatic!

Steve,

Look at your meter specifications.

They may say the same error across the ranges [more expensive meters] or they may specify the error at each range.

What is your meter's manufacturer and model number?

5. beenthere Retired Moderator

Hi,

You always get an error percentage. If the meter is digital, then the +/- digits allow for an apparent dither in the conversion of a voltage that is right at the sensing point for the lowest-order conversion. That can cause a steady voltage to vary every conversion, like 1.012, 1.011, 1.012, etc.

Pretty generally, you get the performance you pay for. The A-to-D converter has a limited input range, so voltages above that level have to be scaled down by voltage dividers. A 0.01% resistor is a lot more costly than a similar 1% one. Really good meters use laser-trimmed dividers that are more costly still. Circuits that give true RMS readings cost more than averaging circuits optimized for 60 Hz sine waves only.
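To illustrate that last point about true RMS vs. averaging meters, here's a rough numerical sketch. An average-responding meter rectifies the signal and scales it by the sine-wave form factor (pi / (2*sqrt(2)), about 1.111), so it only reads correctly for a pure sine:

```python
import math

def rms(samples):
    """True RMS of a list of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def avg_responding(samples):
    """Rectified average scaled by the sine form factor (~1.111),
    as an average-responding meter does -- correct for sines only."""
    form_factor = math.pi / (2.0 * math.sqrt(2.0))
    return form_factor * sum(abs(s) for s in samples) / len(samples)

n = 10000
sine   = [math.sin(2.0 * math.pi * i / n) for i in range(n)]
square = [1.0 if i < n // 2 else -1.0 for i in range(n)]

# Sine wave: both methods agree (~0.707 for unit amplitude).
# Square wave: true RMS is 1.0, but the averaging meter reads
# about 1.11 -- roughly 11% high.
```

This is why a true-RMS meter matters whenever the waveform isn't a clean sine (dimmers, switching supplies, etc.).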

6. steveparrott Thread Starter Active Member

Ultra-geek,

Thanks for the details. Since you are clearly the master of geek-dom, do you have any knowledge of published comparisons with real measured values for different meters? I searched on the Internet extensively and came up with zip.

Thanks again!

7. beenthere Retired Moderator

Hi,

I'm not aware of any such comparisons. Mostly, you buy a meter that has the stated accuracy you need. So a 6 1/2 digit bench meter will obviously have better accuracy and finer readings than any 3 1/2 digit model.
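As a rough back-of-the-envelope on the digit comparison (assuming the common convention that an "N 1/2 digit" display tops out at a leading 1, e.g. 1999 counts; actual count ranges vary by meter):

```python
def counts(half_digits):
    """Full-scale counts for an 'N 1/2 digit' display, e.g. 3.5 -> 2000."""
    return 2 * 10 ** int(half_digits)

# Resolution of the last digit on a 2 V range:
res_3_5 = 2.0 / counts(3.5)   # 1 mV per count on a 3 1/2 digit meter
res_6_5 = 2.0 / counts(6.5)   # 1 uV per count on a 6 1/2 digit meter
```

So the bench meter resolves a thousand times finer before accuracy specs even enter into it.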

By experience, one has few problems buying Fluke meters and Tektronix oscilloscopes (other than Model 475's, that is).

8. JoeJester AAC Fanatic!

I don't recall problems with the Tektronix 475 scope. I used it many times.

Compared to today's technology, the 475 might not stack up ... but ... I never had a problem with the one I used.