I have some questions regarding the use of volt/ohm/amp meters for low voltage work (12V AC).
1. If my meter is calibrated at 120V and has an accuracy of a certain percentage, what is the accuracy of the meter at 12V?
For example, if I have a possible error of +/- 2 volts at 120V, do I still have a potential 2 volt error at 12V? Or is it 0.2V?
2. Does this percentage range (as it varies from 120 V to 12 V) change according to the quality of the meter, or can I apply the same estimate to a $100 meter vs. a $500 meter?
3. I see meter accuracy displayed as a +/- percentage plus "x" digits. I understand the percentage, but what are digits? How do they relate to my question above? And how do I calculate ranges when given % + digits?
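For reference, here is how I've seen "% of reading + digits" specs combined elsewhere, sketched with entirely hypothetical numbers (a 0.5% + 2 digit spec on a range with 0.1 V resolution, not taken from any particular meter's datasheet):

```python
def meter_uncertainty(reading, pct_of_reading, digits, resolution):
    """Worst-case uncertainty for a '% of reading + N digits' spec.

    reading: the displayed value (same units as resolution)
    pct_of_reading: the percentage part of the spec, e.g. 0.5 for +/-0.5%
    digits: the 'N digits' (counts) part of the spec
    resolution: the value of one least-significant digit on the range in use
    """
    return reading * pct_of_reading / 100 + digits * resolution

# Hypothetical +/-0.5% + 2 digit meter, 0.1 V per digit on the range in use:
# at 120 V: 120 * 0.005 + 2 * 0.1 = 0.6 + 0.2 = 0.8 V worst case
# at  12 V:  12 * 0.005 + 2 * 0.1 = 0.06 + 0.2 = 0.26 V worst case
print(meter_uncertainty(120, 0.5, 2, 0.1))
print(meter_uncertainty(12, 0.5, 2, 0.1))
```

Note that in this reading of the spec, the percentage part scales with the measured value, but the "digits" part is fixed by the range's resolution, so it dominates at the bottom of a range.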
Thanks for your help.