Calculating range accuracy of DMM for given resistor

Thread Starter

van53

Joined Nov 27, 2011
67
Regarding the Fluke 8012A Lo ohms range test: The manual mentions in section 4-26 the test resistor is 1.9 ohms +/- 0.05%.

In section 4-35 it indicates (as part of the test) to verify the display reading is between 1.879 and 1.921. I would like to know how these values are obtained.

Looking at the specifications in table 1-2, it mentions under Low Resistance, 2 ohms range, that the resolution is 1 milliohm and the accuracy is +/-(1% of reading + 2 digits).

The following were my results:

A 1.9 ohm test resistor (0.05% tolerance) could have a minimum value of 1.89905 ohms

1.89905 ohms - 1% = 1.8800595

As the resolution is 1 milliohm, the remaining digits are truncated and not rounded (????): 1.880 ohms

1.880 - 2 digits = 1.878 ohms

The manual says 1.879 ohms however I have calculated 1.878 ohms. What have I done incorrectly?
(I was able to correctly calculate the upper value of 1.921 ohms using the same method above).

Similarly for the 20 ohms range (10 milliohm resolution, +/-[0.5% + 2 digits]) it mentions the reading should be between 18.88 and 19.12 using a 19 ohm (0.05%) resistor. I was able to calculate the upper range of 19.12 ohms correctly, however for the lower range I get 18.87 ohms and not 18.88...
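
To make my working explicit, here is my lower-limit arithmetic for the 2 ohm range as a small Python sketch (my own interpretation of the spec, not Fluke's published method; the 20 ohm case follows the same pattern):

```python
# My working for the lower limit of the Lo Ohms (2 ohm) range test.
# Assumption (possibly wrong): "1% of reading" is taken against the resistor's
# minimum value, and digits past the 1 mOhm resolution are truncated, not rounded.
import math

nominal = 1.9          # test resistor value in ohms
r_tol = 0.0005         # 0.05% resistor tolerance
accuracy = 0.01        # 1% of reading
resolution = 0.001     # 1 mOhm
counts = 2             # "+ 2 digits"

r_min = nominal * (1 - r_tol)                    # 1.89905 ohms
low = r_min * (1 - accuracy)                     # 1.8800595 ohms
low = math.floor(low / resolution) * resolution  # truncate to resolution -> 1.880
low -= counts * resolution                       # subtract 2 digits -> 1.878
print(round(low, 3))                             # 1.878, versus 1.879 in the manual
```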
 

WBahn

Joined Mar 31, 2012
30,077
The spec is 1% of reading, not 1% of the resistance being read.

The reading is R. The actual resistance could be 1% + 2 digits higher than that, so if R is reading 1.878 Ω, the actual resistance could be as much as 18.78 mΩ higher due to the 1% and 2 mΩ higher due to the 2 digits, meaning that the actual R could be as high as 1.89878 Ω. But that is below 1.89905 Ω, which is the lowest value of the test resistor.
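
Numerically, that argument looks like this (a rough sketch, treating "1% of reading" as 1% of the displayed value):

```python
# Sketch: worst-case actual resistance implied by a given displayed reading,
# treating "1% of reading" as 1% of the displayed value.
reading = 1.878          # ohms, as displayed
accuracy = 0.01          # 1% of reading
counts = 2               # "+ 2 digits"
resolution = 0.001       # 1 mOhm per digit

worst_case_high = reading * (1 + accuracy) + counts * resolution
print(worst_case_high)   # about 1.89878 -- still below 1.89905, the resistor's minimum
```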
 

Thread Starter

van53

Joined Nov 27, 2011
67
The spec is 1% of reading, not 1% of the resistance being read.
I am confused. To my understanding, when it says 1% of reading, I would take that to mean an uncertainty of 1% of the actual resistance being read. For example, in the article by Fluke entitled "Understanding specifications for precision multimeters" on page 3, where it talks about percent of reading + number of digits, it says:

"Let’s say you want to measure 10 V on a 20 V range in which
the least significant digit represents 0.0001 V. If the uncertainty for the 20 V range is given as ± (0.003 % + 2 counts) we can calculate the uncertainty in measurement units as:
± ((0.003 % x 10 V + 2 x 0.0001 V) = ± (0.0003 V + 0.0002 V) =
± (0.0005 V) or ± 0.5 mV"

So in the above example, they have taken the actual voltage being read, which is 10 V, and based on the uncertainty of the reading (0.003% + 2 counts) they have come up with the value of ± 0.5 mV. Therefore the meter in this case could read 10 V ± 0.5 mV.

Isn't this the same principle that I applied in my original calculations?
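
For concreteness, here is the article's example worked through in a few lines of Python (just my own check of the arithmetic, using the figures quoted above):

```python
# Fluke's worked example: 10 V measured on a 20 V range where 1 count = 0.0001 V
# and the spec is +/-(0.003% of reading + 2 counts).
reading = 10.0
uncertainty = 0.003 / 100 * reading + 2 * 0.0001
print(round(uncertainty, 6))   # 0.0005 V, i.e. +/- 0.5 mV, matching the article
```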
 

JoeJester

Joined Apr 26, 2005
4,390
Using your 1.9 ohm standard, the maximum and minimum readings would be 1.919 and 1.880 using the combined tolerances of the meter and the component.

When we consider the 2 counts, that would make it 1.92195 and 1.87805

However, the window for the calibration is less than those extremes ... 1.921 and 1.879 respectively, by 950 microohms.
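
To show where those numbers come from, here is the same arithmetic as a quick sketch (simply stacking the resistor and meter tolerances, which may not be exactly how Fluke derived the window):

```python
# Stack the resistor tolerance (0.05%) on top of the meter tolerance (1%),
# then allow the 2 counts on top of that.
nominal = 1.9                        # ohms
combined = 0.01 + 0.0005             # 1.05% combined tolerance
counts = 2 * 0.001                   # 2 digits at 1 mOhm each

high = nominal * (1 + combined) + counts   # about 1.92195
low = nominal * (1 - combined) - counts    # about 1.87805
print(round(low, 5), round(high, 5))
# The manual's window of 1.879 to 1.921 sits 950 microohms inside these extremes.
```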
 

WBahn

Joined Mar 31, 2012
30,077
I don't know that there is a "right" answer here -- the meaning of "percent of reading" is not rigidly defined (as far as I know) and most treatments fail to distinguish between the value BEING read and the value that IS read; instead they use whatever value they happen to be using in their example. But consider that, at the end of the day, you have to start with what you know. In the example you give they are starting with an assumed "known" actual value (they are assuming that the voltage is actually exactly 10 V). But for practical purposes you almost never know the actual value of what you are measuring; what you DO know is the value that was actually read (i.e., displayed) by your instrument.

For almost all cases the difference is negligible. Also, keep in mind that the 1% or 0.1% or 0.003% or whatever is not an exact figure to begin with. Furthermore, while a performance spec can be considered a hard limit (i.e., if it's outside 1% it fails to meet spec), the spec of 0.05% on the resistor value is a statistical measure; they are really just saying that some very high fraction of the resistors with that marking are actually within that tolerance, but not all of them are.
 

JoeJester

Joined Apr 26, 2005
4,390
I'm sure the price of a standard resistor with a 0.0005 plus or minus tolerance factor demands a pretty penny. The right answer is whatever the manufacturer states for their device to be within "calibration". Those readings would indicate a perfect 1.9 ohm resistor plus or minus 1.11%. The 0.11% is the 2 counts.
 

WBahn

Joined Mar 31, 2012
30,077
Also, your instrument might pass a test like this and still be out of spec.

Let's take a real simple example. You have a 100 Ω resistor that is guaranteed to be within 1%. You have a meter that is guaranteed to be within 0.1%. You measure the resistor with your meter and you get 100.0 Ω. Does the meter meet spec? Well, what if that resistor was 100.5 Ω, which is well within its tolerance limit? In this case the meter is reading ~0.5% too low, meaning that it is WAY out of spec.

If you really want to be sure you are within spec, you have to make your test bound by the conjunction of the limits.

For example, let's take a 100 Ω resistor that is guaranteed to be within 0.1%, so it can vary between 99.9 Ω and 100.1 Ω. Now let's say your meter is supposed to have an error no greater than 0.2%. Well, that's basically 0.2 Ω. So if you get a resistor that is 100.0 Ω and your meter reads 100.2 Ω you are within spec. But if that resistor was actually 99.9 Ω then you are out of spec. And you don't know what the actual value of the resistor is. So what you really need to do is set your standards on the range of values that your meter can read and be in spec regardless of where the actual value being measured falls within its tolerance range.

Out of curiosity, let's see what this approach would give, ignoring the fine distinction between % of value and % of reading.

You have a 1.9 Ω resistor ±0.05% and a meter rating of ±1% + 2 digits.

The value of the resistor could be anywhere between 1.89905 Ω and 1.90095 Ω.

If the resistor is at the low end then any reading between 1.8780595 Ω and 1.9200405 Ω would be in spec.

If the resistor is at the high end then any reading between 1.8799405 Ω and 1.9219595 Ω would be in spec.

Given a random resistor in that range, your measured value needs to be within the conjunction (the overlap) of these two ranges, which means:

1.8799405 Ω <= reading <= 1.9200405 Ω

With the reading resolution limited to 1 mΩ, we must pull in the extents so that we remain strictly within this range:

1.880 Ω <= reading <= 1.920 Ω

Clearly that is not what Fluke did. This is probably because they opted to pass meters that were actually just slightly out of spec versus failing meters that were actually just in spec.
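
For what it's worth, here is the conjunction calculation above as a short sketch (ignoring, as before, the distinction between percent of value and percent of reading):

```python
# Conjunction-of-limits check: the reading must stay in spec no matter where the
# resistor actually sits inside its own tolerance band.
nominal, r_tol = 1.9, 0.0005            # 1.9 ohm +/- 0.05%
m_tol, counts, res = 0.01, 2, 0.001     # meter: +/-(1% + 2 digits), 1 mOhm resolution

r_low, r_high = nominal * (1 - r_tol), nominal * (1 + r_tol)

# Acceptable reading windows if the resistor happens to sit at either extreme
win_low = (r_low * (1 - m_tol) - counts * res, r_low * (1 + m_tol) + counts * res)
win_high = (r_high * (1 - m_tol) - counts * res, r_high * (1 + m_tol) + counts * res)

# The overlap of the two windows is the only range that is safe in both cases
lo = max(win_low[0], win_high[0])       # about 1.8799405
hi = min(win_low[1], win_high[1])       # about 1.9200405
print(lo, hi)                           # pulls in to 1.880 .. 1.920 at 1 mOhm resolution
```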
 

JoeJester

Joined Apr 26, 2005
4,390
If you take the plus 2 counts literally, then there is something amiss. The upper calibration number is the standard resistor's printed value times (1 + the meter's tolerance) plus 2 counts, and the lower number is the printed value times (1 - the meter's tolerance) minus 2 counts.

So, I would take the "+2 counts" as plus or minus 2 counts. I'd have to check some other manufacturers' calibrations to be 99.99 percent sure.
 

WBahn

Joined Mar 31, 2012
30,077
This is what was stated in the OP:

+/-(1% of reading + 2 digits)

Since resistance measurements are always positive, the "reading" is always a positive value and the term in parentheses is always increased in magnitude by adding 2 digits. It is the result of this that is then qualified by the "plus/minus".

It would be interesting to see how they represent it for the voltage and current measurements since the "reading" there can be either positive or negative.
 

JoeJester

Joined Apr 26, 2005
4,390
WBahn,

Yes the OP stated exactly what is printed in the manual for the specified meter. The calibration figures were also in the same manual. I agree there is a disconnect between what is printed (and stated) and what the calibration requires.

http://www.aptsources.com/resources...ifference Between Resolution and Accuracy.pdf

has an example that is consistent with the calibration standard specified in the manual.

In essence

Standard Resistor Value times the tolerance plus minimum resolution times the count number.

Then add and subtract that result from the standard value.
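
In code form, that recipe reads something like the following sketch (my reading of the linked example, not an official Fluke formula):

```python
# Calibration window built from the standard's nominal value, per the linked example:
# delta = nominal * meter tolerance + resolution * counts, applied both ways.
def cal_window(nominal, meter_tol, resolution, counts):
    delta = nominal * meter_tol + resolution * counts
    return nominal - delta, nominal + delta

print(cal_window(1.9, 0.01, 0.001, 2))   # about (1.879, 1.921) -- the 2 ohm range limits
print(cal_window(19.0, 0.005, 0.01, 2))  # about (18.885, 19.115), shown as 18.88 / 19.12
                                         # at the 10 mOhm resolution of the 20 ohm range
```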
 

WBahn

Joined Mar 31, 2012
30,077
WBahn,

Yes the OP stated exactly what is printed in the manual for the specified meter. The calibration figures were also in the same manual. I agree there is a disconnect between what is printed (and stated) and what the calibration requires.

http://www.aptsources.com/resources/pdf/Understanding the Difference Between Resolution and Accuracy.pdf

has an example that is consistent with the calibration standard specified in the manual.

In essence

Standard Resistor Value times the tolerance plus minimum resolution times the count number.

Then add and subtract that result from the standard value.
That document doesn't address the issue. Following their document exactly as written, what would their tolerance figure be if the voltage in their example was -277 V (let's assume that this is on the DC range instead of the AC range)?
 

Thread Starter

van53

Joined Nov 27, 2011
67
I don't know that there is a "right" answer here -- the meaning of "percent of reading" is not rigidly defined (as far as I know) and most treatments fail to distinguish between the value BEING read and the value that IS read; instead they use whatever value they happen to be using in their example. But consider that, at the end of the day, you have to start with what you know. In the example you give they are starting with an assumed "known" actual value (they are assuming that the voltage is actually exactly 10 V). But for practical purposes you almost never know the actual value of what you are measuring; what you DO know is the value that was actually read (i.e., displayed) by your instrument.
I'm learning about accuracy and resolution, so I don't know myself; however, I read up on other manufacturers of meters such as BK Precision, Keysight, and Fluke, and they all seem to take a known value and then compute the uncertainty based on it for their examples. Perhaps they do this because it is easy to illustrate in an example. You mention in most cases the difference is negligible (value being read vs. the value that is read), and though I don't disagree, I would think it can't be both ways -- it must be either uncertainty derived from a known value being read, or uncertainty derived from what the meter displays....

I'm sure the price of a standard resistor with a 0.0005 plus or minus tolerance factor demands a pretty penny.
I was having difficulty sourcing a 1.9 ohm 0.05% resistor so I decided to look for other resistors which would have greater tolerances. I wanted to understand how Fluke obtained the min and max values for the Lo Ohms range test so that I could come up with my own min and max ranges for the resistors I choose.

If you really want to be sure you are within spec, you have to make your test bound by the conjunction of the limits.
Thank you! I never thought of this. As I am finding it difficult to source 0.05% tolerance resistors, when I select resistors with greater tolerances I can use the method you describe to ensure I am closer to specifications.

Clearly that is not what Fluke did. This is probably because they opted to pass meters that were actually just slightly out of spec versus failing meters that were actually just in spec.
I contacted The Fluke Calibration Support Team mentioning the same information in the OP and requested details on how they obtained the 1.879 and 1.921 ohm values. They provided the attached excel spreadsheet with their formulas. From our email exchanges it was found that they do not use the resistor minimum and maximum tolerances but instead use the nominal value of 1.9 ohms for arriving at the 1.879 and 1.921 ohm values.

Also with regards to the accuracy and how values may be truncated when dealing with counts, I emailed Fluke Calibration Support with the following question:

For the given accuracy for this meter at the 1.9 ohms range, +/-(1% of reading + 2 digits) (resolution 0.001), at what point in the example below should the extra digits past the resolution be excluded from the calculation? For example, let's say you have a hypothetical resistor of 1.89905 ohms and you want to calculate the uncertainty of measurement for this meter:
Step 1: +/-(1% of 1.89905 + 2 * 0.001)
Step 2: +/- (0.0189905 + 0.002)
Step 3: +/-(0.0209905)
Step 4: +/- (0.02)
Is it at step 4 where all digits past the resolution are dropped, and we can say that measuring a hypothetical resistor of 1.89905 ohms has an uncertainty of +/- 0.02 ohms?
Their reply was:
You would use normal rounding rules after all calculation have been performed.

Yes at last step.
In terms of what the "last step" is, Fluke mentioned that one would round "after you subtract and add to the referenced value to get your upper and lower limits." It would be normal rounding rules based on the resolution, in this case to the nearest thousandth, and that's it. There is no truncating/dropping of digits.
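
Putting Fluke's answer into a small sketch, as I understand it (rounding only after adding and subtracting from the referenced value):

```python
# Fluke's guidance as I understand it: carry full precision through the whole
# calculation, add/subtract from the referenced value, and only then round to
# the display resolution (1 mOhm here).
reference = 1.89905                      # the hypothetical resistor from my question
delta = 0.01 * reference + 2 * 0.001     # 0.0209905, kept at full precision
lower = round(reference - delta, 3)      # 1.8780595 -> 1.878
upper = round(reference + delta, 3)      # 1.9200405 -> 1.920
print(lower, upper)
# For the published limits Fluke starts from the nominal 1.9 ohms instead,
# which gives 1.879 and 1.921 directly.
```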

It would be interesting to see how they represent it for the voltage and current measurements since the "reading" there can be either positive or negative.
In the Fluke 8012A manual, table 1-2 for DC volts it indicates the following (I selected the 2V range for example):

Range: +/- 2V
Resolution: 1mV
Accuracy: +/-(0.1% of reading + 1 digit)
 

Attachments


JoeJester

Joined Apr 26, 2005
4,390
I was having difficulty sourcing a 1.9 ohm 0.05% resistor so I decided to look for other resistors which would have greater tolerances. I wanted to understand how Fluke obtained the min and max values for the Lo Ohms range test so that I could come up with my own min and max ranges for the resistors I choose.
Standard resistors, at least those for calibrating, are very expensive and they bear zero resemblance to what we know as "resistors".

from http://www.tinsley.co.uk/products/standard-resistors/5615.html



If you're looking to buy a standard resistor, you can visit http://www.ietlabs.com/decaderes/resistance-standard.html

or visit eBay

http://www.ebay.com/sch/i.html?_nkw...geo_id=10232&keyword=standard+resistor&crdt=0

I suspect the cost of a standard resistor will exceed the cost of a new meter.

You do have the option of winding your own low value resistance as copper wire has resistance.
 

Thread Starter

van53

Joined Nov 27, 2011
67
Standard resistors, at least those for calibrating, are very expensive and they bear zero resemblance to what we know as "resistors".

from http://www.tinsley.co.uk/products/standard-resistors/5615.html



If you're looking to buy a standard resistor, you can visit http://www.ietlabs.com/decaderes/resistance-standard.html

or visit eBay

http://www.ebay.com/sch/i.html?_nkw...geo_id=10232&keyword=standard+resistor&crdt=0

I suspect the cost of a standard resistor will exceed the cost of a new meter.

You do have the option of winding your own low value resistance as copper wire has resistance.
I never knew that such standard resistors existed. I always thought they were similar in shape to regular resistors with just very low tolerances....

It is interesting that in the manual for the 8012A in figure 4-2 they show how to build the 1.9 ohm and 19 ohm test set. In the picture they show a regular sized resistor affixed to turrets on a special two terminal isolation connector.

I ended up purchasing one that is a dual two terminal isolation connector:

http://www.pomonaelectronics.com/pdf/d_mdpx_1_01.pdf

I am thinking that I may use more than just two resistors, so instead of soldering the resistor to the turret directly, I will solder two alligator clips to the turret and hold the resistor that way. This shouldn't introduce too much resistance for hobbyist calibration, right?

For resistors I found a seller that has 18 ohm 0.25% and 1.8 ohm 0.5% mil-spec resistors from the USSR. I was having difficulty finding such resistors even on Digi-Key in tolerances of 0.25% or 0.5% that are in stock, can be purchased in quantities of one, and are inexpensive.
 

WBahn

Joined Mar 31, 2012
30,077
If this is a "hobbyist calibration", why are you even attempting to calibrate it at all? More specifically, what are the calibration specs that you NEED for doing hobbyist work? It is a waste of time, money, and effort to calibrate your meter (assuming you even have the ability to actually calibrate it at all, as opposed to just checking to see if it is in spec) when it is highly unlikely that you will ever need to make a measurement where the other errors don't swamp the meter's errors even if it's been steadily used and abused for a couple of decades.
 

Thread Starter

van53

Joined Nov 27, 2011
67
If this is a "hobbyist calibration", why are you even attempting to calibrate it at all? More specifically, what are the calibration specs that you NEED for doing hobbyist work? It is a waste of time, money, and effort to calibrate your meter (assuming you even have the ability to actually calibrate it at all, as opposed to just checking to see if it is in spec) when it is highly unlikely that you will ever need to make a measurement where the other errors don't swamp the meter's errors even if it's been steadily used and abused for a couple of decades.
When I recently purchased this meter, the AC function was reading over range on all ranges. I picked up an 8010A parts unit, removed the true RMS U8 module from it, and installed it in the 8012A. AC functionality was restored. I had tested the low ohms feature before and after I swapped the U8 module, and though I only disconnected the low ohms module, I *think* the readings I got before and after differ (on the low ohms setting). So I wanted to follow the test procedure in the manual, which recommends building a 1.9 ohm and 19 ohm test set to check the low ohms range. It also uses the same test set to set calibration if needed, which I figured shouldn't be a problem since I would have already set up the 1.9 ohm and 19 ohm test set to do the actual specification check...

I also figure I could use the test set to check my other 8050A and 8800A to see if those are in specification as well and if not, how far out they are...
 