limiting error

Thread Starter

full

Joined May 3, 2014
225
Hello,
My question:

A voltmeter reading 70 V on its 100 V range and an ammeter reading 80 mA on its 150 mA range are used to determine the power dissipated in a resistor. Both instruments have an accuracy limit of ±1.5% at full-scale deflection. Determine the limiting error of the power.

How should I approach this problem?
 


WBahn

Joined Mar 31, 2012
30,045
This problem requires a few assumptions, which may or may not be particularly justified but which are needed in light of the lack of otherwise necessary information.

First, particularly in the case of the ammeter, we have to assume that the meter itself has a negligible effect on the quantity being measured. This is often not the case, but you would need to know the internal resistance of the ammeter in order to take it into account. The same is true for the voltage measurement, though that is almost never an issue unless you are dealing with very high impedance circuits.

Second, we have to assume that the stated errors are truly limits on the error, which is seldom the case; the limits are statistical by nature. But treating them as hard limits is good enough to get a decent feel for the quality of the measurement.

Third, we have to assume that the errors in the two measurements are independent. If two separate meters are in use, that is probably a reasonable assumption, but if the same meter is being used to make both measurements one after the other, it may well not be.

As JoeJester pointed out, it is common to spec the accuracy of a meter as a percent of the full-scale reading (also known as percent-of-range). So if you are measuring a 2 V signal on a 50 V full-scale range and the meter has a 2% of full scale rating, any measurement made on that range has error limits of ±1 V, which makes your measurement of a 2 V signal good only to ±50%.
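If it helps to see the arithmetic spelled out, here is a minimal sketch in Python (with the readings, ranges, and the 1.5% spec taken from your question) of how a percent-of-full-scale spec turns into an absolute error and then into a percent of the actual reading:

```python
# Minimal sketch: convert a percent-of-full-scale spec into an absolute
# error limit, then express that limit as a percent of the actual reading.

def limit_of_error(reading, full_scale, pct_of_fs):
    """Return (absolute error limit, error as percent of the reading)."""
    abs_error = full_scale * pct_of_fs / 100.0      # e.g. 1.5% of 100 V = 1.5 V
    pct_of_reading = abs_error / reading * 100.0
    return abs_error, pct_of_reading

v_err, v_pct = limit_of_error(reading=70.0, full_scale=100.0, pct_of_fs=1.5)
i_err, i_pct = limit_of_error(reading=80.0, full_scale=150.0, pct_of_fs=1.5)

print(f"Voltage: +/-{v_err:.3g} V  (+/-{v_pct:.3g}% of reading)")
print(f"Current: +/-{i_err:.3g} mA (+/-{i_pct:.3g}% of reading)")
```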

Don't forget that, at the corners, you have four possibilities for the actual values of the voltage and current. Use whichever pair results in the greatest percent error.
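As a rough illustration of that corner check (one way to lay it out, using the numbers from your question), something like this walks all four combinations and keeps the worst one:

```python
# Sketch of the corner check: evaluate the power at all four combinations
# of (V +/- error, I +/- error) and keep the worst-case percent deviation
# from the nominal power. Readings and ranges are from the question.
from itertools import product

V, I = 70.0, 80e-3            # readings: 70 V, 80 mA
dV = 100.0 * 0.015            # 1.5% of the 100 V range
dI = 150e-3 * 0.015           # 1.5% of the 150 mA range

P_nominal = V * I
worst_pct = max(
    abs((v * i - P_nominal) / P_nominal) * 100.0
    for v, i in product((V - dV, V + dV), (I - dI, I + dI))
)
print(f"Nominal power: {P_nominal:.3g} W, limiting error: +/-{worst_pct:.3g}%")
```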
 

WBahn

Joined Mar 31, 2012
30,045
Given the error on the voltage measurement, what is the possible range of values that the true voltage could be within?

Given the error on the current measurement, what is the possible range of values that the true current could be within?

Given these ranges, what is the maximum power that could be the true power? What is the minimum power that could be the true power?

Given this min and max possible true power and the best estimate based on the measurements, what is the limit on the percent error?
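One common textbook shortcut, which is only a first-order approximation and not necessarily what the step-by-step questions above are after, is to note that for P = V·I the relative limiting errors simply add. A quick sketch of that, again with the numbers from the question:

```python
# First-order approximation: for P = V * I, the relative limiting errors
# add, i.e. dP/P ~= dV/V + dI/I. Readings and ranges from the question.

dV_over_V = (100.0 * 0.015) / 70.0     # 1.5% of 100 V, relative to the 70 V reading
dI_over_I = (150.0 * 0.015) / 80.0     # 1.5% of 150 mA, relative to the 80 mA reading

print(f"Approximate limiting error of power: +/-{(dV_over_V + dI_over_I) * 100:.3g}%")
```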
 