Thermal Coefficient

Thread Starter

ra1ph

Joined Jan 5, 2010
31
Doing a small project: I'm verifying the thermal coefficient of a metal strip we use to monitor temperature. The metal strip is made from AlCu.

The formula we used is

α = (R − R0) / (R0 × (T − T0))

where R0 is the resistance at the reference temperature T0.

I'm getting a different α value for different temperature ranges.

Temperature range 22 °C to 46 °C: α = 0.00373/°C

Temperature range 32 °C to 46 °C: α = 0.00365/°C
This one confuses me, when T0 = 32 °C and the sample is at room temperature:

So the temperature range is 32 °C to 22 °C.

Using the resistance values above at 32 °C and 22 °C, I'm getting an average α of 0.00383/°C.
Can anyone explain this to me?
Why, for the range 22 °C to 46 °C, do I get ~0.00373/°C,
for the range 32 °C to 46 °C, ~0.00365/°C,
and for the range 32 °C to 22 °C, ~0.00383/°C?
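
For reference, here is a minimal sketch of the calculation we're doing, using the two-point formula above. The resistance values in the sketch are placeholders, not our actual readings, so only the structure matters:

[CODE]
# Two-point TCR calculation; the resistance values below are
# placeholders for illustration, not the actual measurements.
def tcr(R_ref, R_meas, T_ref, T_meas):
    """alpha = (R - R0) / (R0 * (T - T0)), referenced at T_ref."""
    return (R_meas - R_ref) / (R_ref * (T_meas - T_ref))

R22, R32, R46 = 100.0, 103.7, 108.8   # ohms (illustrative only)

print(tcr(R22, R46, 22, 46))   # referenced at 22 C, full range
print(tcr(R32, R46, 32, 46))   # referenced at 32 C, upper range
print(tcr(R32, R22, 32, 22))   # referenced at 32 C, down to room temp
[/CODE]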

Many Thanks
 

WBahn

Joined Mar 31, 2012
29,978
The actual behavior is more complicated than is captured by the simple one-parameter model you are using. In general, there are second- and higher-order effects that you are ignoring, and as your temperature range increases, the ability to ignore them diminishes.
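
To see how that plays out, here's a quick sketch. Suppose the resistance actually follows a quadratic law, R(T) = R0 × (1 + α·ΔT + β·ΔT²); the coefficients below are made up for illustration, not measured AlCu values. Extracting a single α from two points then gives a different answer depending on which two points you pick:

[CODE]
# Illustrative quadratic model -- alpha and beta are made-up numbers,
# not measured AlCu coefficients.
R0, T0 = 100.0, 22.0                   # ohms at the 22 C reference
alpha_true, beta_true = 3.7e-3, -5e-6

def R(T):
    """Resistance under the assumed quadratic model."""
    dT = T - T0
    return R0 * (1 + alpha_true * dT + beta_true * dT**2)

def apparent_alpha(T_ref, T_meas):
    """Single alpha extracted from two points, as in the original post."""
    return (R(T_meas) - R(T_ref)) / (R(T_ref) * (T_meas - T_ref))

for pair in [(22, 46), (32, 46), (32, 22)]:
    print(pair, apparent_alpha(*pair))
# Three different "alpha" values from the same strip -- the one-parameter
# model can't absorb the quadratic term over every range at once.
[/CODE]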

Also, in addition to the coefficient being more complicated, you have to take into account your measurement errors. How well do you REALLY know the temperature of the metal you are measuring? How well do you REALLY know the resistance of the metal you are measuring? If you quantify your measurement errors, you may well find that, over the range you are working with, the coefficient is constant (to a good enough degree) and that your measurements (particularly your temperature measurements) are the culprit.
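
As a rough way to quantify that, propagate your per-reading uncertainties into α. The uncertainties and resistance readings below are assumed for illustration; plug in your own. Even a half-degree temperature error moves α by roughly the amount separating your three values:

[CODE]
import math

sigma_T = 0.5    # degC per temperature reading (assumed, not measured)
sigma_R = 0.01   # ohms per resistance reading (assumed, not measured)

def alpha_with_uncertainty(R0, R1, T0, T1):
    """First-order error propagation for alpha = (R1 - R0) / (R0 * (T1 - T0)),
    treating the four readings as independent (a rough approximation)."""
    dR, dT = R1 - R0, T1 - T0
    alpha = dR / (R0 * dT)
    rel = math.sqrt((math.sqrt(2) * sigma_R / dR) ** 2
                    + (math.sqrt(2) * sigma_T / dT) ** 2
                    + (sigma_R / R0) ** 2)
    return alpha, abs(alpha) * rel

a, s = alpha_with_uncertainty(100.0, 108.8, 22.0, 46.0)  # placeholder readings
print(f"alpha = {a:.5f} +/- {s:.5f} per degC")
[/CODE]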
 