There are two common ways to measure temperature with a thermocouple.
The first is short and less processor intensive:
1. Measure the hot junction voltage.
2. Convert it to an equivalent temperature using a lookup table.
3. Measure the cold junction temperature.
4. Add the two temperatures.
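The steps above can be sketched roughly as follows. The table values are only coarse approximations of type-K thermoelectric voltages at 10 °C steps, and the function names are made up for illustration; a real design would use the full NIST table and a finer step.

```c
#include <stdio.h>

/* Illustrative sketch of method 1: lookup-table conversion plus
   cold junction addition in the temperature domain.
   tc_uv[] holds approximate type-K voltages (microvolts) at
   0, 10, 20, ... degC -- coarse values for illustration only. */
static const double tc_uv[] = { 0.0, 397.0, 798.0, 1203.0, 1612.0, 2023.0 };
#define TC_STEP_C 10.0
#define TC_N (sizeof tc_uv / sizeof tc_uv[0])

/* Step 2: convert a measured hot-junction voltage (uV) to degC
   by linear interpolation between table entries. */
double lookup_temp(double uv)
{
    for (unsigned i = 1; i < TC_N; ++i) {
        if (uv <= tc_uv[i]) {
            double frac = (uv - tc_uv[i - 1]) / (tc_uv[i] - tc_uv[i - 1]);
            return ((i - 1) + frac) * TC_STEP_C;
        }
    }
    return (TC_N - 1) * TC_STEP_C;  /* clamp above table range */
}

/* Steps 1-4: convert the hot junction voltage, then add the
   measured cold junction temperature directly. */
double method1(double hot_uv, double cold_junction_c)
{
    return lookup_temp(hot_uv) + cold_junction_c;
}
```

Note that the addition happens in the temperature domain, which is exactly where the question of correctness arises, since the thermocouple voltage-to-temperature curve is not linear.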
The other method is far more processor intensive and involves double-precision math with the NIST coefficients:
1. Measure the hot junction voltage.
2. Measure the cold junction temperature.
3. Calculate the cold junction's equivalent thermocouple voltage using the NIST temperature-to-voltage coefficients.
4. Add the cold junction equivalent voltage from step 3 to the thermocouple voltage measured in step 1.
5. Use the result of step 4 with the NIST voltage-to-temperature coefficients (the inverse coefficients) to calculate the cold-junction-compensated, linearized temperature value.
Why do people prefer method 2? Does it give better accuracy?
Or is method 1 simply wrong and the second method the correct one?
On many small processors I have seen that people usually apply method 1.