# How to measure resistance for microhotplate at room temperature

Discussion in 'General Electronics Chat' started by kvsingh21, Oct 21, 2008.

1. ### kvsingh21 Thread Starter Active Member

Apr 15, 2008
63
0
Right, I am testing a sensor which has a micro heater built into it. What I have to do is apply a voltage across it and use the measurements to determine the temperature, using the following formula:

R(T) = R0 [1 + α(T - T0)]

I know the value of α and T0 (room temperature), but I somehow need to determine the value of R0 (resistance at room temperature), which I can then use to determine temperature with the formula above.
The problem is that this has to be done without changing the heater's temperature.
I tried to measure resistance using a digital voltmeter (LINK), but it doesn't give a reading.

Now what do you guys suggest I do to determine R0?
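For reference, here is a quick Python sketch of how that formula turns a measured resistance back into a temperature once R0 is known. All the numbers below are made up for illustration (the α here is the typical value for platinum, not necessarily this heater's):

```python
ALPHA = 0.0039   # temperature coefficient per degC (assumed, platinum-like)
T0 = 25.0        # room temperature in degC
R0 = 100.0       # resistance at T0 in ohms (the unknown in this thread, assumed here)

def resistance_at(T):
    """Forward model: R(T) = R0 * [1 + alpha * (T - T0)]."""
    return R0 * (1 + ALPHA * (T - T0))

def temperature_from(R):
    """Invert the linear model to recover T from a measured resistance."""
    return T0 + (R / R0 - 1) / ALPHA

# Round-trip check: a heater at 125 degC should read back as 125 degC.
print(temperature_from(resistance_at(125.0)))
```

The inversion is exact because the model is linear in T; the whole difficulty in the thread is getting R0 without self-heating.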

Last edited: Oct 21, 2008
2. ### mik3 Senior Member

Feb 4, 2008
4,846
63
What sensor is it? Do you have its part number?

3. ### kvsingh21 Thread Starter Active Member

Apr 15, 2008
63
0
I am afraid it's part of initial research, so there is no data on it. It's just a gas sensor with a built-in heater. All I know is the α value of the heater and room temperature.

4. ### mik3 Senior Member

Feb 4, 2008
4,846
63
The equation describing R(T) is linear, so if you measure R(T) at any one temperature, you can work back to R0 by calculation.
To measure the resistance of the sensor, put it in your circuit and measure the voltage across it and the current through it at some arbitrary temperature. Then divide the voltage by the current to get the sensor's resistance.

Thus you will have R(T) and T, leaving R0 as the only unknown. Plug the values into your equation and solve for R0.
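That rearrangement can be sketched in a few lines of Python. The voltage, current, and temperature values here are invented purely to show the arithmetic, and α is again an assumed platinum-like value:

```python
ALPHA = 0.0039   # temperature coefficient per degC (assumed)
T0 = 25.0        # room temperature in degC

# Hypothetical measurements taken at a known temperature T:
T = 100.0        # degC
V = 5.0          # volts across the heater
I = 0.0390       # amps through the heater

R_T = V / I                          # Ohm's law: resistance at temperature T
R0 = R_T / (1 + ALPHA * (T - T0))    # solve R(T) = R0*[1 + alpha*(T - T0)] for R0
print(R0)
```

The catch, as the next post points out, is that this only works if you actually know T at the moment you measure V and I.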

5. ### kvsingh21 Thread Starter Active Member

Apr 15, 2008
63
0
Well, the problem is: how would I know what the arbitrary temperature is? The heater is too small to measure its temperature by other methods. I mean, if I do it at room temperature, once I apply a voltage across it the temperature will go up, so again I can't tell what the temperature is.

6. ### beenthere Retired Moderator

Apr 20, 2004
15,815
283
You can always compensate by using an ice bath for one end point and boiling water for the other. Heating from an ohmmeter isn't likely to throw those measurements off.

For that matter, you can always put the meter on the heater and watch whether the resistance drifts; a drifting reading would indicate measurement-induced heating. If you can expose the heater, affix a metal plate to it to increase its thermal mass. Use a small thermistor to track temperature independently. There's always some way to get there.
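Since R(T) is linear, the ice-bath and boiling-water readings suggested above give two known-temperature points, and room-temperature resistance follows by straight-line interpolation. A rough Python sketch, with both resistance readings invented for illustration:

```python
# Hypothetical two-point calibration readings:
T1, R1 = 0.0, 90.25      # resistance measured in an ice bath (ohms, assumed)
T2, R2 = 100.0, 129.25   # resistance measured in boiling water (ohms, assumed)
T0 = 25.0                # room temperature in degC

slope = (R2 - R1) / (T2 - T1)    # = R0 * alpha, the ohms-per-degC slope
R0 = R1 + slope * (T0 - T1)      # interpolate the line to room temperature
alpha = slope / R0               # cross-check against the known alpha
print(R0, alpha)
```

A side benefit of the two-point method is that it recovers α as well, so the known α value becomes a sanity check rather than a required input.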

7. ### kvsingh21 Thread Starter Active Member

Apr 15, 2008
63
0
Yeah, that's what I thought about the ohmmeter, but for some strange reason it doesn't give me any reading at all (it just says -1). Can you shed any light on that?

I am afraid that won't be a possibility, since the heater and the sensor are bonded onto a 16-pin chip. And by the way, the heater has a radius of about 5 μm.

Last edited: Oct 21, 2008
8. ### beenthere Retired Moderator

Apr 20, 2004
15,815
283
You might want to look into exactly what it is you are measuring. When a digital ohmmeter gives an anomalous reading, it is usually because it is tied to an active source that is driving current. Does the meter read a voltage on its lowest scale? Could you be looking at a tiny thermocouple?

9. ### Wendy Moderator

Mar 24, 2008
20,772
2,540
You could use IR measuring devices; they're expensive, and they'd require calibration too, but the ice bath and boiling water would be a good start.