Ceramic Strip Heater

Thread Starter

superway

Joined Oct 19, 2009
125
Hi,

I have a couple of strip heaters rated 240 VAC, 500 W.
Given the voltage and power, the maximum current should be about 2.08 A (I = P/V = 500 W / 240 V), so the calculated resistance should be about 115 Ohm (R = V/I). Is that right?

But when I use a Fluke meter to measure the resistance of this heater, it reads 28.9 Ohm. So the measured resistance is quite different from the one calculated from the rating.
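Here is the arithmetic as a quick Python sketch (the 240 V / 500 W figures are from the label, and 28.9 Ohm is my meter reading):

# Nameplate math vs. the cold measurement
V_RATED = 240.0    # volts AC, from the label
P_RATED = 500.0    # watts, from the label
R_MEASURED = 28.9  # ohms, Fluke reading at room temperature

i_rated = P_RATED / V_RATED   # I = P/V, about 2.08 A
r_rated = V_RATED / i_rated   # R = V/I, about 115.2 ohms (same as V^2/P)

print(f"Rated current:   {i_rated:.2f} A")
print(f"Hot resistance:  {r_rated:.1f} ohm (implied by the rating)")
print(f"Cold resistance: {R_MEASURED:.1f} ohm (measured)")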

Can anyone explain this difference?

Thanks
 

MikeML

Joined Oct 2, 2009
5,444
The resistance of most heater elements (and lamp filaments), especially those with metallic elements, is a function of the element's temperature. It's not unusual to see a 400% difference between room temperature and operating temperature.
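Putting numbers on that with the figures from your post (a rough sketch that treats the nameplate value as the hot resistance):

R_COLD = 28.9             # ohms, your room-temperature meter reading
R_HOT = 240.0**2 / 500.0  # ohms, implied by the 240 V / 500 W rating (~115.2)

ratio = R_HOT / R_COLD
print(f"Hot/cold resistance ratio: {ratio:.1f}x")  # ~4.0x, i.e. roughly a 400% change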
 

cork_ie

Joined Oct 8, 2011
428
Most metals and alloys have a resistance (the more correct term is resistivity) that increases with temperature. The rate of this increase is termed the temperature coefficient of resistivity.

The resistance of your heater elements will be considerably lower at room temperature than at their designed running temp.
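The usual first-order model is R(T) = R0 * (1 + alpha * (T - T0)). A minimal Python sketch, assuming typical handbook values for alpha (not measured for your particular heater):

def resistance_at(r0, alpha, t, t0=20.0):
    # First-order model: R(T) = R0 * (1 + alpha * (T - T0))
    return r0 * (1.0 + alpha * (t - t0))

# Ballpark temperature coefficients (per deg C); assumed, not measured
ALPHA_TUNGSTEN = 0.0045   # lamp filament: resistance climbs steeply
ALPHA_NICHROME = 0.0004   # common heater alloy: nearly flat

r0 = 28.9  # ohms, the cold reading from the first post
for name, alpha in (("tungsten", ALPHA_TUNGSTEN), ("nichrome", ALPHA_NICHROME)):
    print(f"{name}: {resistance_at(r0, alpha, 800.0):.1f} ohm at 800 C")

Note how strongly the answer depends on the alloy: a tungsten-like element would show roughly the 4x change, while a nichrome-like one barely moves.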
 

Thread Starter

superway

Joined Oct 19, 2009
125
The resistance I measured with the Fluke meter is 29.8 Ohm, so I am thinking this is actually a 120 VAC element, not 240 VAC as labeled on the heater. So I applied 120 V to the heater and measured an AC current of 3.8 A; the heater got hot very fast. I measured the resistance again and it was still 29.8 Ohm, with no increase at all.
If the element were rated for 240 VAC, then with 120 VAC applied it would draw a comparatively low AC current.
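Here is the arithmetic from that 120 V test, as a quick sketch (3.8 A is what I measured):

V_TEST = 120.0  # volts applied
I_TEST = 3.8    # amps measured under load

r_under_load = V_TEST / I_TEST  # ~31.6 ohm, close to the 29.8 ohm cold reading
p_test = V_TEST * I_TEST        # ~456 W dissipated at half the rated voltage

print(f"Resistance under load: {r_under_load:.1f} ohm")
print(f"Power at 120 V:        {p_test:.0f} W")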

Thanks
 