Ceramic Strip Heater

Discussion in 'General Electronics Chat' started by superway, Sep 6, 2012.

  1. superway

    Thread Starter Active Member

    Hi,

    I have a couple of strip heaters rated 240 VAC, 500 W.
    Since the voltage and power are given, the maximum current should be about 2.08 A (I = P/V = 500/240), and the resistance calculated from the rating is about 115 Ohm (R = V/I). Is that right?
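
    As a quick sanity check, here is that arithmetic as a short Python snippet (the values are simply the ones from the rating label):

    # Current and resistance implied by the 240 VAC / 500 W rating.
    V = 240.0  # rated voltage, VAC
    P = 500.0  # rated power, W

    I = P / V  # rated current
    R = V / I  # hot resistance implied by the rating (same as V**2 / P)

    print(f"I = {I:.2f} A")    # ~2.08 A
    print(f"R = {R:.1f} Ohm")  # ~115.2 Ohm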

    But when I measure the resistance of this heater with a Fluke meter, I get 28.9 Ohm, which is quite different from the value calculated from the rating.

    Can anyone explain this difference?

    Thanks
     
  2. paulktreg

    Distinguished Member

    Electrical resistance increases with temperature, and the elements may very well rise to 115 Ohm when hot.
     
  3. MikeML

    AAC Fanatic!

    The resistance of most heater elements (and lamp filaments), especially those with metallic elements, is a function of the element's temperature. It is not unusual to see a 400% difference between room temperature and operating temperature.
     
  4. cork_ie

    Member

    Most metals and alloys have a resistance (the more correct term is resistivity) that increases with temperature. The rate of this increase is termed the temperature coefficient of resistivity.

    The resistance of your heater elements will be considerably lower at room temperature than at their designed running temperature.
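
    As a rough illustration, here is the linear approximation R(T) = R0 * (1 + alpha * (T - T0)) in Python. The coefficient below is an assumed, generic value for a tungsten-like metallic element, not a figure for this particular heater:

    # Linear approximation of resistance rise with temperature.
    # alpha is an ASSUMED generic coefficient (tungsten-like metallic
    # element); the actual element material of this heater is unknown.
    R0 = 28.9       # measured cold resistance, Ohm (from the first post)
    T0 = 20.0       # room temperature, deg C
    alpha = 4.5e-3  # assumed temperature coefficient, 1/deg C

    def R(T):
        return R0 * (1 + alpha * (T - T0))

    # Temperature at which this element would reach the rated 115 Ohm:
    T_hot = T0 + (115.0 / R0 - 1) / alpha
    print(f"~{T_hot:.0f} deg C")  # roughly 680 deg C with these assumptions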
     
  5. superway

    Thread Starter Active Member

    The resistance I measured with the Fluke meter is 29.8 Ohm, so I am thinking this heater is actually rated for 120 VAC, not 240 VAC as labeled. I applied 120 V to the heater and measured an AC current of 3.8 A; the heater heated up very quickly. I measured the resistance again and it is still 29.8 Ohm, with no increase at all.
    If the heater were rated for 240 VAC, then applying 120 VAC should produce a much lower AC current.
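
    For comparison, a sketch of the expected current at 120 V under each hypothesis, using only figures already quoted in this thread:

    # Expected current at 120 VAC under the two hypotheses above.
    V_test = 120.0

    # Hypothesis A: the label is right (240 V / 500 W, hot R ~115 Ohm).
    R_hot = 240.0**2 / 500.0
    print(f"A: {V_test / R_hot:.2f} A")   # ~1.04 A

    # Hypothesis B: the resistance stays near the measured 29.8 Ohm.
    R_meas = 29.8
    print(f"B: {V_test / R_meas:.2f} A")  # ~4.03 A

    # The measured 3.8 A is much closer to hypothesis B, which is why
    # the 115 Ohm hot-resistance figure looks suspect here.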

    Thanks
     