Light bulb wattage

Discussion in 'General Electronics Chat' started by electronewb, Oct 13, 2012.

  1. electronewb

    Thread Starter Member

    Apr 24, 2012
    260
    3
    What determines the wattage rating of a light bulb? The voltage is always the same, so is it the number of turns and the thickness of the filament, and how many amps go through the filament? Why can a 20W light bulb use the same circuitry as a 100W bulb?
     
  2. JohnInTX

    Moderator

    Jun 26, 2012
    2,341
    1,024
    Since voltage is the same, the thing that's controlled is the hot resistance of the filament. Lower resistance means more current and more watts and more light (and mostly more heat!). Since all of the lamps on a circuit are in parallel, they will draw the current dictated by their individual resistances. The wiring in the socket must support the total current draw but most household lighting circuits will supply 15-20 amps.
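
    The "currents just add in parallel" point can be sketched with a quick calculation. The bulb mix and the 120 V supply below are assumed values, purely for illustration:

    ```python
    # Lamps on one circuit are in parallel: each draws I = P / V on
    # its own, and the currents simply add. Compare the total to a
    # typical 15 A lighting breaker.
    V = 120.0                              # assumed supply voltage
    bulb_watts = [100, 100, 60, 60, 40]    # assumed mix of bulbs

    currents = [P / V for P in bulb_watts]  # I = P / V per bulb
    total = sum(currents)

    print(f"total draw: {total:.2f} A (breaker: 15 A)")
    ```

    Five ordinary bulbs barely scratch a 15 A circuit; it is the heat in the individual fixture, not the circuit capacity, that usually limits you.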

    Note that I said 'hot' resistance. When cold (i.e., the lamp is off), the resistance of the filament is much lower, resulting in an 'inrush' current of many times the 'hot' current. The copper wiring in your house can handle the transient, but the switches must be designed for it. All standard household switches are rated for incandescent loads.

    When you get away from the household switches you must look for a switch rated for incandescent loads that takes the inrush into account.
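
    The hot-resistance and inrush numbers are easy to put in concrete terms. This sketch assumes a 100 W bulb on 120 V, and a roughly 10:1 hot-to-cold resistance ratio, a commonly quoted ballpark for tungsten filaments:

    ```python
    # Rough numbers for a 100 W incandescent bulb on a 120 V supply.
    V = 120.0        # supply voltage (volts)
    P = 100.0        # rated power (watts)

    R_hot = V**2 / P            # hot filament resistance: 144 ohms
    I_hot = V / R_hot           # steady-state current: about 0.83 A

    # Tungsten's resistance rises steeply with temperature; a cold
    # filament is often quoted at roughly 1/10 of its hot resistance
    # (assumed ratio, for illustration).
    R_cold = R_hot / 10
    I_inrush = V / R_cold       # momentary inrush, ~10x steady current

    print(f"R_hot = {R_hot:.0f} ohms, I_hot = {I_hot:.2f} A")
    print(f"R_cold ~ {R_cold:.1f} ohms, inrush ~ {I_inrush:.1f} A")
    ```

    That ten-times spike only lasts until the filament heats up (a few milliseconds), which is why the wiring shrugs it off but switch contacts must be rated for it.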
     
  3. electronewb

    Thread Starter Member

    Apr 24, 2012
    260
    3
    So cold copper has less resistance than hot copper?

    Also, what about when a lamp says 60W max? Does that mean that if you put in a 100W bulb, or anything higher than the rating, the bulb will only produce 60W, or will the filament in the bulb melt?
     
  4. takao21203

    Distinguished Member

    Apr 28, 2012
    3,577
    463
    Humans really seem to be hardwired to accept authority: if something is printed, they believe it has to be true.

    Whether a lamp is made for a 60W or a 100W bulb really means nothing electrically.

    It might correlate with a heat-development (and safety-class) test that was performed. There is usually a rather large margin for that.

    If you can't estimate all the factors, use the printed rating only.
    If there is any remote fire risk, use the printed rating only.

    "turns and the thickness of the filament"

    That is actually correct.
     
  5. electronewb

    Thread Starter Member

    Apr 24, 2012
    260
    3
    I just realized something... I have a few work lights that I use in my garage: 2 are 500W and the other 2 are 1000W. So can I just always buy the 1000-watt bulbs instead of making sure I have the right wattage for each light?
     
  6. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    No.
    If you put a 1000W bulb in a lamp rated for only 500W, the lamp might smoke and then catch fire, or something near the lamp might.

    The same goes for putting a 100W bulb in a lamp rated for only 60W: the lamp, or something near it, might smoke and then catch fire.
     
  7. crutschow

    Expert

    Mar 14, 2008
    13,000
    3,229
    Yes, most metal conductors have a resistance that increases with temperature.

    If a lamp fixture says 60W max, it means that a higher-power lamp will create excess heat in the fixture and could cause a fixture failure or fire. If you put a 100W lamp in the fixture, it will draw 100W and operate normally, but it may overheat the fixture, so it should not be done.

    A particular bulb will draw whatever its rated power is at its rated voltage.
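
    The temperature dependence mentioned above is often modeled with a linear approximation, R(T) = R0 * (1 + alpha * (T - T0)). The alpha value below is a commonly quoted handbook figure for tungsten near room temperature, and the relation is only roughly linear over the huge range a lit filament covers, so treat this strictly as a ballpark sketch:

    ```python
    # Linear resistance-temperature model for a metal conductor.
    alpha = 0.0045      # per deg C, approx. tungsten coefficient
    R0 = 14.4           # ohms, assumed cold filament resistance
    T0 = 20.0           # deg C, room temperature

    def resistance(T):
        """Filament resistance at temperature T (deg C), linear model."""
        return R0 * (1 + alpha * (T - T0))

    # Around a filament temperature of 2500 deg C, the model gives
    # roughly 12x the cold resistance - the right order of magnitude
    # for the cold/hot ratio behind inrush current.
    print(f"cold: {resistance(20):.1f} ohms, hot: {resistance(2500):.1f} ohms")
    ```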
     
  8. takao21203

    Distinguished Member

    Apr 28, 2012
    3,577
    463
    The 500W enclosures will very likely overheat, if the 1000W sticks fit into them at all. Doubling the wattage isn't a trivial increase of power.

    I have done things like squeezing 3x 12V halogen bulbs into a small enclosure meant for 150W sticks. The enclosure did not heat up to a degree that would pose a fire risk, but they get very hot anyway under normal conditions.

    If you do fit 1000W sticks into a 500W enclosure, you may lose insurance coverage if there is a fire.

    A 100W bulb in a 60W lamp depends on the ventilation. You really need to observe it for a while and consider the construction of the lamp.
     
  9. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    A show on TV said that fire departments find many house fires caused by an old-fashioned incandescent light bulb in an over-stuffed closet. The bulb was 60W in a fixture rated for 60W, but there was nothing to keep the clothes or plastic from touching the very hot bulb, and there was no ventilation to cool it.

    Use a modern and much cooler compact fluorescent light bulb instead.
     
  10. takao21203

    Distinguished Member

    Apr 28, 2012
    3,577
    463
    Yes, that is what I mean: people automatically put a lot of authority in printed ratings. It also applies in the negated sense, as explained above.

    You can in many cases exceed the rating without any risk.
    And on the other hand, there can be concrete situations where even if you follow the printed rating, it will result in a considerable fire risk.

    So what I say is: examine the actual situation very carefully. The heat actually developed, the temperatures, the materials, and the ventilation are what matter. The printed rating is just nominal.
     
  11. electronewb

    Thread Starter Member

    Apr 24, 2012
    260
    3
    What if I put a 500W bulb in the 1000W lamp? Will the bulb light to its max rating, or will it go way over and pop the bulb?
     
  12. Ron H

    AAC Fanatic!

    Apr 14, 2005
    7,050
    657
    Yes, but incandescent lamp filaments are typically made of tungsten, which has a much higher melting temperature than copper.
     
  13. Ron H

    AAC Fanatic!

    Apr 14, 2005
    7,050
    657
    No, a 500 watt bulb, connected to rated voltage, will always dissipate 500W, regardless of the socket it is plugged (or screwed) into.
    Higher wattage bulbs get hotter than lower wattage bulbs. The wattage rating of a lamp indicates the maximum wattage bulb you can install in it without incurring possible damage to the lamp, or even fire.
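
    In other words, the fixture rating is a maximum, not a target, and the bulb's own rating decides what it draws. A minimal sketch of that rule (the function name and values are just for illustration):

    ```python
    def safe_in_fixture(bulb_watts, fixture_max_watts):
        """A fixture rating is an upper limit: any bulb at or below it
        is fine, and the bulb still draws only its own rated watts."""
        return bulb_watts <= fixture_max_watts

    # A 500 W bulb in a 1000 W lamp: fine (and it still draws 500 W).
    print(safe_in_fixture(500, 1000))
    # A 1000 W bulb in a 500 W lamp: overheating risk.
    print(safe_in_fixture(1000, 500))
    ```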
     
  14. takao21203

    Distinguished Member

    Apr 28, 2012
    3,577
    463
    The general public's knowledge of how electricity actually works is surprisingly low.

    I have encountered questions like that quite a few times.
     
  15. Ron H

    AAC Fanatic!

    Apr 14, 2005
    7,050
    657
    IMHO, volts, amps, and watts are a mystery to at least 99% of the adult population.
     