Do more amps mean more heat?

Discussion in 'General Electronics Chat' started by rajat1684, Aug 26, 2013.

  1. rajat1684

    Thread Starter New Member

    Dear All,

    I am a newbie here.

    I have always believed that more amps mean more heat. For example:

    1) MR 16 Halogen Spot lamp 12 V 35 WATT

    P=VI ; P = 55 Watt, V = 12 V
    Hence,
    I = 55/12 = 4.58 Amps

    2) H3 Halogen head lamp 12 V 55 WATT

    P = VI; P = 35 Watt, V= 12 V
    Hence,
    I = 35/12 = 2.91 Amps

    I believed that the current in a halogen lamp was mainly determined by its resistance: the more the resistance, the more the heat.

    So this means the 'MR16' should be hotter than the 'H3'.

    This was just my basic understanding; I may be wrong.

    If I am, I would appreciate it if someone could guide me to the right answer.

    Many thanks.
     
  2. #12

    Expert

    You have your labels backwards.

    I can say that when the voltage is the same for both lamps, the higher current, the higher wattage, and the higher heat all belong to the same lamp. The higher resistance belongs to the other one.
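
    Plugging each lamp's printed wattage into I = P/V (and R = V²/P for the hot-filament resistance), a quick sketch of the corrected arithmetic:

```python
# I = P / V and R = V^2 / P for each lamp, using the printed wattages.
lamps = {
    "MR16 halogen spot lamp": {"P": 35.0, "V": 12.0},  # 12 V, 35 W
    "H3 halogen head lamp": {"P": 55.0, "V": 12.0},    # 12 V, 55 W
}

for name, spec in lamps.items():
    current = spec["P"] / spec["V"]           # amps
    resistance = spec["V"] ** 2 / spec["P"]   # hot-filament ohms
    print(f"{name}: I = {current:.2f} A, R = {resistance:.2f} ohm")
```

    At the same 12 V, the 55 W H3 draws the higher current (4.58 A vs 2.92 A) and has the lower hot resistance, as the reply above says.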
     
  3. mcgyvr

    AAC Fanatic!

    It's not really the amps.
    It's the watts (being dissipated) that create the heat.

    P = EI
    100 A at 1 V = 100 W
    but only 10 A at 50 V = 500 W = more heat
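
    A trivial sketch checking those two figures:

```python
# Same formula P = E * I; the heat tracks the product, not the amps alone.
def power_watts(volts, amps):
    return volts * amps

print(power_watts(1, 100))   # 100 A at 1 V -> 100 W
print(power_watts(50, 10))   # 10 A at 50 V -> 500 W, more heat from fewer amps
```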
     
  4. LDC3

    Active Member

    You could start by using the correct numbers in the equations.
     
  5. WBahn

    Moderator

    If you push 1 A through a 1000 Ω resistor you will generate more heat than if you push 10 A through a 1 Ω resistor. So there is more to it than just "more amps" means "more heat". As mentioned above, it is the power dissipation that counts, which is the product of the voltage across something and the current through that same thing. Now, if you hold the voltage constant, then increasing the current increases the power dissipation.

    If you hold the voltage constant and increase the resistance, then the power dissipation will drop because the current is dropping. But if you hold the current constant and increase the resistance, then the power dissipation will rise because the voltage is increasing. So, here too, it is more than just simply saying that if the resistance goes up the heat generated does this or that.
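
    Both points can be checked numerically (a sketch using the figures above):

```python
# P = I^2 * R: the 1 A / 1000 ohm case dissipates more than 10 A / 1 ohm.
print(1 ** 2 * 1000)   # 1000 W
print(10 ** 2 * 1)     # 100 W

# Hold voltage constant: more resistance -> less current -> less power.
V = 12.0
for R in (1.0, 2.0, 4.0):
    print(R, "ohm:", V ** 2 / R, "W")   # P = V^2 / R falls as R rises

# Hold current constant: more resistance -> more voltage -> more power.
I = 2.0
for R in (1.0, 2.0, 4.0):
    print(R, "ohm:", I ** 2 * R, "W")   # P = I^2 * R rises as R rises
```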
     
    Last edited: Aug 27, 2013
  6. LDC3

    Active Member

    You need to check your math. I get the same number for both.
    P = I^2*R
     
  7. #12

    Expert

    Smile when you say that, pardner.:D
     
  8. moeburn

    Member


    No, usually more WATTAGE means more heat, all other specifications being equal. WATTS = VOLTS x AMPS

    For example, if you take a light bulb and put 20 watts into it, it will dissipate roughly twice as much heat as when you put 10 watts into it.

    HOWEVER, you seem to be comparing different light bulbs! A 50 watt light bulb from one manufacturer might run JUST AS COOL as a 30 watt light bulb from another, because the 50 watt bulb might be more EFFICIENT at converting its watts into light!

    If the light bulb were 100% efficient, you could put as many watts into it as you wanted and it would NEVER get hot, because it would be converting all those watts into light! But light bulbs are not perfect, and some watts are lost as heat along the way. That's why a 20 watt LED bulb is SO MUCH BRIGHTER than a 20 watt regular bulb, and yet SO MUCH COOLER: LED bulbs are more efficient than regular bulbs!
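
    The heat-versus-efficiency point can be sketched like this (the efficiency figures are invented for illustration, not real bulb specs):

```python
# Heat dissipated = input power * (1 - fraction converted to light).
def heat_watts(p_in, efficiency):
    return p_in * (1.0 - efficiency)

# Hypothetical efficiencies, for illustration only.
print(heat_watts(20.0, 0.30))  # "LED-like" bulb: ~14 W of the 20 W ends up as heat
print(heat_watts(20.0, 0.05))  # "incandescent-like" bulb: ~19 W ends up as heat
```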
     
  9. WBahn

    Moderator

    Thanks for catching the typo. Originally I had the 1Ω resistor as a 0.1Ω resistor and decided to keep all the numbers as integers, so I increased the 0.1Ω to 1Ω and the 100Ω to 1000Ω, but apparently I didn't get the other 0 in there and since I am still seeing ghost images with my right eye I didn't catch it.
     
  10. rajat1684

    Thread Starter New Member

    Thank you all for your replies.

    Apologies for the late reply.

    I have learned that the formula I should be using is P = I^2*R, and also that more power equals more heat, as shown by mcgyvr.
    I understand that no two bulbs are the same. Hypothetically, to understand the concept, assume these bulbs have the same properties, i.e.
    AC bulb:
    P = 55 Watt with V = 240 Volt, hence I = 55/240 = 0.23 Amps
    P = I^2*R, thus R = 55/(0.23^2) = 1039.7 Ohms
    DC bulb:
    P = 55 Watt with V = 12 Volt, hence I = 55/12 = 4.58 Amps
    P = I^2*R, thus R = 55/(4.58^2) = 2.6214 Ohms
    The resistance is higher in the 'AC' bulb compared to the 'DC' bulb. Does that mean the 'AC' bulb will get hotter than the 'DC' bulb?
    I would appreciate any comments.
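
    The same arithmetic can be done without rounding the current first, since R = V²/P follows directly from P = VI and P = I²R (the small difference from the 1039.7 Ω above comes from rounding I to 0.23 A):

```python
# R = V^2 / P: combine P = V * I and P = I^2 * R, no rounded current needed.
def bulb(p_watts, volts):
    current = p_watts / volts
    resistance = volts ** 2 / p_watts
    return current, resistance

for label, volts in (("AC bulb (240 V)", 240.0), ("DC bulb (12 V)", 12.0)):
    current, resistance = bulb(55.0, volts)
    print(f"{label}: I = {current:.3f} A, R = {resistance:.1f} ohm")

# Both bulbs dissipate the same 55 W, so the higher resistance by itself
# does not make the AC bulb run hotter.
```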
     
  11. WBahn

    Moderator

    Power is energy per time.

    There are many kinds of energy. Heat (thermal energy) is just one of them. So it is entirely possible to have System A and System B each consume the same amount of electrical power but have System A get very hot while System B stays cool. All this means is that more of the electrical energy consumed by System A went into heat, while more of the electrical energy consumed by System B went into things other than heat. By extension, then, it is entirely possible to have one system consume considerably less electrical energy than another while getting considerably hotter.

    Now, if we assume that the same fraction of the consumed electrical energy gets converted to heat in the two bulbs, then if they are operating at the same power they will be dissipating the same amount of heat. But heat is not the same thing as temperature. If one bulb is able to dump its heat into the surroundings more easily than the other, then it will not get as hot. Similar to before, it is entirely possible for one bulb to dissipate more heat but remain cooler than another bulb if it can shed that heat quicker.

    The resistance of the two bulbs doesn't come into it. If you are feeding the same power into them, then it is a matter of what fraction of that power is converted to heat and how fast that heat can be carried off into the surroundings.

    Think of it like this: Imagine two pipes standing vertically. You pour water into them at a certain rate, measured in gallons per minute. The total amount of water you have poured into a pipe is how much heat you have dumped into it. The height of the water within the pipe is its temperature. If the pipes are identical, then the temperatures of each will always be the same. Now imagine drilling holes in the side of the pipe near the bottom. As the water (heat) flows in at the top, water (heat) is also flowing out through the holes. The higher the water level (temperature), the faster the water flows out. At some point, the water level reaches a point where the water is flowing out at the same rate that it is flowing in, and the water level stops rising. Now imagine drilling more holes, but in only one of the pipes. Now, for the same water level, the outflow will be greater. This means that the level at which the flows balance will be lower, and this pipe will stabilize at a lower level (temperature). Up to a point, we can increase the flow rate (power) into this pipe while keeping the level below the equilibrium level that the other pipe stabilized at. Thus, we are dumping more energy into the pipe, but the temperature does not get as hot.
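
    The pipe picture corresponds to a simple lumped thermal model (an illustrative sketch; the conductance numbers are invented): temperature climbs until the heat flowing out, proportional to the rise above ambient, balances the power flowing in.

```python
# Equilibrium of a lumped thermal model: P_in = G * (T_eq - T_amb),
# so T_eq = T_amb + P_in / G. More "holes" means a larger conductance G.
def equilibrium_temp(p_in, conductance, t_ambient=25.0):
    return t_ambient + p_in / conductance

print(equilibrium_temp(10.0, 0.10))  # pipe A: stabilizes at 125.0 C
print(equilibrium_temp(10.0, 0.20))  # pipe B, twice the outflow: 75.0 C
print(equilibrium_temp(15.0, 0.20))  # pipe B with MORE power in: 100.0 C, still cooler than A
```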
     
  12. moeburn

    Member


    If the two bulbs are equal in all other ways, they should be the same temperature, because they are drawing the same wattage.

    High temperature doesn't just come from lots of amps, and it doesn't just come from resistance. Temperature comes from lots of amps AND volts at the same time.

    Also, someone more knowledgeable about AC than I am should correct this, but I don't think you can just use 240 V in equations involving a 240 V AC outlet; there's this thing called "root-mean-square" that makes calculations involving AC more complicated.
     
  13. #12

    Expert

    The 240 volts coming out of the wall outlet is already quoted as an RMS value, precisely because that is convenient. (It makes life so much easier for the electricians.)
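
    For a sinusoidal mains waveform, the quoted RMS value relates to the peak by a factor of √2, and RMS is defined exactly so that the DC power formulas still work (a short sketch):

```python
import math

V_rms = 240.0
V_peak = V_rms * math.sqrt(2)        # ~339.4 V peak for 240 V RMS mains
print(f"peak voltage: {V_peak:.1f} V")

# RMS is defined so the average power into a resistor is V_rms**2 / R,
# exactly as for DC, which is why 240 V can be used directly in the formulas.
R = 1047.3   # ohms, roughly the 55 W / 240 V bulb discussed earlier
print(f"average power: {V_rms ** 2 / R:.1f} W")
```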
     