Will a DC bulb generate more heat than an AC bulb?

Discussion in 'General Electronics Chat' started by rajat1684, Aug 27, 2013.

1. rajat1684 Thread Starter New Member

Aug 26, 2013
13
0

I learned that the formula I should be using is P = I^2 * R, and also that more power equals more heat, as mcgyvr pointed out.
I understand that no two bulbs are the same; hypothetically, to understand the concept, assume these bulbs have the same properties, i.e.
AC bulb:
P = 55 W at V = 240 V, hence I = 55/240 = 0.23 A
P = I^2 * R, thus R = 55/(0.23^2) ≈ 1039.7 ohms (≈ 1047.3 ohms using the unrounded current)
DC bulb:
P = 55 W at V = 12 V, hence I = 55/12 = 4.58 A
P = I^2 * R, thus R = 55/(4.58^2) ≈ 2.62 ohms
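A quick sketch of the arithmetic above in Python (the helper name `bulb_stats` is just illustrative). Computing R as V^2/P avoids the small rounding error that comes from using I = 0.23 A:

```python
# Resistance implied by a bulb's rated power and voltage (R = V^2 / P),
# plus the current it draws (I = P / V). The 55 W / 240 V and
# 55 W / 12 V figures come from the posts above.

def bulb_stats(power_w, voltage_v):
    current_a = power_w / voltage_v            # I = P / V
    resistance_ohm = voltage_v ** 2 / power_w  # R = V^2 / P
    return current_a, resistance_ohm

for label, v in [("AC bulb (240 V)", 240.0), ("DC bulb (12 V)", 12.0)]:
    i, r = bulb_stats(55.0, v)
    print(f"{label}: I = {i:.2f} A, R = {r:.1f} ohm, P = I^2*R = {i*i*r:.1f} W")
```

Note that both bulbs, despite very different resistances, dissipate the same 55 W.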
Resistance is more in AC as compare DC  Does that mean that AC bulb will get more hotter then DC bulb.

2. rajat1684 Thread Starter New Member

Thanks WBahn. I am considering this as a hypothetical case where, say, the bulbs are made by the same manufacturer, and assuming that most of the energy of the bulb gets converted to heat.

3. Brownout Well-Known Member

Jan 10, 2012
2,375
998
Nope. Any bulb will consume a certain amount of power (55 watts in this example) and convert a portion of that power to light and a portion to heat. If we assume that both bulbs convert the same portion of the 55 watts to heat, then they would be the same temperature, given they have identical heat properties (temp coefficients, etc)

4. wayneh Expert

Sep 9, 2010
12,112
3,039
You've set the problem up to answer the question, by constraining the power rating to the same 55W value. Using the assumptions given by Brownout, the two must perform essentially identically.

But a 55W bulb designed for 220VAC will glow dimly if at all when 12VDC is applied to it, because of the higher resistance. Try it! The filament will be cooler and thus the resistance will be less than when supplied 220VAC, but it still will not do more than glow.

Jul 18, 2013
10,534
2,369
It sounds to me like an 'apples and oranges' question.
Whereas: 'For a certain resistive load, the power dissipated at a given DC voltage will be identical when the load is powered by an AC voltage of the same RMS value.'
Max.

6. rajat1684 Thread Starter New Member

Thanks guys. I understand that heat is proportional to the amount of power. As Brownout suggested, assuming the same heat properties, the heat generated by the AC bulb at 240 V will be the same as the heat generated by the DC bulb at 12 V.

What I was told was: more resistance, more heat. In this case I am struggling to understand how the high resistance of the AC bulb (approx 1000 ohms) can produce the same heat as the low resistance of the DC bulb (approx 3 ohms).

Apologies, and thanks for making it clear so far.

7. wayneh Expert

I like the falling water analogy for this one. Waterfall height is like voltage, flow is like...current flow. Resistance comes from pipe diameter.

To run your water wheel at a given power level, you need either a small flow of water falling from a great height or a lot of water falling from a small height. You need a bigger pipe (less resistance) to get more current at a low pressure/voltage.

8. Brownout Well-Known Member

Because heat doesn't care about resistance. It cares about power. If P is power and P = I^2 * R or P = V^2 / R, then P can stay constant while R changes, so long as the I^2 or V^2 factor scales to match.
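Brownout's point can be sketched numerically: pick any resistance, choose the voltage as V = sqrt(P * R), and the dissipated power comes out the same. The two resistances below are the bulb values from earlier in the thread:

```python
import math

# Power stays at 55 W for any resistance, provided the voltage
# scales as V = sqrt(P * R).
P = 55.0
for r in [2.62, 1047.3]:
    v = math.sqrt(P * r)   # voltage needed to dissipate P in r
    i = v / r              # resulting current
    print(f"R = {r:7.2f} ohm -> V = {v:6.1f} V, I = {i:6.3f} A, "
          f"P = V^2/R = {v * v / r:.1f} W")
```

Note how the required voltages come out to roughly 12 V and 240 V, matching the two bulbs.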

9. mcgyvr AAC Fanatic!

Oct 15, 2009
4,769
969
Yep, again: it's the POWER dissipated (in watts) that creates the heat. Heat is directly related to that. It is NOT directly related to volts or amps or ohms alone, just to the combination of those that dissipates the watts.

It's WATTS, man... WATTS... nothing else is the right answer, no matter how many times the question is asked.

10. GopherT AAC Fanatic!

Nov 23, 2012
6,031
3,800
More resistance, more heat is a fallacy.

If you have 240 volts and run it through a 240 ohm resistor, you have one amp flowing, and 240 watts of heat generated by that current flowing through the resistor.

Do it again with a 120 ohm resistor at 240 volts and you have 2 amps flowing. That is 480 watts. Less resistance is more heat (assuming a purely resistive load supplied by perfect conductors which is a pretty fair estimate here).

There are many ways of looking at the comparison. Is the phrase you used related to the losses in supply wires? If that is the case, then with supply wire of the same diameter and length, you will lose more power as heat on a low-voltage system, because you are pushing more amps through the same wire and experience a greater voltage drop (power loss) than if you use 230 volts.
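The wire-loss point can be sketched with a quick calculation. The 0.1-ohm wire resistance here is an assumed illustrative value, not from the thread; the 55 W load is the bulb discussed above:

```python
# For the same delivered power, a low-voltage system pushes more
# current through the supply wires, and the wire loss I^2 * R_wire
# grows with the square of the current.
P_LOAD = 55.0   # watts delivered to the bulb
R_WIRE = 0.1    # ohms of supply-wire resistance (assumed value)

for v in [12.0, 230.0]:
    i = P_LOAD / v            # current drawn at this supply voltage
    loss = i ** 2 * R_WIRE    # power lost as heat in the wires
    print(f"{v:5.1f} V supply: I = {i:5.2f} A, wire loss = {loss:.3f} W")
```

At 12 V the wire loses around 2 W; at 230 V it loses only a few milliwatts.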

11. snowdrifter Member

Aug 13, 2013
43
1
Should be the same amount of heat. You proved that in your original post. Same amount of power

12. rajat1684 Thread Starter New Member

Thanks you all.

Interesting... I think I can understand now why more power (watts) means more heat (kJ).

Converting watts into temperature (Celsius)

Say 70% of 50 watts converts into heat (hypothetically)
i.e. 0.7 * 50 = 35 watts of heat

Bulb is on for 5 minutes = 60 * 5 = 300 sec

1 watt = 1 joule/sec, thus 35 * 300 = 10,500 joules = 10.5 kJ
Specific heat of dry air (Cv) = 0.716 kJ/(kg·K)
Thus,
10.5 kJ / 0.716 kJ/(kg·K) = 14.7 kg·K
Density of air = 1.3 kg/m^3

Thus 14.7 (kg·K) / 1.3 (kg/m^3) = 11.3 K·m^3

This means the temperature rise will be about 11.3 K per m^3 of dry air.

Converting kelvin to Celsius: a kelvin and a degree Celsius are the same size, so a temperature *difference* of 11.3 K is simply a rise of 11.3 °C (the 273 offset only applies to absolute temperatures, not differences).

Is it correct that the temperature rise will be about 11 degrees Celsius?
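As a check on the arithmetic, the air-heating estimate can be redone step by step. All inputs are the thread's hypothetical values (70% of 50 W to heat, 5 minutes, 1 m^3 of dry air at constant volume); note that 35 W x 300 s = 10,500 J = 10.5 kJ:

```python
# Temperature rise of 1 m^3 of dry air from a bulb's waste heat.
heat_fraction = 0.7     # assumed fraction of power converted to heat
power_w = 50.0          # watts (the post's hypothetical round number)
time_s = 5 * 60         # 5 minutes in seconds
cv_air = 716.0          # J/(kg*K), specific heat of dry air, constant volume
density_air = 1.3       # kg/m^3
volume_m3 = 1.0

energy_j = heat_fraction * power_w * time_s        # joules of heat
mass_kg = density_air * volume_m3                  # kg of air heated
delta_t = energy_j / (mass_kg * cv_air)            # rise in kelvin
print(f"Energy: {energy_j / 1000:.1f} kJ, rise: {delta_t:.1f} K "
      f"(= a rise of {delta_t:.1f} deg C)")
```

This sketch ignores heat lost through walls and so on; it is only an order-of-magnitude estimate.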

13. LDC3 Active Member

Apr 27, 2013
920
160
Using Ohm's Law (V = I * R), we can calculate power from 3 equations, depending on what we know:
P = I^2 * R
P = V * I
P = V^2 / R
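LDC3's three formulas can be cross-checked against the DC bulb numbers from earlier in the thread (12 V, roughly 4.58 A, roughly 2.62 ohms); all three must agree:

```python
# Three equivalent power formulas, derived from P = V * I and
# Ohm's law (V = I * R), applied to the 55 W / 12 V bulb.
v = 12.0
r = v ** 2 / 55.0        # resistance of the 55 W / 12 V bulb
i = v / r                # Ohm's law: I = V / R

p1 = i ** 2 * r          # P = I^2 * R
p2 = v * i               # P = V * I
p3 = v ** 2 / r          # P = V^2 / R
print(f"{p1:.1f} W, {p2:.1f} W, {p3:.1f} W")
```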