Why do small conductors create heat and potential fires?

Thread Starter

Loz2212

Joined Oct 9, 2018
28
Hi guys,

Why do small conductors create heat and potential fires?

My logic is flawed here somewhere.. surely if I had a light bulb connected to the supply by thin wires, the added resistance of the wires would add to the overall resistance and limit the current..?

If someone could explain that would be great.
 

cmartinez

Joined Jan 17, 2007
8,257
Max is right ... for example, resistors convert electrical power into heat as they restrict the current flowing through them. So if you apply too large a voltage to a resistor, a larger current will flow through it and a larger amount of heat will be produced. After all, what are an electric stove's heating elements other than very robust resistors?
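To put rough numbers on that, here's a quick Python sketch; the 1000 W / 120 V element rating is just an assumed example, not anything from this thread:

```python
# Heat dissipated by a fixed resistor at different applied voltages.
# Assumed example: a heating element rated 1000 W at 120 V.
V_RATED = 120.0           # volts (assumed)
P_RATED = 1000.0          # watts (assumed)
R = V_RATED**2 / P_RATED  # element resistance, 14.4 ohms

for v in (60.0, 120.0, 240.0):
    i = v / R             # Ohm's law: I = V / R
    p = v * i             # power: P = V * I (same as V**2 / R)
    print(f"V = {v:5.1f} V -> I = {i:6.2f} A, heat = {p:7.1f} W")
```

More voltage across the same resistance means more current and much more heat.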
 

crutschow

Joined Mar 14, 2008
34,468
The smaller the wire, the greater the wire resistance.
And the power dissipated in the wire is I²R where I is the current through the wire, and R is the wire resistance.
So if the wire is too small for the current going through it, it will overheat and possibly cause a fire.

That's why there are fuses/circuit-breakers in series with all house mains wiring, to interrupt the current if it exceeds the wire's current-handling capacity.
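To put rough numbers on the I²R point, here's a short Python sketch; the diameters, length, and current are illustrative only (roughly 14 AWG vs 24 AWG copper at a typical household current):

```python
import math

# "The smaller the wire, the greater the resistance", then P = I**2 * R.
# Copper resistivity ~1.68e-8 ohm*m; the length and current are illustrative.
RHO = 1.68e-8        # ohm*metre, copper
LENGTH_M = 6.0       # about 20 ft of wire
CURRENT_A = 15.0     # assumed load current

for diameter_mm in (1.6, 0.5):                  # roughly 14 AWG vs 24 AWG
    area_m2 = math.pi * (diameter_mm / 2e3)**2  # cross-sectional area in m**2
    r = RHO * LENGTH_M / area_m2                # R = rho * L / A
    p = CURRENT_A**2 * r                        # heat generated in the wire
    print(f"d = {diameter_mm} mm: R = {r:.3f} ohm, heat = {p:5.1f} W at {CURRENT_A} A")
```

Same current, roughly ten times the resistance in the thin wire, so roughly ten times the heat.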
 

cmartinez

Joined Jan 17, 2007
8,257
I get that, but surely a 1 ohm resistor produces more heat than a 1 kohm resistor. Please call me an idiot, just correct me in the process
Both resistors can produce the same amount of heat, but at different current levels. Just remember Ohm's law V=I*R, and for power it would be P = V*I (or P = I²*R, among other equivalents)
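As a quick sketch of that (the 100 W figure is just an arbitrary target):

```python
import math

# Two very different resistors producing the same heat, at different currents.
P_TARGET = 100.0  # watts, arbitrary example

for r in (1.0, 1000.0):
    i = math.sqrt(P_TARGET / r)  # from P = I**2 * R
    v = i * r                    # Ohm's law: V = I * R
    print(f"R = {r:6.0f} ohm -> I = {i:6.3f} A at V = {v:6.1f} V gives {P_TARGET:.0f} W")
```

The 1 Ω resistor needs 10 A to make 100 W; the 1 kΩ resistor makes the same 100 W at about 0.32 A.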
 

Thread Starter

Loz2212

Joined Oct 9, 2018
28
I'm still none the wiser, sorry guys. I just see that formula as saying that increasing the resistance decreases the current, keeping the wattage the same.. so surely it decreases in proportion to the conductor size. Missing a big piece here somewhere, I know lol.
 

WBahn

Joined Mar 31, 2012
30,077
I get that, but surely a 1 ohm resistor produces more heat than a 1 kohm resistor. Please call me an idiot, just correct me in the process
Let's look at some reasonable numbers for a real circuit.

Let's use a 120 V outlet and power a resistive load that, when connected directly, consumes 2000 W. That means that about 16.7 A of current is flowing and the load has a resistance of about 7.2 Ω. Since this device is DESIGNED to handle 2000 W, this is not a problem.

Now let's connect it with 10 feet (one way) of 14 AWG wire, which has about 2.5 Ω of resistance per 1000 ft, so our 20 ft of wire has about 0.05 Ω of resistance. The total circuit resistance is thus 7.25 Ω and the current is reduced very slightly to 16.55 A. The power being dissipated in the connecting wire is about 13.7 W.

Now let's use 20 ft of 24 AWG, which has about 25 Ω per 1000 ft, yielding 0.5 Ω of resistance. Now the total circuit resistance is 7.7 Ω and the current is reduced a bit more to 15.58 A, but the power dissipated in the connecting wire is now 121 W.

What you are missing is that the current is largely dictated by the load and doesn't change much as you increase the resistance of the wire it's being powered by. Thus, since the current remains largely the same, if you double the resistance you come close to doubling the heat generated.
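For anyone who wants to play with those numbers, here's a quick Python sketch of the same calculation (using the approximate resistance-per-1000-ft figures above):

```python
# The 2000 W / 120 V resistive load above, fed through 20 ft of wire
# of two different gauges (approximate copper resistances).
V_SUPPLY = 120.0
P_LOAD = 2000.0
R_LOAD = V_SUPPLY**2 / P_LOAD         # about 7.2 ohms

GAUGES_OHMS_PER_1000FT = {"14 AWG": 2.5, "24 AWG": 25.0}
LENGTH_FT = 20.0                      # 10 ft out plus 10 ft back

for gauge, ohms_per_kft in GAUGES_OHMS_PER_1000FT.items():
    r_wire = ohms_per_kft * LENGTH_FT / 1000.0
    i = V_SUPPLY / (R_LOAD + r_wire)  # series circuit current
    p_wire = i**2 * r_wire            # heat generated in the connecting wire
    print(f"{gauge}: I = {i:5.2f} A, wire dissipates {p_wire:6.1f} W")
```

The current barely changes between the two gauges, but the wire resistance goes up tenfold, so the heat in the wire goes up nearly tenfold too.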
 

crutschow

Joined Mar 14, 2008
34,468
I just see that formula as saying that increasing the resistance decreases the current, keeping the wattage the same.. so surely it decreases in proportion to the conductor size.
Your incorrect assumption is that the wire resistance is controlling the current.
But normally the wire resistance is selected to be much less than the load resistance, so it has little effect on the load current.
Thus for a given load current, a smaller wire will dissipate more power.
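A small sketch of that point, sweeping the wire resistance against a fixed load (the 7.2 Ω load is just the 2000 W / 120 V example from the post above):

```python
# Sweep the wire resistance: the load current barely moves,
# while the heat generated in the wire climbs almost linearly.
V_SUPPLY = 120.0
R_LOAD = 7.2  # assumed load, e.g. the 2000 W / 120 V appliance above

for r_wire in (0.05, 0.1, 0.25, 0.5, 1.0):
    i = V_SUPPLY / (R_LOAD + r_wire)  # current set mostly by the load
    p_wire = i**2 * r_wire            # power dissipated in the wire
    print(f"R_wire = {r_wire:4.2f} ohm: I = {i:5.2f} A, wire heat = {p_wire:6.1f} W")
```

Twenty times the wire resistance drops the current by only about 12%, but multiplies the heat in the wire by roughly fifteen.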
 