Questions on math

Thread Starter

Mark135

Joined Jan 26, 2022
3
What is the amount of power used by 500 ft of two-conductor 10 gauge wire supplying a 1500 W heater rated for 120 V?

I came up with 1340.63 W and would like a second opinion or some assistance.
 

Ya’akov

Joined Jan 27, 2019
9,079
Why are you using 12.5A in the second calculation?

Perhaps you should try to figure out the equivalent series resistance of the circuit and apply Ohm's law to that.
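A minimal sketch of that approach, assuming the heater behaves as a fixed resistance (an idealization; a real heating element's resistance rises as it warms up):

```python
# Equivalent-series-resistance approach: model the heater as a fixed
# resistor sized from its 1500 W / 120 V rating, in series with the wire.
V = 120.0          # supply voltage, volts
P_rated = 1500.0   # heater rated power at 120 V, watts
R_wire = 1.02      # AWG 10 copper: ~1.02 ohm per 1000 ft (500 ft x 2 conductors)

R_heater = V**2 / P_rated        # 120^2 / 1500 = 9.6 ohm
I = V / (R_heater + R_wire)      # series-circuit current, ~11.30 A
P_wire = I**2 * R_wire           # power lost in the wire, ~130.2 W
P_heater = I**2 * R_heater       # power delivered to the heater, ~1225.7 W

print(I, P_wire, P_heater)
```

Note the wire loss comes out noticeably lower than the 159.37 W figure obtained by assuming the full rated 12.5 A, because the series resistance reduces the actual current.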
 

Wolframore

Joined Jan 21, 2019
2,609
First, the symbol for current is "I", not "C".

10 ga will not be enough for the 1000 ft round trip.

Your error is in simply subtracting the power consumed by the wire's resistance. You will also get about a 12 V drop across the wire, so the calculation gets messy.
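A quick check of that drop as a first pass, using the heater's rated current (the true current, and thus the true drop, is somewhat lower once the drop itself is accounted for):

```python
# First-pass voltage drop, computed at the heater's rated current.
I_rated = 1500.0 / 120.0     # 12.5 A at rated conditions
R_wire = 1.02                # ohms for the 1000 ft round trip of AWG 10
V_drop = I_rated * R_wire    # ~12.75 V across the wire
V_heater = 120.0 - V_drop    # ~107.25 V left at the heater terminals
print(V_drop, V_heater)
```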
 

crutschow

Joined Mar 14, 2008
34,285
You need to include the resistance of the wire to determine the heater current, as it will draw less than its rated power due to the wire resistance.

And I think the question is asking for the power lost in the wire resistance.
 

Wolframore

Joined Jan 21, 2019
2,609
The problem that I see is the voltage drop; keep in mind the resistance of copper will go up as it warms up.
 

Ya’akov

Joined Jan 27, 2019
9,079
You need to include the resistance of the wire to determine the heater current, as it will draw less than its rated power due to the wire resistance.

And I think the question is asking for the power lost in the wire resistance.
Being homework I tried to hint at this, above, rather than simply say it.
 

MrAl

Joined Jun 17, 2014
11,396
AWG 10 is 1.02 ohm per 1000 ft; 500 ft x 2 conductors = 1000 ft
I = P/V: 1500 W / 120 V = 12.5 A
P = I^2 x R: 12.5^2 x 1.02 ohm = 159.37 W

1500 W - 159.37 W = 1340.63 W. Is this correct?
Yes, as others have pointed out, unless you are trying to keep it extremely simple you need to calculate the voltage drop too in order to determine the correct current draw.

A simple example:
For a heater that uses 1100 watts at 100 volts the current would be 1100/100=11 amps.
For a heater that uses 1100 watts at 100 volts but also has a 10/11 ohm (about 0.909 ohm) resistor in series with it, it will draw just 10 amps.
Now if we calculate the power in the wire without the resistor we would get:
Pw=11^2*(10/11)=110 watts
but if we include the effect of the voltage drop caused by the wire itself we would get:
Pw=10^2*(10/11)=90.9 watts.
So you see the power in the wire depends not only on the heater characteristics but also on the wire's own resistance. In most cases this extra resistance would have to be included in the calculation.

Of course, in the real world we would have to account for the effects of high line too, which would mean our original example voltage of 100 volts would rise to 115 volts for the high-line calculation. I doubt you have to do that here though.
 