Alright, I should be able to do this with no problem, but something about this particular question has me stumped.
A 14 kW load is supplied with wires that have a TOTAL resistance of 0.4 ohms. The source voltage is 120 VAC. How much power is lost in the wires alone? The answer is somewhere between 3000 and 8000 watts.
I have tried calculating the total current as (120 V)/(0.4 ohms) = 300 A, then using that to find the total power, since we know that in a series circuit the current is the same everywhere. So, total power = (300 A)(120 V) = 36,000 watts. Then I subtracted 14,000 watts from that to get 22,000 watts, but that's not right.
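In case it makes my attempt easier to follow, here is the same arithmetic as a quick Python sketch (all values are straight from the problem statement; the approach is just what I described above, which I already know gives the wrong answer):

```python
# Sketch of my attempt, using the values from the problem statement.
source_voltage = 120.0     # V, source voltage
wire_resistance = 0.4      # ohms, TOTAL wire resistance
load_power = 14_000.0      # W, power delivered to the load

# My attempt: treat the source as driving the wire resistance directly.
current = source_voltage / wire_resistance      # 300 A
total_power = current * source_voltage          # 36,000 W
wire_loss = total_power - load_power            # 22,000 W -- not the expected 3000-8000 W

print(f"current    = {current:.0f} A")
print(f"total power = {total_power:.0f} W")
print(f"wire loss   = {wire_loss:.0f} W")
```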
Anyway, I have tried a few other things, but I am stumped on this simple question. I think I'm thinking too hard!
Thanks for the help!
Here is a schematic:
[schematic image attached]