I²R losses

Thread Starter

andrew24

Joined Aug 20, 2008
76
Hello. I found quite an interesting exercise in the book "The Art of Electronics."
Here it is.

New York City requires about 10 gigawatts of electrical power, at 110 volts (10 million people averaging 1 kilowatt each). Let's calculate what would happen if we tried to supply the power through a cable 1 foot in diameter made of pure copper.
Its resistance is 0.05 micro-ohms per foot.
a) How much power is lost in the cable per foot?
b) Over what length of cable would you lose all 10 gigawatts?

My calculations:

I = 10 GW / 110 V ≈ 90.9 MA, so 1 foot of cable dissipates P = I²R ≈ 413 MW.
So I get that about 1 foot of cable is enough to lose all the power.
Am I right?
What's the solution to this? Maybe using a higher voltage?
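
A quick Python sketch of the arithmetic above (only the numbers from the exercise are used; the variable names are my own):

P_total = 10e9        # total load: 10 GW
V = 110.0             # supply voltage in volts
R_per_ft = 0.05e-6    # cable resistance in ohms per foot

I = P_total / V                    # current drawn: ~90.9 MA
P_loss_per_ft = I**2 * R_per_ft    # I²R loss in one foot: ~413 MW

print(f"I = {I/1e6:.1f} MA, loss per foot = {P_loss_per_ft/1e6:.0f} MW")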
 

mik3

Joined Feb 4, 2008
4,843
To dissipate 10 GW you need about 20 feet of wire.

To reduce heat dissipation you can use a higher-voltage system, so that the current is reduced for the same wattage requirement. Another solution is to use wires with less resistance per foot.
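
To see how strongly the voltage matters, here is a rough sketch comparing 110 V with 345 kV (a common transmission voltage, picked here purely for illustration), same load and cable as above:

P_total = 10e9       # 10 GW load
R_per_ft = 0.05e-6   # ohms per foot

for V in (110.0, 345e3):
    I = P_total / V              # current for the same delivered power
    loss = I**2 * R_per_ft       # I²R loss in one foot
    print(f"V = {V:>8.0f} V -> I = {I:.3g} A, loss/ft = {loss:.3g} W")

# ~413 MW per foot at 110 V, but only ~42 W per foot at 345 kV.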
 

Thread Starter

andrew24

Joined Aug 20, 2008
76
Thanks. How did you calculate that?
 

rjenkins

Joined Nov 6, 2005
1,013
Your own calcs show one foot of cable would dissipate 413 MW; that is roughly 1/20 of 10 GW.
(The losses would actually be somewhat higher due to 'skin effect', but that's another topic.)

As to voltages, think of why big overhead cable towers have the seriously massive ceramic 'pile of plates' insulators, and take it from there.
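
For part (b), the same simple approximation the thread is using (total power divided by the per-foot loss) gives the length directly:

P_total = 10e9         # watts
loss_per_ft = 413e6    # watts per foot, from part (a)

print(P_total / loss_per_ft)   # ~24 feet

So one foot burns off roughly 1/24 of the total, consistent with the "about 20 feet" and "roughly 1/20" estimates above.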
 