I²R losses

Discussion in 'Homework Help' started by andrew24, Aug 27, 2009.

  1. andrew24

    Thread Starter Active Member

    Aug 20, 2008
    76
    0
    Hello. I found quite an interesting exercise in the book "The Art of Electronics."
    Here it is.

    New York City requires about 10 gigawatts of electrical power, at 110 volts (10 million people averaging 1 kilowatt each). Let's calculate what would happen if we tried to supply that power through a cable 1 foot in diameter, made of pure copper.
    Its resistance is 0.05 microohms per foot.
    a) How much power is lost in the cable, per foot?
    b) Over what length of cable would you lose all 10 gigawatts?

    My calculations:

    I = 10 GW / 110 V ≈ 90.9 MA, so 1 foot of cable dissipates P = I²R = (90.9 MA)² × 0.05 μΩ ≈ 413 MW.
    So I get that about 1 foot of cable is enough to lose all the power.
    Am I right?
    What's the solution to this, maybe the use of a greater voltage?
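
    A quick check of these numbers in Python (just a sketch of the arithmetic, using the figures from the exercise):

        # Sanity check of the per-foot I²R loss
        P_total = 10e9        # total load: 10 GW
        V = 110.0             # supply voltage: 110 V
        R_per_ft = 0.05e-6    # cable resistance: 0.05 microohms per foot

        I = P_total / V                  # current drawn at 110 V
        P_loss_per_ft = I**2 * R_per_ft  # I²R dissipation in one foot

        print(f"I = {I / 1e6:.1f} MA")                       # -> 90.9 MA
        print(f"P per foot = {P_loss_per_ft / 1e6:.0f} MW")  # -> 413 MW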
     
  2. mik3

    Senior Member

    Feb 4, 2008
    4,846
    63
    To dissipate all 10 GW you need about 24 feet of wire.

    To reduce the heat dissipation you can use a higher-voltage system: the current is reduced for the same wattage requirement, and since the loss goes as I², it falls with the square of the voltage. Another solution is to use wires with less resistance per foot.
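
    A rough illustration of the voltage scaling (same 10 GW load and 0.05 μΩ/ft cable; the 345 kV figure is just a typical transmission-line voltage, not part of the exercise):

        # Per-foot loss vs. supply voltage for a fixed 10 GW load
        P_total = 10e9
        R_per_ft = 0.05e-6

        for V in (110.0, 345e3):      # 110 V vs. a typical 345 kV line
            I = P_total / V           # current needed at this voltage
            loss = I**2 * R_per_ft    # I²R loss in one foot of cable
            print(f"{V:>9.0f} V: {loss:.3g} W/ft")

    Going from 110 V to 345 kV divides the current by about 3100, so the per-foot loss drops by (345000/110)² ≈ 10 million: from roughly 413 MW/ft down to about 42 W/ft.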
     
  3. andrew24

    Thread Starter Active Member

    Aug 20, 2008
    76
    0
    Thanks. How did you calculate that?
     
  4. rjenkins

    AAC Fanatic!

    Nov 6, 2005
    1,015
    69
    Your own calcs show one foot of cable would dissipate 413 MW; that is roughly 1/24 of 10 GW, so the whole 10 GW is gone in about 24 feet.
    (The losses would actually be somewhat higher due to 'skin effect', but that's another topic.)

    As to voltages, think of why big overhead cable towers have the seriously massive ceramic 'pile of plates' insulators and take it from there.
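
    The whole chain in code (a one-off sketch, reusing the exercise's figures):

        # Length of cable that dissipates the entire 10 GW
        P_total, V, R_per_ft = 10e9, 110.0, 0.05e-6
        P_per_ft = (P_total / V)**2 * R_per_ft   # ~413 MW lost per foot
        print(P_total / P_per_ft, "feet")        # -> ~24.2 feet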
     