Hello. I found quite an interesting exercise in the book "The Art of Electronics." Here it is: New York City requires about 10 gigawatts of electrical power, at 110 volts (10 million people averaging 1 kilowatt each). Let's calculate what would happen if we tried to supply the power through a cable one foot in diameter made of pure copper; its resistance is 0.05 microohms per foot. a) How much power is lost in the cable per foot? b) Over what length of cable would you lose all 10 gigawatts?

My calculations: I = 10 GW / 110 V ≈ 90.9 MA, so one foot of cable dissipates P = I²R ≈ 413 MW. Dividing, 10 GW / 413 MW ≈ 24 feet, so I get that about 24 feet of cable is enough to lose all the power. Am I right? And what's the solution to this, maybe the use of a greater voltage?
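The arithmetic above can be sanity-checked with a few lines of Python. The last part also tries the "greater voltage" idea: the 115 kV figure is just an assumed transmission-line voltage for illustration, not something from the book.

```python
# Back-of-the-envelope check of the Art of Electronics exercise:
# 10 GW delivered at 110 V through copper cable of 0.05 micro-ohms/foot.
P_total = 10e9           # total power demand, watts (10 GW)
V = 110.0                # distribution voltage, volts
R_per_ft = 0.05e-6       # cable resistance, ohms per foot

I = P_total / V                    # current drawn: ~90.9 MA
P_loss_per_ft = I**2 * R_per_ft    # I^2 R loss per foot: ~413 MW
L = P_total / P_loss_per_ft        # feet of cable to dissipate all 10 GW

print(f"I = {I:.3e} A")
print(f"loss per foot = {P_loss_per_ft:.3e} W")
print(f"length to lose it all = {L:.1f} ft")

# Since I = P/V, the loss per foot scales as 1/V^2.
# Try an assumed 115 kV transmission voltage for comparison:
V_hv = 115e3
P_hv_per_ft = (P_total / V_hv)**2 * R_per_ft
print(f"loss per foot at 115 kV = {P_hv_per_ft:.0f} W")
```

Raising the voltage by a factor of ~1000 cuts the resistive loss per foot by a factor of ~10⁶, which is exactly why real grids transmit at high voltage and step down near the consumer.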