I'm asking about electrical power transmission; here is a reference from Wikipedia: http://en.wikipedia.org/wiki/Electric_power_transmission

*Losses*

*Transmitting electricity at high voltage reduces the fraction of energy lost to Joule heating. For a given amount of power, a higher voltage reduces the current and thus the resistive losses in the conductor. For example, raising the voltage by a factor of 10 reduces the current by a corresponding factor of 10 and therefore the losses by a factor of 100, provided the same sized conductors are used in both cases.*
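The scaling claim in the quote is easy to check numerically: hold the delivered power P fixed, raise the voltage by 10, and the current drops by 10, so the I²R line loss drops by 100. A minimal sketch (the values of R and P are hypothetical, chosen only for illustration):

```python
# Check the quoted scaling claim: for a fixed delivered power P,
# the current is I = P / V, and the line loss is I^2 * R (Joule heating).
R = 10.0   # line resistance in ohms (hypothetical value)
P = 1e6    # power delivered in watts (hypothetical value)

def line_loss(V):
    I = P / V          # current needed to deliver P at voltage V
    return I**2 * R    # resistive (Joule) loss in the line

loss_low = line_loss(10e3)     # transmit at 10 kV
loss_high = line_loss(100e3)   # transmit at 100 kV (10x the voltage)

print(loss_low / loss_high)    # 10x voltage -> 100x smaller loss
```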

But if we consider a simple circuit like this (take Vs as the power company, Zs as the conductor, and ZL as the user's load at home):

the fraction of power lost in Zs should always be Zs/(Zs+ZL), regardless of the current through Zs and ZL.
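This voltage-divider intuition can also be checked numerically: for a fixed source Vs driving fixed Zs and ZL in series, the fraction of total power dissipated in Zs is Zs/(Zs+ZL), whatever Vs (and hence the current) happens to be. The impedance values below are hypothetical, purely resistive, and for illustration only:

```python
# Series circuit: Vs drives Zs (line) and ZL (load) in series.
# The fraction of total power dissipated in Zs is Zs / (Zs + ZL),
# independent of Vs. Values are hypothetical.
Zs = 2.0   # line impedance in ohms (purely resistive here)
ZL = 8.0   # load impedance in ohms

def loss_fraction(Vs):
    I = Vs / (Zs + ZL)          # series current
    P_line = I**2 * Zs          # power lost in the line
    P_total = I**2 * (Zs + ZL)  # total power drawn from the source
    return P_line / P_total

print(loss_fraction(100.0))     # 0.2, i.e. Zs / (Zs + ZL)
print(loss_fraction(1000.0))    # 0.2 again -- same fraction at any Vs
```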

So what's wrong with it ?

Thanks...