Transmitting electricity at high voltage reduces the resistive losses?

Thread Starter

shengwuei

Joined Aug 22, 2008
25
Hi,

I'm talking about electrical power transmission; here is a reference from Wikipedia: http://en.wikipedia.org/wiki/Electric_power_transmission

Losses
Transmitting electricity at high voltage reduces the fraction of energy lost to Joule heating. For a given amount of power, a higher voltage reduces the current and thus the resistive losses in the conductor. For example, raising the voltage by a factor of 10 reduces the current by a corresponding factor of 10 and therefore the losses by a factor of 100, provided the same sized conductors are used in both cases.




But if we consider a simple circuit like this (assume Vs is the power company, Zs is the conductor, and ZL is the user at home):

the fraction of power lost in Zs should always be Zs/(Zs+ZL), regardless of the current through Zs or ZL.

So what's wrong with this reasoning?
Thanks...
 

SgtWookie

Joined Jul 17, 2007
22,210
You're missing the substations and local transformers.

Here in the States, long-distance high-voltage lines run at many thousands of volts (I don't even know exactly how high). The power is routed to local substations, where transformers lower the voltage while increasing the current. The local distribution lines might be 11 kV or 22 kV. Right at the residential customer is another transformer, where the 11 kV or 22 kV is stepped down again to center-tapped 240 VAC at high current.

This minimizes the loss over long distances on the high-strung, high-tension cables, lets the medium-voltage lines stay efficient while remaining safe, and the transformers at the residential customer's site take care of the rest.

If there were a more efficient way of doing it, they'd be doing it.
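As a rough numerical sketch of why stepping the voltage up helps (the power level, line resistance, and voltage levels below are made-up illustrative numbers, not real grid figures):

```python
# Same delivered power over the same line at two hypothetical line voltages.
# All numbers are illustrative only, not real grid data.
P = 1_000_000.0   # power to deliver, watts
R_line = 5.0      # resistance of the long-distance line, ohms

for v in (11_000.0, 110_000.0):      # hypothetical transmission voltages
    i = P / v                        # current needed at this voltage
    loss = i ** 2 * R_line           # Joule heating in the line
    print(f"{v / 1000:>5.0f} kV: I = {i:8.2f} A, "
          f"line loss = {loss:9.1f} W ({100 * loss / P:.3f}% of P)")
```

Raising the voltage by a factor of 10 cuts the current by 10 and the I²R loss by 100, matching the Wikipedia passage quoted in the first post.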
 

thingmaker3

Joined May 16, 2005
5,084
Higher voltage does not reduce the current into a fixed load. Wikipedia is often wrong; anyone can post anything on Wikipedia, after all. The actual current will equal the line voltage divided by (line resistance + load resistance).

The power transmitted equals voltage times current. The line loss equals the square of the voltage dropped across the line divided by the line resistance, and also equals the square of the current times the line resistance. For any given power, line loss will be less at higher voltage.
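A quick numerical sketch of that last sentence (illustrative numbers; at a fixed delivered power P the line current is roughly I = P/V, neglecting the line drop):

```python
P = 10_000.0   # power drawn by the load, watts (illustrative)
R = 0.5        # line resistance, ohms (illustrative)

for V in (1_000.0, 10_000.0, 100_000.0):   # supply voltage
    I = P / V            # current for the same delivered power (approximation)
    loss = I ** 2 * R    # Joule loss in the line
    print(f"V = {V:9.0f} V -> I = {I:6.2f} A, line loss = {loss:9.4f} W")
```

Each 10x increase in V gives a 100x drop in the line loss.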
 

Thread Starter

shengwuei

Joined Aug 22, 2008
25
Hi all,

Thanks for the replies, but I'm still confused by your explanations.

To SgtWookie / Bill,
I know electrical power runs in a high-voltage/low-current state during long-distance transmission by means of transformers. According to your explanation, it seems it is not correct to simplify the circuit model as in my original post to discuss this question. So what is the correct way to understand it?

To thingmaker3,
I can understand all of your words except the last sentence, "For any given power, line loss will be less at higher voltage". Is there a simple but clear way to prove this?

Thanks again.:)
 

Wendy

Joined Mar 24, 2008
22,155
Thingmaker is correct when he talks in terms of power. If I have a 2 V supply going through a wire that is one ohm to a one-ohm load, I'll have 1 A; the load will receive 1 W, and the wiring will dissipate 1 W.

If I have 200 V feeding a 40 kΩ load, the load will still dissipate one watt, the current will be 0.005 A, and the wire will dissipate only 25 μW, a big difference. By upping the voltage I was able to deliver essentially all the wattage to the load instead of splitting it between the wire and the load.

This is the whole reason we use AC electricity, because transformers make this possible.
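Wendy's two cases can be reproduced with a few lines of Python, treating each as a simple series circuit:

```python
def split_power(v_source, r_wire, r_load):
    """Series circuit: return current, wire loss, and load power."""
    i = v_source / (r_wire + r_load)
    return i, i ** 2 * r_wire, i ** 2 * r_load

i1, wire1, load1 = split_power(2.0, 1.0, 1.0)          # 2 V, 1 ohm wire, 1 ohm load
i2, wire2, load2 = split_power(200.0, 1.0, 40_000.0)   # 200 V, 1 ohm wire, 40 kohm load
print(f"case 1: I = {i1} A, wire = {wire1} W, load = {load1} W")
print(f"case 2: I = {i2:.5f} A, wire = {wire2 * 1e6:.1f} uW, load = {load2:.5f} W")
```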
 

studiot

Joined Nov 9, 2007
4,998
You are making things far too complicated.

First of all, Sheng, your model circuit is inappropriate.

Zs is not connected directly to ZL, but through a transformer, so your impedance equations should include multiplication by the square of the turns ratio. Look up reflected impedance for transformers. I haven't done this here because in practice the stepping up and down is done in stages, so there are several intervening transformers and ratios to consider.

As an alternative, look at things this way:

P\(_{loss}\) = I\(^{2}\)R .......... (1)

P\(_{loss}\) = \(\frac{V^{2}}{R}\) .......... (2)

where R is the resistance of the transmission system (not the load) and, in equation 2, V is the voltage dropped across that resistance (not the supply voltage).

So, for a fixed transmission resistance:

Equation 1 tells us that the power lost grows with the square of the current flowing through the line.

Equation 2 expresses the same loss in terms of the voltage dropped across the line itself, not the generator voltage.

For a given power delivered, stepping the transmission voltage up by some factor steps the line current down by the same factor, so by equation 1 the loss is divided by the square of that factor.

So the more we increase the current the more we multiply the power loss, and the more we decrease the current (and hence increase the transmission voltage) the more we divide the power loss.

I have tried to show this by logical reasoning; it is of course even more straightforward to use partial differentiation on the power equations to show it mathematically.
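In place of the partial differentiation studiot mentions, a finite-difference check does the same job (illustrative numbers; the loss at fixed delivered power P is modelled as (P/V)²·R, neglecting the line drop):

```python
P, R = 50_000.0, 1.0   # illustrative fixed delivered power and line resistance

def line_loss(v):
    # loss at fixed delivered power: I = P/V (approximation ignoring line drop)
    return (P / v) ** 2 * R

# numerical slope d(loss)/dV -- negative at every voltage, so raising V lowers loss
for v in (1_000.0, 10_000.0, 100_000.0):
    h = v * 1e-6
    slope = (line_loss(v + h) - line_loss(v - h)) / (2 * h)
    print(f"V = {v:9.0f} V: loss = {line_loss(v):11.4f} W, d(loss)/dV = {slope:.3g}")
```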
 

Thread Starter

shengwuei

Joined Aug 22, 2008
25
Hi Studiot,

this statement is still ambiguous to me: "So the more we increase the current the more we multiply the power loss, the more we decrease the current and hence increase the voltage the more we divide the power loss."

From the point of view of equation 2, the more we increase the voltage, the higher the power loss we get; the result is proportional to the square of the voltage, given a fixed R. I cannot understand the meaning of "the more we divide the power loss".

I hope I am not being hypercritical, I just want to make it clear ... many thanks.
 

Ratch

Joined Mar 20, 2007
1,070
shengwuei,

I hope I am not being hypercritical, I just want to make it clear ... many thanks
Well, let's take an example. Suppose we have a generator that outputs 11 volts, a transmission line of 1 ohm, and a load of 10 ohms. The current will be 1 amp. The power dissipated by the transmission line is (I²R) = 1 watt, and the power dissipated by the load will be 10 watts, for a 1:10 power ratio. The generator will output 11 watts.

Next, double the generator voltage to 22 volts and increase the load to 43 ohms. Then the current will be 1/2 amp. The power dissipated by the line is (I²R) = 1/4 watt, and the power in the load will be 10.75 watts, for a 1:43 power ratio. So the higher you make the voltage of the generator, the less power is dissipated in the transmission line in comparison with the load. The generator will still output 11 watts.
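Ratch's two cases can be reproduced in a few lines of Python (same 1-ohm line; the 10-ohm and 43-ohm loads are his chosen values):

```python
def circuit(v_gen, r_line, r_load):
    """Series generator/line/load: current and the three power figures."""
    i = v_gen / (r_line + r_load)
    return i, i ** 2 * r_line, i ** 2 * r_load, v_gen * i

for v_gen, r_load in ((11.0, 10.0), (22.0, 43.0)):
    i, p_line, p_load, p_gen = circuit(v_gen, 1.0, r_load)
    print(f"{v_gen:4.0f} V, {r_load:4.0f} ohm load: I = {i} A, "
          f"line = {p_line} W, load = {p_load} W, generator = {p_gen} W")
```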


Ratch
 

Thread Starter

shengwuei

Joined Aug 22, 2008
25
Hi Ratch,

I got your point.
For a given power (P) and line/load resistances (Rline and RL), the fraction of power lost on the transmission line is:

\(\frac{Rline}{Rline+RL}\)

It is straightforward to see that as RL increases, this fraction shrinks, so the power lost on the transmission line is also reduced. And this situation leads to a low-current / high-voltage state, as the following equations show:

I = \(\sqrt{\frac{P}{Rline+RL}}\)

V = \(\sqrt{P\left(Rline+RL\right)}\)

But the presumption above is that P is a fixed value and RL can be adjusted, which is strange to me. In the real situation, I think P is NOT fixed and will depend on how much load is connected at the end of the transmission line. (Correct me if I'm wrong.)
If we take the simplest case for discussion, with only one user at the end of the transmission line so the load resistance RL is a fixed number (just like my first post), is your explanation still adequate?
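The algebra above can be checked numerically; plugging in Ratch's illustrative values (P = 11 W total, Rline = 1 Ω, RL = 10 Ω):

```python
import math

P, r_line, r_load = 11.0, 1.0, 10.0   # total power and resistances (from Ratch's example)

frac = r_line / (r_line + r_load)      # fraction of P lost in the line
i = math.sqrt(P / (r_line + r_load))   # current carrying total power P
v = math.sqrt(P * (r_line + r_load))   # source voltage for total power P
print(f"loss fraction = {frac:.4f}, I = {i} A, V = {v} V, "
      f"line loss = {i ** 2 * r_line} W (= frac * P = {frac * P:.4f} W)")
```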

I hope I am not being hypercritical, and thank you all ... :)
 

Ratch

Joined Mar 20, 2007
1,070
shengwuei,

But the presumption above is that P is a fixed value and RL can be adjusted, which is strange to me. In the real situation, I think P is NOT fixed and will depend on how much load is connected at the end of the transmission line. (Correct me if I'm wrong.)
If we take the simplest case for discussion, with only one user at the end of the transmission line so the load resistance RL is a fixed number (just like my first post), is your explanation still adequate?
Well, a generator has a fixed upper limit on the amount of power it can produce, whether it is driven by steam or water. The utilities try to keep the voltage constant by keeping the frequency constant. If less power is needed, that means less current and lower line losses. The effective loads can be managed with transformers and who knows what else. A power engineer could tell you more about how they do it.

Ratch
 

Wendy

Joined Mar 24, 2008
22,155
The problem is, the load resistance is not fixed. But even so, transformers convert voltages so you can use high voltage over long distances, then drop the voltage down with virtually no loss to whatever the load needs.

Transformers are what make this all possible.
 

Metalfan1185

Joined Sep 12, 2008
161
Thingmaker is correct when he talks in terms of power. If I have a 2 V supply going through a wire that is one ohm to a one-ohm load, I'll have 1 A; the load will receive 1 W, and the wiring will dissipate 1 W.

If I have 200 V feeding a 40 kΩ load, the load will still dissipate one watt, the current will be 0.005 A, and the wire will dissipate only 25 μW, a big difference. By upping the voltage I was able to deliver essentially all the wattage to the load instead of splitting it between the wire and the load.

This is the whole reason we use AC electricity, because transformers make this possible.

and because if we used DC, just think of how big the conductors would have to be!
 