Why are transmission lines high voltage?

nsaspook

Joined Aug 27, 2009
13,315
A fascinating book on this history is "The Invention That Changed the World":
http://www.amazon.com/The-Invention-That-Changed-World/dp/0684835290
For a short period during WW2 it gave us an advantage over the German technology, but I think in the long run the klystron is what really enabled precision microwave technology: it led to the invention of the cavity magnetron for transmitters, while klystrons themselves were used in receivers.
http://en.wikipedia.org/wiki/Klystron

http://www.slac.stanford.edu/cgi-wrap/getdoc/slac-pub-7731.pdf
 

t_n_k

Joined Mar 6, 2009
5,455
Skin effect deals with current flowing on the skin of the conductor; it increases with frequency. This is irrelevant on power lines and even more so on the DC filament voltage of a magnetron
It's not irrelevant on power lines, as many transmission line cables are large enough that skin effect plays a significant role. It's one of the reasons that they use bundled conductors.
As @cmartinez indicated, the issue of skin effect in powerline conductors has been raised previously.
If the issue of skin effect at powerline frequency is irrelevant, then it is difficult to understand why designers of very high ampacity underground powerline cables go to the trouble of using the Milliken conductor bundling technique.
I think WBahn has a valid point.
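To put rough numbers on that: the standard skin-depth formula is δ = sqrt(ρ / (π·f·μ)), which at 50/60 Hz works out to around 8-12 mm for copper and aluminium. A quick Python sketch (material constants are typical handbook values, not data for any particular cable):

```python
import math

def skin_depth(resistivity, frequency, mu_r=1.0):
    """Skin depth in metres: delta = sqrt(rho / (pi * f * mu0 * mu_r))."""
    mu0 = 4 * math.pi * 1e-7  # permeability of free space, H/m
    return math.sqrt(resistivity / (math.pi * frequency * mu0 * mu_r))

# Typical resistivities at 20 C, in ohm-metres
for name, rho in [("copper", 1.68e-8), ("aluminium", 2.82e-8)]:
    for f in (50, 60):
        d = skin_depth(rho, f)
        print(f"{name:9s} @ {f} Hz: skin depth ~ {d * 1000:.1f} mm")
```

So once a solid conductor gets much beyond about two skin depths in diameter (roughly 20 mm), its core carries little current, which is exactly the regime where Milliken and bundled constructions pay off.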
 

Thread Starter

Vorador

Joined Oct 5, 2012
87
There's a somewhat related question I've always wondered about regarding transformers, and I feel like I should ask it here while I'm at it.

When the voltage is stepped down to 220/110V before finally arriving at our houses, the current is stepped up. Suppose the final current value is 20A, but what does this really mean? Does it mean that it gives out 20A of current all the time regardless of the load? I doubt that, because of Ohm's law! Is it then, I wonder, the maximum current that can be supplied by that transformer? If I want to run a load that requires 21A (I know that's a highly unrealistic value), what would happen then (assuming the wires don't melt)?
 

MaxHeadRoom

Joined Jul 18, 2013
28,702
A transformer only supplies the current demanded of it by the load: no load, no current. Drawing 21 amps from a 20 amp secondary is not necessarily going to overload the circuit, though the transformer may run at a higher temperature if not allowed adequate cooling.
But presumably there would be an O/L (overload) protection device of some kind.
Max.
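To put that in numbers, here's a toy sketch with an assumed 220 V secondary and made-up load resistances; the load sets the current via Ohm's law, and the 20 A figure is only the nameplate maximum the windings can carry continuously:

```python
SECONDARY_V = 220.0  # assumed secondary voltage
RATED_A = 20.0       # nameplate secondary current rating

# Hypothetical loads: open circuit, a 10 A heater, and a slight overload
for r_ohms in (float("inf"), 22.0, 10.5):
    i = 0.0 if r_ohms == float("inf") else SECONDARY_V / r_ohms  # Ohm's law
    if i == 0.0:
        status = "no load, no current"
    elif i <= RATED_A:
        status = "within rating"
    else:
        status = "over rating: extra heating, O/L device should trip"
    print(f"R = {r_ohms:>6} ohm -> I = {i:4.1f} A ({status})")
```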
 

cmartinez

Joined Jan 17, 2007
8,257
A transformer only supplies the current demanded of it by the load: no load, no current. Drawing 21 amps from a 20 amp secondary is not necessarily going to overload the circuit, though the transformer may run at a higher temperature if not allowed adequate cooling.
But presumably there would be an O/L (overload) protection device of some kind.
Max.
An ideal transformer would draw no current on its primary winding if its secondary is unloaded, right? But what's its behavior in the real world? Doesn't the primary winding draw at least 0.1% of its rated current even when the secondary is unloaded?
 

#12

Joined Nov 30, 2010
18,224
I think it varies with the size of the transformer. Once upon a time, I measured a transformer for a power supply rated at 28 volts, 4 amps. The idle current of the transformer, unconnected, was 50 mA on the primary. That's about 5% of the rated primary current the way I do math.

Bigger transformers probably waste a smaller percentage at idle.
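For what it's worth, the arithmetic as a short sketch, assuming a 120 V primary (the post doesn't say what the mains voltage was):

```python
# No-load (magnetizing) draw as a fraction of the transformer's rating.
# The 28 V / 4 A rating and 50 mA idle current are from the post above;
# the 120 V primary is an assumption.
rating_va = 28 * 4        # 112 VA
idle_va = 0.050 * 120     # 50 mA on the primary -> about 6 VA
print(f"idle draw ~ {idle_va / rating_va:.1%} of rating")  # ~5.4%
```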
 

Thread Starter

Vorador

Joined Oct 5, 2012
87
A transformer only supplies the current demanded of it by the load: no load, no current. Drawing 21 amps from a 20 amp secondary is not necessarily going to overload the circuit, though the transformer may run at a higher temperature if not allowed adequate cooling.
But presumably there would be an O/L (overload) protection device of some kind.
Max.
Thanks a lot, Max! :)
 

ian field

Joined Oct 27, 2012
6,536
I think it varies with the size of the transformer. Once upon a time, I measured a transformer for a power supply rated at 28 volts, 4 amps. The idle current of the transformer, unconnected, was 50 mA on the primary. That's about 5% of the rated primary current the way I do math.

Bigger transformers probably waste a smaller percentage at idle.
While searching for something else, I found a bunch of e-books on electricity transmission. Apparently the magnetising current for just the core of one of those big substation transformers, the first transformer off the 400kV cables, is enough to power quite a few houses.

It's all a matter of relative scale; you don't take 400kV lines anywhere you don't have to, as the transformer losses could be more than the load.

For a small town or village with a little light industry, they probably wouldn't cable in more than 132kV, and a decent sized farm complex probably only needs 11kV distribution.

At any rate, the transformer losses aren't negligible.
 

cmartinez

Joined Jan 17, 2007
8,257
While searching for something else, I found a bunch of e-books on electricity transmission. Apparently the magnetising current for just the core of one of those big substation transformers, the first transformer off the 400kV cables, is enough to power quite a few houses.

It's all a matter of relative scale; you don't take 400kV lines anywhere you don't have to, as the transformer losses could be more than the load.

For a small town or village with a little light industry, they probably wouldn't cable in more than 132kV, and a decent sized farm complex probably only needs 11kV distribution.

At any rate, the transformer losses aren't negligible.
Then my question stands... what power percentage would those big transformers waste? I'll try to google that and see what I find...
 

cmartinez

Joined Jan 17, 2007
8,257
Let me know if/when you find out; it's a sort of idle curiosity thing with me.
Here's what I found:

From the original questioner:
Thanks. Below is the e-mail I received from Jefferson Electric when I asked them about it:
"A transformer consumes power any time it is energized, even when there is no load on the secondary. The "No Load" losses on a 75 KVA transformer would be around 550 - 600 watts. You can turn the unit off when you are not using it, but you have to make sure the fuses or CB can handle the in-rush when you power the unit up."

Using that info, it looks like I could save 25-30 dollars each per month by shutting them off nights and weekends.

According to this site, the equivalent in watts of 75kVA is 60kW (assuming a 0.8 power factor). So if it's drawing 600 watts at no load, that means the transformer is wasting around 1% of its rated power when idle... that seems a bit high, to me at least.
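The quoted savings figure roughly checks out, by the way. A back-of-the-envelope sketch, assuming a rate of about $0.07/kWh (the actual tariff isn't given in the post):

```python
# Cost of 600 W of no-load loss if the transformer is never switched off.
no_load_w = 600            # from the Jefferson Electric e-mail above
hours_per_month = 730      # ~24 h/day * 365 days / 12 months
rate_usd_per_kwh = 0.07    # assumed utility rate

kwh = no_load_w / 1000 * hours_per_month  # ~438 kWh per month
print(f"~{kwh:.0f} kWh/month -> ~${kwh * rate_usd_per_kwh:.0f}/month")
```

Shutting it off nights and weekends would recover a large slice of that, which lines up with the 25-30 dollars per month quoted.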
 

nsaspook

Joined Aug 27, 2009
13,315
Here's what I found:
According to this site, the equivalent in watts of 75kVA is 60kW. So if we're using 600 Watts, that means that the transformer is wasting around 1% of its rated power when idle... that seems a bit high... for me, at least.
The primary source of losses in long-distance power transmission is resistance in the conductors, so 1% is on the low side relative to the overall system.
http://docs.lib.purdue.edu/cgi/viewcontent.cgi?article=1270&context=ecetr
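That resistive loss is also the answer to the thread title: for a fixed power delivered, P = V·I, so line current falls as 1/V and the I²R loss falls as 1/V². An illustrative sketch, using a made-up total line resistance rather than real line data:

```python
# Why transmit at high voltage: I = P/V, so conductor loss I^2 * R
# scales as 1/V^2 for the same delivered power.  Numbers are illustrative.
P = 100e6  # 100 MW to deliver
R = 2.0    # assumed total line resistance, ohms (single-phase simplification)

for v in (33e3, 132e3, 400e3):
    i = P / v          # line current, amps
    loss = i**2 * R    # watts dissipated in the conductors
    print(f"{v / 1e3:5.0f} kV: I = {i:6.1f} A, loss = {loss / 1e6:5.2f} MW ({loss / P:.1%} of P)")
```

Same power, same wire: raising the voltage from 33 kV to 400 kV cuts the conductor loss by more than two orders of magnitude.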
 

WBahn

Joined Mar 31, 2012
30,077
IIRC, something like 7% of all the power generated is lost in transmission via one form of loss or another. The utility companies therefore have a pretty big motivation to understand where the majority of these losses come from and how to minimize them overall. Not surprisingly, it involves tradeoffs, since reducing the loss from one type may aggravate the loss from other types. Plus, there are other cost factors that have to be taken into account, such as equipment capital costs, ongoing maintenance, and right-of-way purchases/leases. Each installation is looked at separately and is also looked at with an eye to the future. Their decision of what to run into a rural area that is expected to stay rural for the next few decades may be very different from what they decide to run into an almost identical area that is expected to grow and develop substantially, even if the start of that growth isn't expected for a decade or more. Utility companies think on long time scales (but, of course, also have to balance that against this quarter's stock reports).
 

cmartinez

Joined Jan 17, 2007
8,257
Also radiation losses, one reason for going DC.
Max.
So how does this DC system work? Something like this:
  1. Power is generated in AC
  2. Then it's converted to DC
  3. Travels a long distance through the lines
  4. Is converted back to AC
  5. It's finally distributed
Perhaps, as you say, it would be more efficient... but then again, wouldn't the price of the infrastructure needed increase as well?
 

cmartinez

Joined Jan 17, 2007
8,257
IIRC, something like 7% of all the power generated is lost in transmission via one form of loss or another. The utility companies therefore have a pretty big motivation to understand where the majority of these losses come from and how to minimize them overall. Not surprisingly, it involves tradeoffs, since reducing the loss from one type may aggravate the loss from other types. Plus, there are other cost factors that have to be taken into account, such as equipment capital costs, ongoing maintenance, and right-of-way purchases/leases. Each installation is looked at separately and is also looked at with an eye to the future. Their decision of what to run into a rural area that is expected to stay rural for the next few decades may be very different from what they decide to run into an almost identical area that is expected to grow and develop substantially, even if the start of that growth isn't expected for a decade or more. Utility companies think on long time scales (but, of course, also have to balance that against this quarter's stock reports).
In one word: foresight
 

WBahn

Joined Mar 31, 2012
30,077
So how does this DC system work? Something like this:
  1. Power is generated in AC
  2. Then it's converted to DC
  3. Travels a long distance through the lines
  4. Is converted back to AC
  5. It's finally distributed
Perhaps, as you say, it would be more efficient... but then again, wouldn't the price of the infrastructure needed increase as well?
The infrastructure price may or may not increase. In many areas the major costs of running the line are the towers, the cable, and the right-of-way. Using a DC line reduces all of these, and sometimes the right-of-way costs alone are a dominant factor. If these savings overshadow the higher capital costs at the ends of the line by enough of a margin, then it might make sense to go that route even if the ongoing efficiencies might argue the other way. It's not an easy set of decisions, as you have all kinds of things to consider, including regulatory issues.
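One way to picture that tradeoff is the classic break-even-distance calculation: the converter stations are a big fixed cost at the two ends, while the DC line itself is cheaper per kilometre. A toy model with entirely made-up costs (real utility figures vary widely):

```python
# AC vs HVDC: converters add a large fixed cost, the DC line saves per km.
# All cost figures below are invented for illustration only.
conv_fixed = 400e6   # two converter stations, USD (assumed)
ac_per_km = 1.6e6    # AC line + towers + right-of-way, USD/km (assumed)
dc_per_km = 1.0e6    # DC: fewer conductors, narrower corridor (assumed)

breakeven_km = conv_fixed / (ac_per_km - dc_per_km)
print(f"break-even around {breakeven_km:.0f} km")

for km in (200, 500, 1000, 2000):
    ac = ac_per_km * km
    dc = conv_fixed + dc_per_km * km
    print(f"{km:5d} km: AC ${ac / 1e6:5.0f}M vs DC ${dc / 1e6:5.0f}M -> {'DC' if dc < ac else 'AC'} cheaper")
```

With these invented numbers the lines cross around 670 km, the same order as the figures usually quoted for overhead HVDC; submarine cables break even far sooner.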
 