Does the output voltage of a transformer affect the power drawn by the load?

Thread Starter

babaliaris

Joined Nov 19, 2019
160
Even though I have already seen the theory behind transformers, at the time I didn't really understand what was going on and probably missed this explanation.

When you connect a load across an active independent voltage source (if I'm correct), increasing the source voltage increases the power drawn by the load as well (since the current increases).

Now, if you connect a load to the output of a transformer (which is a passive component), I've been told that the more you step up the voltage, the more the current will automatically go down, so S = VI (all in magnitudes) remains intact. I was told this in the power transmission lines course.

Question 1: If this is true, then for a known fixed load, if I connect it to a 400 V, 200 V, or 500 V transformer output, will it draw exactly the same S?
Or maybe it will draw the same P but a different Q?

Question 2: If this is true, how does the input voltage (transformer's output voltage) affect the load?

A: "The power drawn is only dependent on the load". This is what I've been told...

B: But I believe that "The power drawn depends on the load and its input voltage".
 

Ian0

Joined Aug 7, 2020
9,668
Now, if you connect a load to the output of a transformer (which is a passive component), I've been told that the more you step up the voltage, the more the current will automatically go down, so S = VI (all in magnitudes) remains intact.
That applies to all the various different designs of transformer of a given size. If you increase the number of secondary turns, then the secondary voltage will increase, but, for a given frame size, the amount of power available before it overheats remains constant, so the amount of available current reduces.

However, if you take an already-existing transformer and increase the primary voltage, the secondary voltage will also increase, as will the current taken by the load (assuming that the load is passive), so the power increases.
The secondary voltage appears as a voltage source of Vp/n (where Vp is the primary voltage and n is the turns ratio) in series with some parasitic resistance and inductance.
If you increase the voltage too far, the core saturates, but that's another story.
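As a rough numeric sketch of that series model (the turns ratio, parasitic values, load resistance, and frequency below are made-up illustrative numbers, not taken from any particular transformer):

```python
# Sketch of the model above: the secondary behaves like a source Vp/n in
# series with some parasitic resistance and leakage inductance.
# All component values are made up for illustration.
import math

def load_current_and_power(Vp, n=2.0, R_load=10.0, R_par=0.5, L_leak=1e-3, f=50.0):
    """Current magnitude and real power in the load for a primary voltage Vp."""
    Vs = Vp / n                                             # ideal secondary EMF
    Z = complex(R_par, 2 * math.pi * f * L_leak) + R_load   # series impedance
    I = Vs / abs(Z)                                         # load current magnitude
    return I, I**2 * R_load                                 # power dissipated in the load

for Vp in (230.0, 400.0, 500.0):
    I, P = load_current_and_power(Vp)
    print(f"Vp = {Vp:5.0f} V -> I = {I:5.2f} A, P = {P:7.1f} W")
```

Raising the primary voltage raises the load current and the load power together, which is the point made above.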
 

Thread Starter

babaliaris

Joined Nov 19, 2019
160
That applies to all the various different designs of transformer of a given size. If you increase the number of secondary turns, then the secondary voltage will increase, but, for a given frame size, the amount of power available before it overheats remains constant, so the amount of available current reduces.

However, if you take an already-existing transformer and increase the primary voltage, the secondary voltage will also increase, as will the current taken by the load (assuming that the load is passive), so the power increases.
The secondary voltage appears as a voltage source of Vp/n (where Vp is the primary voltage and n is the turns ratio) in series with some parasitic resistance and inductance.
If you increase the voltage too far, the core saturates, but that's another story.
Now that makes sense!

So, in other words, if the primary voltage does not change (and we only change the number of secondary turns), the output voltage of the transformer will indeed increase, but the load will keep drawing the same power because the current is limited, right?
 

crutschow

Joined Mar 14, 2008
34,283
I've been told that the more you step up the voltage, the more the current will automatically go down
For a transformer of a given rating, it's as Ian0 stated.
But for the large transformer providing your house voltage, the current is not limited (until the breaker opens, of course), so the current to a fixed resistive load will go up.
 

Ian0

Joined Aug 7, 2020
9,668
Also, it's not that easy to adjust the turns ratio on the transformer supplying your house. If you try it, wear insulating gloves, and if it's on a pole, don't fall off the ladder.
But seriously, if you could, you would be able to adjust your 230V 80A supply. Adding a few more turns would get you 240V, but you would be limited to 77A, and taking a few turns off would give you 220V, but you could get 84A before you overheated the transformer or blew the primary fuse and had to call out the electricity company to replace it.
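Just to sanity-check that arithmetic (treating the supply as a fixed capacity of roughly 230 V × 80 A ≈ 18.4 kVA):

```python
# The transformer's kVA capacity stays fixed, so the available current
# scales inversely with the chosen secondary voltage.
S = 230 * 80   # about 18.4 kVA of capacity

for V in (220, 230, 240):
    print(f"{V} V -> about {S / V:.0f} A available before overheating")
```

This prints roughly 84 A at 220 V, 80 A at 230 V, and 77 A at 240 V, matching the numbers above.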
 

Thread Starter

babaliaris

Joined Nov 19, 2019
160
I'm thinking about this in another way.

If you increase the secondary turns, yes you increase the voltage but also the inductance right?

So Z_secondary = ωL increases. This is why the current is restricted right?
 

Ian0

Joined Aug 7, 2020
9,668
I'm thinking about this in another way.

If you increase the secondary turns, yes you increase the voltage but also the inductance right?

So Z_secondary = ωL increases. This is why the current is restricted right?
There is no electrical restriction to the current. If you increase the secondary turns, the voltage increases. If you keep the same load, the power increases. If the power increases, the transformer overheats and fails.
The reduction in current is only in the specification.
 

MaxHeadRoom

Joined Jul 18, 2013
28,618
I'm thinking about this in another way.

If you increase the secondary turns, yes you increase the voltage but also the inductance right?

So Z_secondary = ωL increases. This is why the current is restricted right?
Not totally. A transformer is constructed with a certain (k)VA power rating in mind. If you increase the secondary turns, the voltage increases in proportion to the turns increase, but the VA value remains the same.
IOW, the current rating must be decreased in keeping with that (k)VA value.
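A small sketch of that point, with made-up numbers: as the secondary voltage rises, the rated current (fixed VA divided by voltage) falls, while the current a fixed resistive load actually draws rises.

```python
# Rated current (fixed VA / voltage) versus the current a fixed resistive
# load actually draws (Ohm's law). Numbers are illustrative only.
S_rated = 1000.0   # transformer built as a 1 kVA unit
R_load = 50.0      # fixed resistive load

for Vs in (100.0, 200.0, 400.0):
    I_rated = S_rated / Vs    # what the nameplate allows
    I_load = Vs / R_load      # what the load actually takes
    status = "OK" if I_load <= I_rated else "OVERLOAD"
    print(f"Vs = {Vs:5.0f} V: rated {I_rated:5.1f} A, load draws {I_load:5.1f} A -> {status}")
```

Nothing electrically stops the load current; the rating only tells you when the transformer will be overloaded.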
 

Thread Starter

babaliaris

Joined Nov 19, 2019
160
Half step down:
1.JPG

Then I increased the L2 secondary and everything increased (the voltage across the secondary, the current drawn by the load, and of course the power delivered to the load):
2.JPG

I used a high L1 because LTspice didn't allow me to simulate without adding Rin. Rin was causing a huge voltage drop no matter its value, so I used a big L1 to restrict the current and thus the voltage drop across Rin.

So I still don't understand what my professor said, that we use transformers to increase the voltage and minimize the current. In my simulation, everything is increasing.
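For comparison, here is a rough recreation of that experiment with an ideal transformer in Python (the primary voltage, turns ratios, and load are placeholders, not the values from the attached schematics):

```python
# Ideal-transformer version of the experiment: primary voltage fixed,
# secondary-to-primary turns ratio increased, load unchanged.
# Values are placeholders, not those from the LTspice schematics.
V_primary = 230.0
R_load = 100.0

for ratio in (0.5, 1.0, 2.0):       # Ns/Np: step-down, 1:1, step-up
    Vs = V_primary * ratio          # secondary voltage
    I = Vs / R_load                 # load current
    P = Vs * I                      # power delivered to the load
    print(f"Ns/Np = {ratio:3.1f} -> Vs = {Vs:6.1f} V, I = {I:5.2f} A, P = {P:7.1f} W")
```

Voltage, current, and power all rise together, which is consistent with what the simulation shows.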
 


MrChips

Joined Oct 2, 2009
30,711
Now that makes sense!

So, in other words, if the primary voltage does not change (and we only change the number of secondary turns), the output voltage of the transformer will indeed increase, but the load will keep drawing the same power because the current is limited, right?
No. You are confusing two different things:
1) The power drawn by the load and
2) the maximum power delivered by the transformer.

Let us discuss item (2) first.
The transformer is designed to deliver a maximum power. The maximum power depends on the physical design of the transformer, primarily, the DC resistance of the windings and the total magnetic flux that the ferrous core can handle.

The load does not have to draw the maximum power. It can draw power lower than the maximum power rating of the transformer. If the transformer is 100% efficient, the input power equals the output power. This is not the maximum power.

If the load tries to draw more than the maximum power, the output voltage will drop and the transformer will overheat.
In general, the output voltage will always be lower than the no-load voltage. This is because of the DC resistance of the windings.

Now for (1), the load only takes the power it requires.
Ohm's Law tells you how much current the load needs:
I = V / R

From this you can calculate the power delivered to the load:

P = I x V
P = I x I x R
P = V x V / R

Again, the output power is equal to the input power for a 100% efficient transformer.
This is not the maximum power that the transformer is designed to deliver.
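A minimal sketch of points (1) and (2) together, assuming an arbitrary rating and load resistance:

```python
# The load decides how much power it draws (Ohm's law); the transformer's
# rating is only the ceiling it can supply. Values are illustrative.
P_max = 500.0    # transformer's maximum power rating, in watts
R_load = 200.0   # fixed resistive load

for V_out in (50.0, 100.0, 230.0, 400.0):
    I = V_out / R_load             # I = V / R
    P = V_out * V_out / R_load     # P = V x V / R
    note = "within rating" if P <= P_max else "exceeds rating (voltage sags, transformer overheats)"
    print(f"V = {V_out:5.0f} V -> I = {I:4.2f} A, P = {P:6.1f} W, {note}")
```

The drawn power comes from the load and the applied voltage alone; the rating only decides whether the transformer can survive it.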
 

MrChips

Joined Oct 2, 2009
30,711
Now let's work backwards.

If the output voltage is V and the load resistance is R, the current is

I = V / R

If you increase V, the current I increases.
The power drawn is increased.

P = V x V / R
P = I x I x R

For a 100% efficient transformer, the input power is the same as the output power.

Hence, if you increase the turns ratio, i.e. increase the number of turns in the secondary winding, the voltage, current and power will all increase.
 

Thread Starter

babaliaris

Joined Nov 19, 2019
160
So in the end, how do transformers in three-phase power lines reduce the current in the lines and make it possible to transfer the power over long distances?

This is what I'm trying to understand.

Is it the transmission circuit configuration that makes it work like that?
For example, you have a step-up transformer, then the lines, and then a step-down transformer. Maybe this specific circuit is what makes it possible.
 

MrChips

Joined Oct 2, 2009
30,711
So in the end, how do transformers in three-phase power lines reduce the current in the lines and make it possible to transfer the power over long distances?

This is what I'm trying to understand.

Is it the transmission circuit configuration that makes it work like that?
For example, you have a step-up transformer, then the lines, and then a step-down transformer. Maybe this specific circuit is what makes it possible.
What makes you think that transformers reduce the current?
Power transformers are designed to increase or decrease the output voltage.
The current is a different matter.

Read over what I wrote.
 

crutschow

Joined Mar 14, 2008
34,283
So in the end, how do transformers in three-phase power lines reduce the current in the lines and make it possible to transfer the power over long distances?
The current over long-distance power lines is reduced as the voltage is increased for a given power transfer, since P = V * I.
This is to minimize the I²R losses in the line wire resistance.
Thus transformers at the generator step up the voltage (to several hundred kV) to reduce the long line current, then transformers at the receiving-end substation reduce it for transfer over local lines (around 12 kV), and finally another transformer reduces it to the mains voltage in your house.
The power transferred is always determined by the loads at the receiving end.
There is never any intentional limiting of the current in the chain of transformers when going from the generator to your house.
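A quick sketch of that trade-off, with made-up numbers for the delivered power and line resistance (assuming unity power factor and neglecting the loss itself when computing the line current):

```python
# Same power delivered over the same line at two transmission voltages:
# the line current and the I^2*R loss both collapse at the higher voltage.
P_delivered = 10e6   # 10 MW at the receiving end (illustrative)
R_line = 5.0         # total line resistance in ohms (illustrative)

for V_line in (20e3, 400e3):
    I = P_delivered / V_line    # line current (unity power factor assumed)
    P_loss = I**2 * R_line      # I^2 * R loss in the wires
    print(f"{V_line/1e3:5.0f} kV -> I = {I:7.1f} A, line loss = {P_loss/1e3:7.1f} kW")
```

Stepping up from 20 kV to 400 kV cuts the current by a factor of 20 and the resistive line loss by a factor of 400, for the same power delivered.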
 

Thread Starter

babaliaris

Joined Nov 19, 2019
160
What makes you think that transformers reduce the current?
Power transformers are designed to increase or decrease the output voltage.
The current is a different matter.

Read over what I wrote.
Sorry, I misled you because my phrase was not written correctly, giving you the wrong impression.

Anyway, the idea is that the transformer is manufactured for a specific S (for example 280 MVA), so when I increase the voltage, the current needs to be adjusted to maintain that S.

By the way, I did the following in LTspice, and I indeed saw a current reduction in the line:
Before increasing the voltage dramatically:
1.JPG

After increasing the voltage from 20k to 400k:
2.JPG

But I suppose LTspice is not the right tool for power systems (so I've heard). These ideal transformers can probably deliver unlimited power.
 


MrChips

Joined Oct 2, 2009
30,711
Hopefully you can gain an understanding from the posts provided.

The transformer does not reduce or limit the current.

It is the act of delivering the same power at a higher voltage that makes the current smaller.
It is in the mathematics of power delivered.

P = I x V

For the same power delivered, I x V is the same.
Increase V and I decreases because of mathematics and not because of the transformer. The transformer just facilitates the mathematics.

Again, ignore the maximum power rating of the transformer. This has nothing to do with why the current is lower at higher voltage.
Thus, a rating of 280MVA or any other rating has nothing to do with the current or power being delivered.
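A last tiny sketch of that closing point: for a fixed power actually being drawn by the loads, I = P / V, so raising the voltage lowers the current regardless of any nameplate rating (the power figure below is an arbitrary example, not a rating):

```python
# For a fixed power actually drawn by the loads, I = P / V:
# raise the voltage and the current falls, whatever the transformer's rating.
P = 10e6   # power the loads happen to be drawing (an example, not a rating)

for V in (20e3, 150e3, 400e3):
    print(f"V = {V/1e3:5.0f} kV -> I = {P / V:7.1f} A")
```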
 