What happens to a transformer in case the VA input exceeds the VA rating?

Thread Starter

Samantha Groves

Joined Nov 25, 2023
161
Suppose you have a transformer with a VA rating of 1 kVA and a nominal voltage of 120 V. You connect the transformer to a photovoltaic panel which outputs the nominal voltage of your transformer (120 V) but can supply 15 A at cos φ = 0.9 (of course there is an inverter to convert the DC power of the photovoltaic panel into AC power). Now the transformer is fed with its nominal voltage, but the apparent power available from the photovoltaic panel is 1800 VA (1.8 kVA), larger than the VA rating of your transformer. What will happen then?

I tried searching on Reddit, but there are so many different answers and no real substance you can extract from them.
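For reference, here is how the numbers in the question work out (a quick sketch; the 15 A and cos φ = 0.9 are the figures stated above, and this is what the source *can* deliver, not what actually flows):

```python
# Quick check of the numbers in the question.
V = 120.0            # nominal transformer voltage, volts
I_available = 15.0   # maximum current the PV/inverter can supply, amps
cos_phi = 0.9        # power factor stated in the question

S_available = V * I_available        # apparent power the source CAN deliver
P_available = S_available * cos_phi  # real power at cos phi = 0.9

S_rating = 1000.0                    # transformer rating, VA

print(f"Source capability: {S_available:.0f} VA ({P_available:.0f} W real)")
print(f"Transformer rating: {S_rating:.0f} VA")
print(f"Source capability exceeds rating by: {S_available - S_rating:.0f} VA")
```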
 

BobTPH

Joined Jun 5, 2013
11,487
If you connect a transformer directly to the output of a solar panel (DC), you have shorted the panel and there is no output on the secondary.
 

sagor

Joined Mar 10, 2019
1,049
The answer depends on how much power you are pulling through the transformer. You can have a source capable of higher power, but as long as you draw 1 kVA or less through the transformer, nothing bad should happen. If you draw over 1 kVA, the transformer will overheat and eventually be damaged. How much damage depends on how much, and for how long, you overdraw the power.
Your power consumption is based on what your load is drawing, not on what the source is "capable" of.
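A minimal sketch of that point: the VA actually flowing through the transformer is set by the load current, not by the source capability. The load currents below are made up for illustration, assuming a 1:1 transformer with a 120 V secondary:

```python
# The source can deliver up to 1800 VA, but what flows through the
# transformer is set entirely by the load. Example load currents are
# hypothetical, at a 120 V secondary (1:1 transformer assumed).
V = 120.0
S_rating = 1000.0

for i_load in (2.0, 8.0, 12.0):   # amps drawn by the load (illustrative)
    s_drawn = V * i_load
    status = "OK" if s_drawn <= S_rating else "OVERLOAD -> heating"
    print(f"load {i_load:4.1f} A -> {s_drawn:6.0f} VA through transformer: {status}")
```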
 

panic mode

Joined Oct 10, 2011
4,937
The DC/AC inverter will produce AC output. If the voltage matches the transformer's rated voltage, the first requirement is satisfied. The second part is the current: if the load draws more power than the transformer can handle, the transformer will be overloaded. This may be tolerable momentarily, but not continuously; if it is continuous, the transformer will overheat, melt, and burn.
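A rough way to see why a momentary overload is tolerable but a continuous one is not: transformer heating behaves approximately like a first-order thermal system, with copper losses scaling as current squared. The time constant and temperature figures below are invented for illustration, not data for any real part:

```python
import math

# First-order thermal model of a transformer under overload.
# All numbers below are illustrative assumptions.
tau = 30.0 * 60      # thermal time constant, seconds (assume 30 min)
rise_rated = 50.0    # steady-state temperature rise at rated load, deg C (assumed)
rise_limit = 65.0    # allowable rise before insulation damage, deg C (assumed)
overload = 1.8       # 1.8 kVA on a 1 kVA transformer

# Copper losses scale with current squared, so the steady-state rise
# at 1.8x load is ~1.8^2 = 3.24x the rated rise -- far above the limit.
rise_final = rise_rated * overload**2

# Time until the rise crosses the limit: rise(t) = rise_final*(1 - e^(-t/tau))
t_limit = -tau * math.log(1 - rise_limit / rise_final)
print(f"steady-state rise at 1.8x load: {rise_final:.0f} C (limit {rise_limit:.0f} C)")
print(f"limit reached after roughly {t_limit / 60:.0f} minutes")
```

So a brief burst at 1.8x is survivable, but left connected, the temperature keeps climbing well past the insulation limit.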
 

MisterBill2

Joined Jan 23, 2018
27,319
If you connect a 1 kVA transformer to that same AC voltage output of a solar power system and there is no load on the transformer secondary, the power into the transformer will be much less than rated, as will the primary current. The only way to cause the VA input to exceed the rated value is to draw load current in excess of the rated load capacity. What will happen in that case is that the transformer temperature will rise above the specified maximum.
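To put a number on the no-load case: with nothing on the secondary, the primary draws only the magnetizing current, typically a few percent of rated current for a small transformer. The 3% figure below is an assumed ballpark, not a datasheet value:

```python
# No-load case: primary draws only magnetizing current.
# The 3% figure is a typical ballpark for small transformers, assumed here.
V = 120.0
S_rating = 1000.0
I_rated = S_rating / V            # ~8.3 A rated primary current
I_magnetizing = 0.03 * I_rated    # assumed 3% of rated current

print(f"rated primary current: {I_rated:.2f} A")
print(f"no-load primary current: {I_magnetizing:.2f} A "
      f"({V * I_magnetizing:.0f} VA drawn from the 1800 VA-capable source)")
```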
 

MrAl

Joined Jun 17, 2014
13,684
Samantha Groves said:
Suppose you have a transformer with a VA rating of 1 kVA and a nominal voltage of 120 V... What will happen then?
Hello,

Short answer is: it overheats and can melt the insulation layers, ruining the transformer for good.
 

WBahn

Joined Mar 31, 2012
32,760
It does not matter how big the power of the photovoltaic panel is.
While the transformer is fed its nominal voltage (120 VAC), all is good,
until you try to use it to power something that the panel output is capable of supplying but is beyond the transformer rating.
 

Ian0

Joined Aug 7, 2020
13,114
Samantha Groves said:
Suppose you have a transformer with a VA rating of 1 kVA and a nominal voltage of 120 V... What will happen then?
It will get warm.
Running a 1 kVA transformer at 1.8 kVA is quite common for short periods of time. For instance, power tool transformers are generally rated at three times their actual VA, because they only get used intermittently.
Transformers in Victron inverters are run at about that ratio. A 5 kVA Victron inverter has two 1.5 kVA transformers, but the inverter has fan cooling and a temperature sensor built into the transformer. It is common practice, as the transformer will withstand a brief overload, and using a smaller transformer cuts the core losses.
Similarly, transformers in audio power amplifiers are also overrun (theoretically), as music has a high crest factor, meaning that the long-term average power is not exceeded.
Transformer power is a thermal rating.
On the other hand, transformer voltage isn't: exceeding the rated VOLTAGE will lead to saturation of the core, massive overheating, and failure.
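Since the rating is thermal, an intermittent overload can be expressed as an RMS-equivalent load. A rough sketch, assuming the on/off period is much shorter than the transformer's thermal time constant; the duty-cycle figures are invented for illustration:

```python
import math

# RMS-equivalent loading for intermittent duty. Valid roughly when the
# on/off cycle is much shorter than the thermal time constant.
S_rating = 1000.0   # VA, thermal rating
S_burst = 1800.0    # VA during the burst (the figure from the question)

for duty in (0.1, 0.3, 0.5):   # fraction of time at full burst (illustrative)
    s_rms = S_burst * math.sqrt(duty)   # current ~ S, losses ~ current^2
    status = "OK" if s_rms <= S_rating else "too hot"
    print(f"duty {duty:.0%}: RMS-equivalent load {s_rms:.0f} VA -> {status}")
```

At around 30% duty the 1.8 kVA bursts average out to roughly the 1 kVA thermal rating, which matches the "rated at three times their actual VA" rule of thumb for intermittent tools.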
 

MisterBill2

Joined Jan 23, 2018
27,319
MrAl said:
Short answer is: it overheats and can melt the insulation layers, ruining the transformer for good.
First, Reddit apparently has no more clue than any stray cat! At least not in this area.
The power rating of a transformer is the load output power, usually for continuous duty, but sometimes for short-term loading. Consider any transformer intended to connect to your mains power supply. THE CAPABILITY OF THE MAINS source vastly exceeds the rating of your transformer, and yet the transformer does not even get hot unless you are using it CLOSE TO ITS OUTPUT RATINGS.
The power into any load is controlled by the load, not by the source, assuming that the source voltage is the correct load input voltage.
 

MrAl

Joined Jun 17, 2014
13,684
MisterBill2 said:
The power into any load is controlled by the load, not by the source, assuming that the source voltage is the correct load input voltage.
Hello,

Did I imply that an input voltage of 120 VAC would damage the unit? That was not what I actually meant.

If the input voltage to the transformer stays at 120 VAC, then there should be minimal heating unless the load is too heavy.
The only thing to consider then is that the input voltage might change due to maximum power point tracking. Max power tracking means the controller will try to provide the maximum power available from the solar array, and that could have various effects, such as a higher output voltage. However, I find it hard to imagine that a designer would allow an output voltage greater than 120 VAC if the converter was intended to provide 120 VAC to mimic the mains power line.

The long and the short of it is that any question in this area should be followed by a test. That's why we take measurements. Using a voltmeter, monitor the 120 VAC output of the solar panel plus converter to make sure the voltage stays around 120 VAC. The limits should be 108 VAC to 132 VAC, and I would take those two as absolute worst cases.
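The 108 to 132 VAC window above is just ±10% of 120 VAC nominal, so a measured reading is easy to check; a small sketch (the sample readings are hypothetical):

```python
# Check a measured voltage against the +/-10% window mentioned above.
V_NOMINAL = 120.0
V_MIN, V_MAX = 108.0, 132.0   # +/-10% of nominal, the worst-case limits above

def check_voltage(v_measured: float) -> str:
    """Classify a measured AC voltage against the 108-132 VAC window."""
    if V_MIN <= v_measured <= V_MAX:
        return f"{v_measured:.1f} VAC: within limits"
    return f"{v_measured:.1f} VAC: OUT OF LIMITS -- investigate before connecting"

print(check_voltage(121.4))   # example reading (hypothetical)
print(check_voltage(135.0))   # example reading (hypothetical)
```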

My advice to the original poster:
If there is no AC voltmeter handy, then purchase one immediately, and then make the measurements.
 

MisterBill2

Joined Jan 23, 2018
27,319
My response in post #16 was based on the AC supply voltage being reasonably stable. BUT even with a non-regulated AC source, the power input to a transformer is primarily determined by the output load current.
A "maximum power" inverter with a non-regulated output would CERTAINLY NOT be suitable for powering any voltage-sensitive load, including a non-regulated battery charging system.
My impression is that most "maximum power" inverters adjust their ratio to provide the required output VOLTAGE.
An inverter whose output voltage varies with the input voltage is by no means a regulated max power device.
 

MrAl

Joined Jun 17, 2014
13,684
MisterBill2 said:
My response in post #16 was based on the AC supply voltage being reasonably stable... My impression is that most "maximum power" inverters adjust their ratio to provide the required output VOLTAGE.
Hi,

Max power tracking converters assume there is a mains load, either directly or through a transformer.
The solar setup is max-power tracking, which means it provides a voltage slightly higher than the mains voltage and puts out whatever current it can, whether that is low with little sunlight or high with a lot of sunlight. In these scenarios the mains line is considered a low-impedance load, not a source, so no matter how much current you inject into the line, the voltage of the mains stays relatively constant. The secondary would be connected to the mains line here, not the primary.
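The "mains as a low-impedance load" point can be put in numbers: current injected into a stiff grid barely moves the local voltage. The grid source impedance below is an assumption chosen purely for illustration:

```python
# Injecting current into a stiff (low-impedance) mains line.
# The grid's effective source impedance below is an illustrative assumption.
V_grid = 120.0     # volts, mains voltage
Z_grid = 0.1       # ohms, assumed effective grid source impedance
I_injected = 15.0  # amps pushed into the line by the inverter

# The injected current raises the voltage at the injection point by only I * Z.
dV = I_injected * Z_grid
print(f"voltage rise at the injection point: {dV:.1f} V "
      f"({100 * dV / V_grid:.1f}% of nominal)")
```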
 