Power factor through an inverter

Thread Starter

Lectraplayer

Joined Jan 2, 2015
123
Something I have been curious about, and will have to account for in some DC-to-AC applications (including vehicular power and solar), is what the power factor of my AC appliances does to my DC source, and where that lost power goes. My first guess, which I have not measured, is that the power becomes heat in the inverter supplying my AC load (we'll say a motorized tool).

Of course, with a motor, peak current happens after peak voltage has dropped back, due to inductance. We'll say it is 1.67 amps at 87 volts, but on 120-volt mains it uses 70 watts. But when I cross my (common AutoZone-style) inverter and go to my battery, power factor shouldn't amount to a hill of beans, since the DC voltage, and consequently the current, doesn't change due to inductance. (The inverter may do other things to change the current instead.)

Which calculation usually makes the most sense for an inverter? Would I have to account for about 7 amps or so (70 watts at 12 volts, plus inverter losses), or would I have to estimate closer to a 17 amp draw (1.67 amps at 120 volts mains, converted to 12 volts, plus inverter losses)? What makes the inverter work out one way or the other?
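The two estimates in the question can be put side by side in a quick sketch (the 70 W, 1.67 A, and 120 V figures are the poster's; the rest follows from them):

```python
# Figures from the question above: a motor load drawing 1.67 A RMS at
# 120 V mains while consuming only 70 W of real power.
V_MAINS = 120.0   # V RMS
I_LOAD = 1.67     # A RMS
P_REAL = 70.0     # W real power actually consumed by the motor
V_BATT = 12.0     # V DC battery

apparent = V_MAINS * I_LOAD      # apparent power, volt-amperes (~200 VA)
pf = P_REAL / apparent           # implied power factor (~0.35)

# Estimate 1: battery supplies only the real power (ideal inverter):
i_real = P_REAL / V_BATT         # ~5.8 A (plus losses -> the "7 A" guess)

# Estimate 2: battery supplies the full apparent power:
i_apparent = apparent / V_BATT   # ~16.7 A (the "17 A" guess)

print(f"PF = {pf:.2f}")
print(f"real-power estimate:     {i_real:.1f} A")
print(f"apparent-power estimate: {i_apparent:.1f} A")
```

As the replies below explain, a well-designed inverter lands near the first estimate, because the out-of-phase current is recirculated rather than drawn continuously from the battery.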
 

MrAl

Joined Jun 17, 2014
11,486
Hello there,

If you have a 120 watt load at 120 V, that is 1 amp, unless the power factor is not unity.
With unity power factor and a 100 percent efficient inverter, there would be a 10 amp draw from a 12.0 V battery. With a less efficient inverter (as in real life) the current draw is proportionally higher, and the power lost to that inefficiency is dissipated as heat.
If the power factor is not unity, though, then the operation depends on the design of the inverter. A decent inverter will have input filter caps large enough to handle the out-of-phase current, absorbing and delivering power as the load requires. If it is not a decent inverter, more current is drawn from the battery and the battery itself partly handles the out-of-phase current, so it discharges more at some instants than others, and may even charge for short intervals. Either way, the out-of-phase current is recirculated, by the input caps or by the battery, and that recirculation is not a true 'loss' in the system (although it does cause some extra side-effect loss). The true loss, which is much smaller, comes from the extra current that everything has to handle: the wiring, the transistor drops, the transformer windings, and so on. That loss, although smaller, is also dissipated as heat.

So the inefficiency loss is dissipated as heat, the apparent power-factor "loss" is actually recirculated, and the side-effect loss caused by the higher out-of-phase output current is also dissipated as heat, but it is much smaller than the other losses.
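The point that the out-of-phase current carries no net energy can be shown numerically. This is a sketch using the load figures from the first post and an assumed power factor of 0.35; it averages the instantaneous power v(t)·i(t) over one cycle:

```python
import math

# Assumed figures (from the thread's example load): 120 V RMS, 1.67 A RMS,
# power factor 0.35, so the current lags the voltage by phi = acos(0.35).
V_RMS, I_RMS, PF = 120.0, 1.67, 0.35
phi = math.acos(PF)

# Average v(t)*i(t) over one full cycle, sampled at N points:
N = 100_000
avg_p = sum(
    (V_RMS * math.sqrt(2) * math.sin(2 * math.pi * k / N))
    * (I_RMS * math.sqrt(2) * math.sin(2 * math.pi * k / N - phi))
    for k in range(N)
) / N

# The average equals V*I*cos(phi) (~70 W), not V*I (~200 VA): the
# out-of-phase portion of the current averages to zero over each cycle.
print(f"average (real) power = {avg_p:.1f} W")
print(f"apparent power       = {V_RMS * I_RMS:.1f} VA")
```

The difference between the ~200 VA and the ~70 W is the energy that sloshes back and forth each cycle and must be absorbed by the input caps (or the battery), as described above.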
 

tranzz4md

Joined Apr 10, 2015
310
Lots of inverter schemes out there, although only a few methods dominate the market currently. You must remember that your small solid-state inverter doesn't produce a sine-wave output, but your electromagnetic AC motors present a sine-wave load. The inverter is limited both in its current output capability and in its total power handling. What have become known simply as VFDs are rectifier/inverters, and in many, many applications they have required inductors and/or transformers between the drive and the load precisely because of non-unity power factors. Output components have been beefed up, etc., because of those restrictions in the ampacity of those devices and the resultant high failure rates.

You won't have a "17 A" draw, but the instantaneous current draw at some points will be limited only by the inverter's actual circuit impedance. Think of the load's electrical characteristics and instantaneous demands as opposed to the nominal capabilities of the power source, which in this case is actually a semiconductor "switch" fed by a "battery".
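The gap between the average draw and the instantaneous demand can be sketched with the thread's numbers (a lossless inverter with no input filtering is assumed here, so the whole instantaneous output power reflects straight onto the battery, which is the worst case):

```python
import math

# Assumed load from the thread: 120 V RMS, 1.67 A RMS, power factor 0.35.
# For sinusoidal v and i with phase angle phi, the instantaneous power is
#   p(t) = V*I*cos(phi) - V*I*cos(2*w*t - phi)
# whose peak is V*I*(1 + cos(phi)) -- far above the ~70 W average.
V_RMS, I_RMS, PF = 120.0, 1.67, 0.35
V_BATT = 12.0

S = V_RMS * I_RMS                # apparent power, ~200 VA
p_peak = S * (1 + PF)            # peak instantaneous power, ~270 W
i_batt_peak = p_peak / V_BATT    # ~22.5 A peak demand on the 12 V side

print(f"peak instantaneous power  = {p_peak:.0f} W")
print(f"peak battery-side current = {i_batt_peak:.1f} A")
```

A real inverter's input caps supply most of that peak, but the switches and wiring still have to be rated for instantaneous currents well above what the 70 W average would suggest, which is tranzz4md's point about circuit impedance being the only limit at some instants.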
 