Something I have been curious about, and will have to account for in some DC-to-AC applications (vehicle power and solar), is what the power factor of my AC appliances does to my DC source, and where that lost power goes. My first guess, which I have not measured yet, is that the power becomes heat in the inverter supplying my AC load.

Take a motorized tool as the load. With a motor, the current lags the voltage because of inductance, so peak current occurs after the voltage has already dropped back from its peak. Say it draws 1.67 amps while the instantaneous voltage is 87 volts, yet on 120 volt mains it consumes 70 watts of real power.

But when I cross my (common AutoZone-style) inverter and look at the battery, power factor shouldn't amount to a hill of beans: the DC voltage is constant, so the load's inductance can't shift the current out of phase with it. (The inverter may do other things that change the current instead.)

Which calculation usually makes the most sense for an inverter? Should I budget for roughly a 7 amp draw (70 watts at 12 volts, plus inverter losses), or closer to a 17 amp draw (1.67 amps at 120 volt mains, converted down to 12 volts, plus inverter losses)? What makes the inverter work out this way?
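For reference, here is the arithmetic behind those two estimates, using only the figures above (the 200 VA value is just 120 V times 1.67 A; nothing here is measured, and the "plus losses" roundings are my own):

$$ S = V_\text{rms} \times I_\text{rms} = 120\ \text{V} \times 1.67\ \text{A} \approx 200\ \text{VA}, \qquad \text{PF} = \frac{P}{S} = \frac{70\ \text{W}}{200\ \text{VA}} = 0.35 $$

If the battery only has to supply the real power:

$$ I_\text{batt} \approx \frac{P}{V_\text{batt}} = \frac{70\ \text{W}}{12\ \text{V}} \approx 5.8\ \text{A} \quad \text{(plus inverter losses, so roughly 7 A)} $$

If it had to supply the full apparent power:

$$ I_\text{batt} \approx \frac{S}{V_\text{batt}} = \frac{200\ \text{VA}}{12\ \text{V}} \approx 16.7\ \text{A} \quad \text{(roughly 17 A, before losses)} $$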