# difference between VA and Watts?

Discussion in 'General Electronics Chat' started by magnet18, Dec 15, 2011.

1. ### magnet18 Thread Starter Senior Member

Dec 22, 2010
1,232
125
OK, I've been wondering this for a while.
What's the difference between VA and watts?
I know transformers are rated in VA (presumably volts × amps), but houses draw kilowatts, so what's the difference?

2. ### Lundwall_Paul Active Member

Oct 18, 2011
222
19

Watts = VA × power factor.

Loads are generally not purely resistive. Loads have inductive and/or capacitive properties.
When that happens, current and voltage are not in phase.

With a purely resistive load, current and voltage are in phase. That gives a power factor of 1.

So watts = VA × 1.

For other loads, Google "power factor calculations". Just too much to post.
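The watts = VA × power factor relationship can be sketched numerically. This is an illustrative Python snippet, not from the thread; the `power_summary` helper and its values are made up for demonstration:

```python
import math

def power_summary(v_rms, i_rms, phase_deg):
    """Apparent, real, and reactive power for sinusoidal V and I
    separated by phase_deg degrees (illustrative helper, made up here)."""
    apparent_va = v_rms * i_rms                 # VA = volts x amps
    pf = math.cos(math.radians(phase_deg))      # power factor = cos(phase angle)
    real_w = apparent_va * pf                   # watts = VA x power factor
    reactive_var = apparent_va * math.sin(math.radians(phase_deg))
    return apparent_va, real_w, reactive_var

# Resistive load: V and I in phase, so watts == VA
print(power_summary(230, 5, 0))      # (1150, 1150.0, 0.0)

# Partly inductive load: current lags by 60 degrees, PF = 0.5
print(power_summary(230, 5, 60))     # real watts are about half the VA
```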

3. ### magnet18 Thread Starter Senior Member

Dec 22, 2010
1,232
125
I see, the capacitance will draw a lot of current at first, dropping the voltage? Then once the capacitance has been... filled? (right word?) the voltage will rise to its max.

And something similar happens for inductance?

Am I wrong?

4. ### GetDeviceInfo Senior Member

Jun 7, 2009
1,571
230
The difference is that the terms refer to two completely different things. Watts is power utilized or consumed. VA is a measure of the transformer's magnetic coupling, which provides a volt/amp ratio from input to output.

5. ### thatoneguy AAC Fanatic!

Feb 19, 2009
6,357
718
A purely inductive load will return 100% of the power to the line; none of it is really "used" power.

That's why industry uses large power factor correction capacitors to offset the inductive motor loading and bring the real power closer to V × A.

The problem with inductors returning the current to the power line is the little problem of it not being in phase with the voltage, and thus useless. Many people have thought they stumbled on "free electricity" when they figured this out, or tried to measure it. Hence all the coils and magnets in overunity devices: they aren't understood.
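The sizing calculation behind those correction capacitors can be sketched with the standard textbook formulas Q = P·tan φ and C = Q / (2πfV²). Illustrative Python; the function name and the motor figures are invented for demonstration:

```python
import math

def pfc_capacitor_uF(p_watts, v_rms, freq_hz, pf_old, pf_new):
    """Capacitance (microfarads) to correct a lagging power factor
    from pf_old to pf_new. Textbook sizing sketch, not from the thread."""
    # Reactive power before and after correction: Q = P * tan(phi)
    q_old = p_watts * math.tan(math.acos(pf_old))
    q_new = p_watts * math.tan(math.acos(pf_new))
    q_cap = q_old - q_new                       # VARs the capacitor must supply
    # Q = V^2 * 2*pi*f*C  =>  C = Q / (2*pi*f*V^2)
    c_farads = q_cap / (2 * math.pi * freq_hz * v_rms ** 2)
    return c_farads * 1e6

# Hypothetical 10 kW motor load at PF 0.7, corrected to 0.95, 400 V / 50 Hz
print(round(pfc_capacitor_uF(10_000, 400, 50, 0.7, 0.95), 1))  # about 137.6 uF
```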

6. ### crutschow Expert

Mar 14, 2008
13,482
3,368
The thing to remember is that real power is generated only by the part of the voltage and current that are in phase at any instant. Thus the phase shift between voltage and current due to inductance or capacitance causes the real power to be less than the product of the RMS volts and RMS amps (VA).
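This can be checked numerically: averaging the instantaneous product v(t)·i(t) over one cycle comes out below Vrms × Irms whenever there is a phase shift. An illustrative Python sketch with made-up values:

```python
import math

# Average the instantaneous power v(t)*i(t) over one cycle and compare
# it with Vrms*Irms (apparent power). Values here are invented.
V_RMS, I_RMS, PHASE = 230.0, 5.0, math.radians(60)  # current lags 60 deg
N = 100_000

avg_power = sum(
    (math.sqrt(2) * V_RMS * math.sin(2 * math.pi * k / N)) *
    (math.sqrt(2) * I_RMS * math.sin(2 * math.pi * k / N - PHASE))
    for k in range(N)
) / N

print(avg_power)          # close to 575 W (= 1150 VA * cos 60 deg)
print(V_RMS * I_RMS)      # 1150.0 VA
```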

7. ### GetDeviceInfo Senior Member

Jun 7, 2009
1,571
230
Although phase shifting and power factor are implicated, they have little to do with the VA rating of a transformer. Such a rating indicates the 'capacity' of the transformer. Fully loaded, a transformer approaches unity, but consumes no power (apart from losses). The load (not the transformation of voltage/current) dictates the resulting phase displacement.

Dec 26, 2010
2,147
300
In some contexts such as AC motors, the difference between VA ratings and wattage is mainly a question of phase angle, so that a power factor is defined as the cosine of the phase angle between voltage and current.

For transformers, the nature of the load waveform may also be very important. This is especially so when the transformer is feeding via a rectifier into a large reservoir capacitor. In this case, the current waveform may be sharply peaked, with a narrow conduction angle. As a result, the RMS current may be relatively large compared to the average DC output current.

The transformer windings can only handle the RMS current they are rated for without getting too hot, so the more sharply peaked the waveform, the larger the necessary VA rating needed to deliver a given output power.
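The RMS-versus-average point can be shown with a crude numerical sketch (illustrative Python, invented numbers): a flat draw versus narrow pulses that deliver the same average current.

```python
import math

# Compare RMS and average current for two waveforms with the same
# average (DC) current: a flat 1 A draw vs. narrow 10% duty-cycle pulses.
# Rough stand-in for rectifier-plus-reservoir-capacitor charging current.

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def avg(samples):
    return sum(samples) / len(samples)

N = 1000
flat = [1.0] * N                                           # resistive-style draw
pulsed = [10.0 if k < N // 10 else 0.0 for k in range(N)]  # 10% conduction angle

print(avg(flat), rms(flat))       # 1.0 average, 1.0 RMS
print(avg(pulsed), rms(pulsed))   # 1.0 average, but about 3.16 RMS
```

Same average output current, but the pulsed waveform makes the windings carry over three times the RMS current, hence the larger VA rating needed.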

9. ### spinnaker AAC Fanatic!

Oct 29, 2009
5,065
1,176
Don't mean to hijack magnet's thread but I have always wondered the same myself.

So from this explanation, the reason transformers and other devices are rated in VA is that the watts depend on the power factor of the load, so watts really can't be specified by the manufacturer?

10. ### magnet18 Thread Starter Senior Member

Dec 22, 2010
1,232
125
You two seem to say the same thing, and it seems to make sense.

From what I got out of that, you seem to be saying the same thing.
(trig ain't my strong point)

You seem to disagree, yet from what I can tell, you are saying approximately the same thing...

Dec 26, 2010
2,147
300
Manufacturers can tell how much VA their products can handle, but how much power that will represent in practice depends on the power factor, which they do not control. It is possible to use up the full rating of a transformer by loading it with a device such as a low loss capacitor, having a very nearly 90° phase angle and dissipating negligible power.

The wattage can be equal to the VA rating if the current is perfectly in phase with the voltage, and the current waveform is the same shape as the (normally sinusoidal) voltage.

Typically, because of power factors of less than one, transformers, cables, and other equipment have to be rated for higher currents than the power alone would suggest. All this extra VA rating is costly, so for industrial customers at least, supply companies may charge more money if the PF is low.
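As a quick sketch of that sizing effect (illustrative Python, made-up figures; the `required_va` helper is invented):

```python
def required_va(load_watts, power_factor):
    """VA rating needed from a transformer or cable to deliver
    load_watts at a given power factor. Simple sizing sketch."""
    return load_watts / power_factor

# The same hypothetical 1 kW load needs more VA capacity as PF drops:
for pf in (1.0, 0.9, 0.7, 0.5):
    print(pf, required_va(1000, pf))   # at PF 0.5 the kit must carry 2000 VA
```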

12. ### magnet18 Thread Starter Senior Member

Dec 22, 2010
1,232
125
That makes sense, thank you.
But what is PF?

14. ### thatoneguy AAC Fanatic!

Feb 19, 2009
6,357
718
I think PF is a valid abbreviation, since PFC is pretty much universally known as "Power Factor Correction" and is commonly used.

PF is listed as "Power Factor" at abbreviations.com as well.

15. ### GetDeviceInfo Senior Member

Jun 7, 2009
1,571
230
I misunderstood the question, thinking you were questioning the terms as ratings.

16. ### magnet18 Thread Starter Senior Member

Dec 22, 2010
1,232
125
OK, thank you everybody, it all makes sense now.