difference between VA and Watts?

Thread Starter

magnet18

Joined Dec 22, 2010
1,227
OK, I've been wondering this for a while.
What's the difference between VA and watts?
I know transformers are rated in VA (presumably volts * amps), but houses draw kilowatts, so what's the difference?
 

Lundwall_Paul

Joined Oct 18, 2011
236
The simple answer is power factor.

Watts = VA × power factor.

Loads generally are not purely resistive. Loads have inductive and/or capacitive properties.
When this happens, current and voltage are not in phase.

With a purely resistive load, voltage and current are in phase. This gives a power factor of 1.

So watts = VA × 1.

For other loads, Google "power factor calculations". Just too much to post.
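
A minimal sketch of that formula in Python, with made-up numbers (120 V RMS, 5 A RMS, a 30° lagging current) purely for illustration:

```python
import math

# Hypothetical numbers, just for illustration: a 120 V RMS supply delivering
# 5 A RMS to a load whose current lags the voltage by 30 degrees.
volts_rms = 120.0
amps_rms = 5.0
phase_deg = 30.0

apparent_power_va = volts_rms * amps_rms              # VA = V * A
power_factor = math.cos(math.radians(phase_deg))      # PF = cos(phase angle)
real_power_w = apparent_power_va * power_factor       # W = VA * PF

print(f"Apparent power: {apparent_power_va:.0f} VA")  # 600 VA
print(f"Power factor:   {power_factor:.3f}")          # 0.866
print(f"Real power:     {real_power_w:.0f} W")        # 520 W
```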
 

Thread Starter

magnet18

Joined Dec 22, 2010
1,227
I see, so the capacitance will take a lot of current at first, dropping the voltage? Then once the capacitance has been... filled? (right word?) the voltage will rise to its max.

And something similar happens for inductance?

Am I wrong?
 

GetDeviceInfo

Joined Jun 7, 2009
2,192
magnet18 said:
OK, I've been wondering this for a while.
What's the difference between VA and watts?
I know transformers are rated in VA (presumably volts * amps), but houses draw kilowatts, so what's the difference?
The difference is that the terms refer to two completely different things. Watts is power utilized or consumed. VA is a measure of the transformer's magnetic coupling, which provides a volt/amp ratio from input to output.
 

thatoneguy

Joined Feb 19, 2009
6,359
magnet18 said:
And something similar happens for inductance?

Am I wrong?
A purely inductive load will return 100% of the power to the line; it's just not "used" power.

That's why industry uses large power factor correction capacitors to offset the inductive motor loading and bring the real power closer to V×A.

The problem with inductors returning current to the power line is that the current isn't in phase with the voltage, and is thus useless. Many people think they've stumbled on "free electricity" when they figure this out or try to measure it. Hence all the coils and magnets in overunity devices, since they aren't understood.
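
A rough numerical check of that point, assuming an ideal inductor and arbitrary peak values (170 V, 2 A): the instantaneous product of volts and amps averages out to roughly zero over a full cycle, even though the VA is substantial.

```python
import math

# Sketch: instantaneous power into a purely inductive load, where the
# current lags the voltage by 90 degrees. Peak values are arbitrary.
v_peak, i_peak = 170.0, 2.0     # assumed peak volts and amps
n = 10_000                      # samples over exactly one cycle

avg_power = sum(
    (v_peak * math.sin(2 * math.pi * k / n))
    * (i_peak * math.sin(2 * math.pi * k / n - math.pi / 2))
    for k in range(n)
) / n

apparent_va = (v_peak / math.sqrt(2)) * (i_peak / math.sqrt(2))

print(f"Average real power: {avg_power:.3f} W")     # ~0 W: the energy comes back
print(f"Apparent power:     {apparent_va:.0f} VA")  # 170 VA
```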
 

crutschow

Joined Mar 14, 2008
34,283
The thing to remember is that real power is only generated by the part of the voltage and current that are in phase at any instant. Thus the phase shift between voltage and current due to inductance or capacitance causes the real power to be less than the product of the measured (RMS) volts and the measured (RMS) amps (the VA).
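
A quick sketch of how that shortfall grows with phase angle, assuming 1 V RMS and 1 A RMS so the apparent power is exactly 1 VA in every case:

```python
import math

# Real watts delivered per 1 VA of apparent power as the phase shift
# between voltage and current grows (1 V RMS and 1 A RMS assumed).
for phase_deg in (0, 30, 60, 90):
    watts_per_va = math.cos(math.radians(phase_deg))
    print(f"{phase_deg:2d} deg shift: {watts_per_va:.2f} W per VA")
```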
 

GetDeviceInfo

Joined Jun 7, 2009
2,192
Although phase shifting and power factor are implicated, they have little to do with the VA rating of a transformer. Such a rating indicates the 'capacity' of the transformer. Fully loaded, a transformer approaches unity, but consumes no power itself (apart from losses). The load (not the transformation of voltage/current) dictates the resulting phase displacement.
 

Adjuster

Joined Dec 26, 2010
2,148
In some contexts such as AC motors, the difference between VA ratings and wattage is mainly a question of phase angle, so that a power factor is defined as the cosine of the phase angle between voltage and current.

For transformers, the nature of the load waveform may also be very important. This is especially so when the transformer is feeding via a rectifier into a large reservoir capacitor. In this case, the current waveform may be sharply peaked, with a narrow conduction angle. As a result, the RMS current may be relatively large compared to the average DC output current.

The transformer windings can only handle the RMS current they are rated for without getting too hot, so the more sharply peaked the waveform, the larger the VA rating needed to deliver a given output power.
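
A crude numerical sketch of that effect. The pulse shape and the 10 A peak are invented, purely to show how the RMS of a narrow charging pulse outruns its average:

```python
import math

# Sharply peaked rectifier charging current, modelled as a half-sine pulse
# confined to a 30-degree conduction window in each half-cycle.
n = 100_000
conduction_frac = 30.0 / 180.0      # fraction of each half-cycle conducting
i_peak = 10.0                       # assumed peak charging current, amps

samples = [
    i_peak * math.sin(math.pi * (k / n) / conduction_frac)
    if (k / n) < conduction_frac else 0.0
    for k in range(n)
]

i_avg = sum(samples) / n
i_rms = math.sqrt(sum(s * s for s in samples) / n)

print(f"Average current: {i_avg:.2f} A")   # ~1 A
print(f"RMS current:     {i_rms:.2f} A")   # ~2.9 A
# Winding heat follows the RMS value, so the transformer needs a VA rating
# well above what the DC output power alone would suggest.
```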
 

spinnaker

Joined Oct 29, 2009
7,830
Lundwall_Paul said:
The simple answer is power factor.

Watts = VA × power factor.

Loads generally are not purely resistive. Loads have inductive and/or capacitive properties.
When this happens, current and voltage are not in phase.

With a purely resistive load, voltage and current are in phase. This gives a power factor of 1.

So watts = VA × 1.

For other loads, Google "power factor calculations". Just too much to post.
Don't mean to hijack magnet's thread but I have always wondered the same myself.

So from this explanation, the reason transformers and other devices are rated in VA is that the watts would depend on the power factor of the load, and therefore the watts really can't be specified by the manufacturer?
 

Thread Starter

magnet18

Joined Dec 22, 2010
1,227
Lundwall_Paul said:
The simple answer is power factor.

Watts = VA × power factor.

Loads generally are not purely resistive. Loads have inductive and/or capacitive properties.
When this happens, current and voltage are not in phase.

With a purely resistive load, voltage and current are in phase. This gives a power factor of 1.

So watts = VA × 1.

For other loads, Google "power factor calculations". Just too much to post.
crutschow said:
The thing to remember is that real power is only generated by the part of the voltage and current that are in phase at any instant. Thus the phase shift between voltage and current due to inductance or capacitance causes the real power to be less than the product of the measured (RMS) volts and the measured (RMS) amps (the VA).
You two seem to say the same thing, and it seems to make sense.

Adjuster said:
In some contexts such as AC motors, the difference between VA ratings and wattage is mainly a question of phase angle, so that a power factor is defined as the cosine of the phase angle between voltage and current.

For transformers, the nature of the load waveform may also be very important. This is especially so when the transformer is feeding via a rectifier into a large reservoir capacitor. In this case, the current waveform may be sharply peaked, with a narrow conduction angle. As a result, the RMS current may be relatively large compared to the average DC output current.

The transformer windings can only handle the RMS current they are rated for without getting too hot, so the more sharply peaked the waveform, the larger the VA rating needed to deliver a given output power.
From what I got out of that, you seem to be saying the same thing.
(trig ain't my strong point)

GetDeviceInfo said:
The difference is that the terms refer to two completely different things. Watts is power utilized or consumed. VA is a measure of the transformer's magnetic coupling, which provides a volt/amp ratio from input to output.
GetDeviceInfo also said:
Although phase shifting and power factor are implicated, they have little to do with the VA rating of a transformer. Such a rating indicates the 'capacity' of the transformer. Fully loaded, a transformer approaches unity, but consumes no power itself (apart from losses). The load (not the transformation of voltage/current) dictates the resulting phase displacement.
You seem to disagree, yet from what I can tell, you are saying approximately the same thing... :confused:
 

Adjuster

Joined Dec 26, 2010
2,148
Manufacturers can tell how much VA their products can handle, but how much power that will represent in practice depends on the power factor, which they do not control. It is possible to use up the full rating of a transformer by loading it with a device such as a low loss capacitor, having a very nearly 90° phase angle and dissipating negligible power.

The wattage can be equal to the VA rating if the current is perfectly in phase with the voltage, and the current waveform is the same shape as the (normally sinusoidal) voltage.

Typically, because of power factors of less than one, transformers, cable, and other equipment have to be rated for higher currents than might be worked out from the power required. All these extra VA ratings are costly, so for industrial customers at least, supply companies may charge more money if the PF is low.
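
A small worked example of that point, assuming a hypothetical 10 kW load on a 240 V single-phase supply:

```python
# Sketch: the extra current (and VA rating) a low power factor costs.
# The 10 kW load and 240 V supply are assumed values, just for illustration.
real_power_w = 10_000.0
volts_rms = 240.0

for pf in (1.0, 0.9, 0.7):
    apparent_va = real_power_w / pf     # VA the transformer/cabling must carry
    amps = apparent_va / volts_rms      # current in the wiring
    print(f"PF {pf:.1f}: {apparent_va / 1000:.1f} kVA, {amps:.0f} A")

# PF 1.0: 10.0 kVA, 42 A
# PF 0.9: 11.1 kVA, 46 A
# PF 0.7: 14.3 kVA, 60 A
```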
 

Thread Starter

magnet18

Joined Dec 22, 2010
1,227
Adjuster said:
Manufacturers can tell how much VA their products can handle, but how much power that will represent in practice depends on the power factor, which they do not control. It is possible to use up the full rating of a transformer by loading it with a device such as a low loss capacitor, having a very nearly 90° phase angle and dissipating negligible power.

The wattage can be equal to the VA rating if the current is perfectly in phase with the voltage, and the current waveform is the same shape as the (normally sinusoidal) voltage.

Typically, because of power factors of less than one, transformers, cable, and other equipment have to be rated for higher currents than might be worked out from the power required. All these extra VA ratings are costly, so for industrial customers at least, supply companies may charge more money if the PF is low.
That makes sense, thank you.
But what is PF?
 

thatoneguy

Joined Feb 19, 2009
6,359
I think PF is a valid abbreviation, since PFC is pretty much universally known as "Power Factor Correction" and is commonly used.

PF is listed as "Power Factor" at abbreviations.com as well.
 

GetDeviceInfo

Joined Jun 7, 2009
2,192
magnet18 said:
You two seem to say the same thing, and it seems to make sense.

From what I got out of that, you seem to be saying the same thing.
(trig ain't my strong point)

You seem to disagree, yet from what I can tell, you are saying approximately the same thing... :confused:
I misunderstood the question, thinking you were questioning the terms as ratings.
 