OK, I've been wondering this for a while.
What's the difference between VA and watts?
I know transformers are rated in VA (presumably volts * amps), but houses draw kilowatts, what's the difference??
A purely inductive load will return 100% of the power to the line, so it isn't really "used" power... and something similar happens for capacitance? Am I wrong?
Don't mean to hijack magnet's thread but I have always wondered the same myself.

Simple answer is power factor:
Watts = VA × power factor.
Loads generally are not purely resistive; they have inductive and/or capacitive properties.
When this happens, current and voltage are not in phase.
With a purely resistive load, voltage and current are in phase. This gives a power factor of 1, so watts = VA × 1.
For other loads, Google power factor calculations. Just too much to post.
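To make the arithmetic concrete, here's a minimal sketch of watts = VA × power factor (the function name and the 230 V / 10 A figures are just illustrative):

```python
import math

def real_power(volts_rms, amps_rms, phase_deg):
    """Real power in watts from RMS volts, RMS amps, and the phase
    angle between them; the power factor is cos(phase)."""
    va = volts_rms * amps_rms                 # apparent power, VA
    power_factor = math.cos(math.radians(phase_deg))
    return va * power_factor                  # real power, W

# Purely resistive load: 0 degree phase, PF = 1, so watts = VA
print(real_power(230, 10, 0))     # 2300.0 W from 2300 VA

# Inductive load lagging by 60 degrees: PF = 0.5
print(real_power(230, 10, 60))    # ~1150 W from the same 2300 VA
```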
You two seem to say the same thing, and it seems to make sense.

The thing to remember is that real power is only generated by the part of the voltage and current that are in phase at any instant. Thus the phase shift between voltage and current due to inductance or capacitance will cause the real power to be less than the simple product of measured volts times measured amps (VA).
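That can be checked numerically by averaging the instantaneous product v(t)·i(t) over one cycle (a sketch with made-up peak values; 90° is the purely inductive case from earlier in the thread):

```python
import math

def avg_power(v_peak, i_peak, phase_deg, steps=100_000):
    """Average of instantaneous power v(t)*i(t) over one full cycle,
    with the current lagging the voltage by phase_deg."""
    phi = math.radians(phase_deg)
    total = 0.0
    for k in range(steps):
        wt = 2 * math.pi * k / steps
        total += (v_peak * math.sin(wt)) * (i_peak * math.sin(wt - phi))
    return total / steps

# In phase: average power is Vp*Ip/2 (= Vrms * Irms)
print(avg_power(325, 14.1, 0))    # ~2291 W
# 90 degrees out of phase: average power ~0, all returned to the line
print(avg_power(325, 14.1, 90))   # ~0 W
```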
From what I got out of that, you seem to be saying the same thing.

In some contexts, such as AC motors, the difference between VA ratings and wattage is mainly a question of phase angle, so a power factor is defined as the cosine of the phase angle between voltage and current.
For transformers, the nature of the load waveform may also be very important. This is especially so when the transformer feeds a large reservoir capacitor via a rectifier. In this case the current waveform may be sharply peaked, with a narrow conduction angle, so the RMS current may be relatively large compared to the average DC output current.
The transformer windings can only handle the RMS current they are rated for without getting too hot, so the more sharply peaked the waveform, the larger the VA rating needed to deliver a given output power.
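A toy model of that effect, using a rectangular current pulse as a crude stand-in for the real peaked waveform (numbers are illustrative):

```python
import math

def rms_and_avg(conduction_frac, i_avg=1.0, steps=100_000):
    """Rectangular current pulse flowing for conduction_frac of the
    cycle, scaled so the average (DC) current stays at i_avg.
    Returns (rms, avg)."""
    peak = i_avg / conduction_frac        # shorter pulse -> higher peak
    on = int(steps * conduction_frac)     # samples where current flows
    rms = math.sqrt(on * peak * peak / steps)
    avg = on * peak / steps
    return rms, avg

# Conducting 100% of the time: RMS equals the average current.
print(rms_and_avg(1.0))    # (1.0, 1.0)
# Crammed into 20% of the cycle: RMS is sqrt(5) ~ 2.24x the average,
# so the windings heat as if carrying 2.24x the DC output current.
print(rms_and_avg(0.2))    # (~2.236, 1.0)
```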
You seem to disagree, yet from what I can tell, you are saying approximately the same thing...

The difference is that the terms refer to two completely different things. Watts is power utilized or consumed. VA is a measure of the transformer's magnetic coupling, which provides a volt/amp ratio from input to output.
-------------------------------------------------------

Although phase shifting and power factor are implicated, they have little to do with the VA rating of a transformer. Such a rating indicates the 'capacity' of the transformer. Fully loaded, a transformer approaches unity, but consumes no power itself (less losses). The load (not the transformation of voltage/current) dictates the resulting phase displacement.
That makes sense, thank you.

Manufacturers can tell you how much VA their products can handle, but how much power that represents in practice depends on the power factor, which they do not control. It is possible to use up the full rating of a transformer by loading it with a device such as a low-loss capacitor, which has a very nearly 90° phase angle and dissipates negligible power.
The wattage can be equal to the VA rating if the current is perfectly in phase with the voltage, and the current waveform is the same shape as the (normally sinusoidal) voltage.
Typically, because power factors are less than one, transformers, cables, and other equipment have to be rated for higher currents than the power requirement alone would suggest. All this extra VA rating is costly, so for industrial customers at least, supply companies may charge more money if the PF is low.
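To see why low PF costs the supplier money, compare the line current needed for the same real power (illustrative 230 V figures; the helper name is made up):

```python
def required_current(power_w, volts_rms, power_factor):
    """Line current (A) needed to deliver power_w real watts at a
    given power factor; VA = W / PF, then amps = VA / volts."""
    va = power_w / power_factor
    return va / volts_rms

# Same 10 kW load at 230 V:
print(round(required_current(10_000, 230, 1.0), 1))   # 43.5 A at unity PF
print(round(required_current(10_000, 230, 0.7), 1))   # 62.1 A at PF 0.7
```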
I misunderstood the question, thinking you were questioning the terms as ratings.

(Trig ain't my strong point.)