Edit:
I found some of the answers on this page:
http://www.pcguide.com/ref/power/ext/basicsPower-c.html
But where it reads:
where "cosine" is the trigonometric function. "cosine(phase)" is also called the power factor of the load. Let's try an example. Let's suppose we are trying to run a power supply and the power supplied is 115V voltage and 2A of current. The apparent power is 115 * 2 = 230 VA. If the nature of the power supply is that its voltage and current are out of phase by 50 degrees, then the power factor is cosine(50°) = 0.642 (sometimes expressed as 64.2%) and the power used by the load is 148 W.
Wasn't the apparent power 230 VA?
So the true power actually used is only 148 W.
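To sanity-check the page's arithmetic, here is a minimal sketch of the same calculation (the variable names are mine, not from the quoted page):

```python
import math

voltage = 115.0   # RMS voltage in volts
current = 2.0     # RMS current in amperes
phase_deg = 50.0  # phase angle between voltage and current, in degrees

apparent_power = voltage * current                # volt-amperes (VA)
power_factor = math.cos(math.radians(phase_deg))  # dimensionless, ~0.643
true_power = apparent_power * power_factor        # watts (W)

print(f"Apparent power: {apparent_power:.0f} VA")  # 230 VA
print(f"Power factor:   {power_factor:.3f}")
print(f"True power:     {true_power:.0f} W")       # 148 W
```

This reproduces the page's numbers: 230 VA of apparent power, but only about 148 W of real power delivered to the load.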
Why is this a problem for power utility companies? I thought that a poor power factor meant power was being wasted, but these calculations seem to show that less power is actually being used.
This is very confusing.