I am having trouble grasping how voltage drop, power loss, and power factor can be reconciled in a given circuit. I'll use the following example, worked at two different power factors, to illustrate:
Two-wire, single-phase circuit; load of 1,000 watts @ 100 volts, connected to the load by wire that has a resistance of 0.2 ohms per conductor. The voltage of the generator can be adjusted to make sure that the load sees 100 V.
I will calculate the generator voltage set point and the resulting power that the generator must produce when the load has a power factor of 1.0 and of 0.5. Then I can calculate the power loss in two different ways: using the formula PL = 2*r*I^2, and also by subtracting the power seen by the load from the power required of the generator.
For the voltage drop formula I'll ignore reactance to simplify, so VD = 2*r*I
For current required by load I’ll use I = watts / (V*PF)
When PF = 1.0, I = 10 amps
VD = 2*0.2*10 = 4 volts
PL = 2*0.2*10^2 = 40 watts
So we set generator at 104 volts and generator power = 104*10 = 1,040 watts; load sees 100 V, so power at load is 100*10 = 1,000 watts; power loss is 1,040 – 1,000 = 40 watts, same as calculated above.
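To double-check the arithmetic, here is a quick Python sketch of the PF = 1.0 case (the variable names are my own; it just repeats the hand calculation above):

```python
# PF = 1.0 case: purely resistive, so voltages add arithmetically
P_load = 1000.0   # watts demanded by the load
V_load = 100.0    # volts at the load terminals
r = 0.2           # ohms per conductor
pf = 1.0          # load power factor

I = P_load / (V_load * pf)   # line current: 10 A
VD = 2 * r * I               # round-trip voltage drop: 4 V
PL = 2 * r * I**2            # I^2*R loss in both conductors: 40 W
V_gen = V_load + VD          # generator set point: 104 V
P_gen = V_gen * I            # generator power: 1,040 W

print(P_gen - P_load)        # 40.0 -- matches PL
```

Both ways of computing the loss agree here, as expected.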
When PF = 0.5, I = 20 amps
VD = 2*0.2*20 = 8 volts
PL = 2*0.2*20^2 = 160 watts
Keeping in mind that the generator produces watts, not VA, we set the generator at 108 volts and generator power = 108*20*0.5 = 1,080 watts; the load sees 100 V, so power at the load is still 1,000 watts; power loss is 1,080 – 1,000 = 80 watts, which is not equal to the 160 watts calculated above (it's actually 160*PF). What happened to the extra 80 watts of power loss? Does the generator somehow not have to produce this power?
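The same sketch for the PF = 0.5 case reproduces the discrepancy. It makes the same simplifying assumptions as my hand calculation (reactance ignored, voltage drop added arithmetically, generator watts taken as V*I*PF), so it shows the mismatch rather than explaining it:

```python
# PF = 0.5 case, using the same simplified formulas as above
P_load = 1000.0   # watts demanded by the load
V_load = 100.0    # volts at the load terminals
r = 0.2           # ohms per conductor
pf = 0.5          # load power factor

I = P_load / (V_load * pf)   # line current: 20 A
VD = 2 * r * I               # round-trip voltage drop: 8 V
PL = 2 * r * I**2            # I^2*R loss in both conductors: 160 W
V_gen = V_load + VD          # generator set point: 108 V
P_gen = V_gen * I * pf       # generator watts (not VA): 1,080 W

print(PL)                    # 160.0
print(P_gen - P_load)        # 80.0 -- only half of PL
```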
Can someone please help clarify where the inconsistency in this reasoning is?
Thank you!