Why a higher voltage?

Thread Starter

charley1957

Joined Jan 23, 2009
2
Recently I was trying to build a 400 VDC power supply to charge capacitors. I used an HID lighting ballast transformer to step up the voltage, but the output was not enough for what I needed. I decided to try putting a capacitor in parallel with the transformer, just as you would if you were using it for its intended application. Now I have way more than enough voltage. Can anyone tell me why this is so? I have a suspicion that it has something to do with power factor correction. Basically, my question is this: if the power factor is bad, will your voltage be lower than it would be if the power factor were corrected? I've looked high and low, but there's no easy answer to this. If this is not why I get more voltage with the capacitor in parallel, then what is the reason? Any insight would be greatly appreciated. Thanks!
 

Thread Starter

charley1957

Joined Jan 23, 2009
2
After I got the voltage I needed, I ran it through a bridge rectifier I made from microwave oven diodes to get the DC I wanted. In fact, after I put the capacitor in the circuit I had so much voltage that I had to feed the AC in through a rheostat to bring it back down. Everything works the way I want it to, and I have exactly the voltage I need now. I just want to know WHY adding the capacitor gave me dramatically more voltage than I had without it. At this point it's simply a matter of trying to get educated on the subject.
 

Wendy

Joined Mar 24, 2008
23,421
A capacitor charges up toward the peak value of the waveform instead of sitting at the RMS value. If you used a bridge rectifier, the DC output likely rose to about 1.4 times the AC voltage (peak = √2 × RMS).
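To put a number on that 1.4× figure: a filter capacitor on a bridge rectifier charges toward the peak of the sine wave, which is √2 times the RMS value. Here is a minimal sketch in Python, assuming a 400 V RMS input and roughly two diode drops in the bridge; the voltage values are illustrative, not measurements from the original poster's circuit.

```python
import math

# Peak vs. RMS for a sinusoid: V_peak = sqrt(2) * V_rms (about 1.414x).
# The figures below are assumed example values, not the OP's measurements.

v_rms = 400.0                   # assumed RMS voltage at the rectifier input
v_peak = math.sqrt(2) * v_rms   # peak value the filter capacitor charges toward
v_dc = v_peak - 2 * 0.7         # subtract roughly two diode drops in the bridge

print(f"RMS in: {v_rms:.0f} V")
print(f"Peak:   {v_peak:.0f} V")
print(f"Approx. DC on the capacitor at light load: {v_dc:.0f} V")
```

With little load on the capacitor, the measured DC will sit near that peak figure rather than near the RMS value, which is why the output seemed to jump once the capacitor was added.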
 