Recently I was trying to build a 400 VDC power supply for charging capacitors. I used an HID lighting ballast transformer to step up the voltage, but it wasn't enough for what I needed. I then tried putting a capacitor in parallel with the transformer, just as you would in its intended application, and now I have more than enough voltage.

Can anyone tell me why? I suspect it has something to do with power factor correction. So my question is this: if the power factor is bad, will the voltage be lower than it would be with the power factor corrected? I've looked high and low, but there's no easy answer to this. If that's not why the parallel capacitor gives me more voltage, then what is the reason? Any insight would be greatly appreciated. Thanks!
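For what it's worth, one candidate explanation is LC resonance rather than power factor as such: a ballast is essentially a series choke, and a capacitor across its output forms a resonant circuit whose capacitor voltage can exceed the source voltage near the resonant frequency. Here's a rough numerical sketch of that effect; every component value below (inductance, winding resistance, capacitance, mains voltage) is an illustrative assumption, not a measurement of any real ballast.

```python
# Sketch: series choke L (with winding resistance R) feeding a capacitor C.
# Near the resonant frequency 1/(2*pi*sqrt(L*C)), the capacitor voltage
# can be several times the source voltage. Values are assumptions.
import math

V_IN = 120.0   # mains RMS voltage (assumed)
FREQ = 60.0    # mains frequency, Hz (assumed)
L = 0.8        # ballast inductance, henries (assumed)
R = 30.0       # ballast winding resistance, ohms (assumed)
C = 10e-6      # capacitor across the output, farads (assumed)

w = 2 * math.pi * FREQ
z_l = complex(R, w * L)          # series impedance of the choke
z_c = complex(0, -1 / (w * C))   # impedance of the capacitor

# Voltage divider: capacitor voltage relative to the source
v_cap = V_IN * abs(z_c / (z_l + z_c))
f_res = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"resonant frequency: {f_res:.1f} Hz")
print(f"capacitor voltage: {v_cap:.0f} V rms")
```

With these assumed values the resonant frequency lands near 56 Hz, so at 60 Hz mains the divider ratio is well above 1 and the capacitor voltage comes out several times the input. Whether that matches the actual circuit depends on the real ballast inductance and capacitor size.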