I am trying to get my head around how power converters regulate their output voltage under no-load conditions. My load current pulses between roughly full load (~200 mA) and essentially no load (0 A), at frequencies anywhere from 100 Hz to 100 kHz.
My question is simple: how exactly does a converter keep the output voltage constant if the output current, and therefore the output power, is zero? With P = VI, both P and I being zero does not force V to zero; I understand that. But by V = IR, if the current is zero then the voltage should also be zero, and yet we intend to regulate it to a fixed value.
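To make my confusion concrete, here is a small sketch (the 12 V setpoint and the load values are made-up placeholders, not my actual design) of V = IR solved for current rather than voltage. At no load the load resistance is effectively infinite, so the current goes to zero while the regulated voltage stays where it is:

```python
# V = I*R applies to the *load*: the regulator holds V fixed, and I
# follows from whatever R_load happens to be. "No load" means
# R_load -> infinity, which drives I -> 0 without touching V.

V_OUT = 12.0  # hypothetical regulated output voltage, volts

for r_load in [60.0, 600.0, 6e6, float("inf")]:
    i_load = V_OUT / r_load  # Ohm's law solved for current
    print(f"R_load = {r_load:>10} ohm -> I = {i_load * 1000:.3f} mA at V = {V_OUT} V")
```

So the equation is not contradicted at no load; it is just being read in the wrong direction (solving for V with I = 0, instead of solving for I with R infinite).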
I understand there will be some current drawn by the multipliers, and I have heard people say a bleeder resistor must be placed on the output to guarantee a minimum load current.
But my understanding is that bleeder resistors drain charge from the output capacitor. I do not see why that is useful, since we want to maintain the capacitor voltage, not drain it. If a bleeder is indeed the correct approach, how do I calculate its value so that the power dissipation stays at a reasonable level?
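If a bleeder is the answer, I imagine the sizing would go something like this (the voltage and minimum-current figures below are placeholder assumptions, not values from my converter):

```python
# Hypothetical bleeder sizing: choose a minimum load current to
# guarantee, then get the resistor value from Ohm's law and its
# continuous dissipation from P = V^2 / R.

V_OUT = 12.0   # regulated output voltage, volts (assumed)
I_MIN = 5e-3   # minimum load current to guarantee, amps (assumed)

r_bleeder = V_OUT / I_MIN            # R = V / I_min
p_bleeder = V_OUT ** 2 / r_bleeder   # dissipated all the time, even at full load

print(f"Bleeder: R = {r_bleeder:.0f} ohm, P = {p_bleeder * 1000:.0f} mW")
```

The trade-off seems to be that a smaller resistor gives a larger guaranteed minimum load but wastes more power continuously, so presumably one picks the largest R (smallest I_MIN) the regulator can tolerate.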
I assume the answer is quite simple, but Google searches turn up all sorts of material about burst-mode control and the like, which is more complicated than what I am after. I would like to understand the basic principle first. My topology is a buck current-fed push-pull converter, if that helps, but I am also interested in the flyback converter.