Question about source current from a basic DC power supply

Thread Starter

hunterage2000

Joined May 2, 2010
487
If I built a basic DC power supply with:

230VAC input to a 12VAC output transformer rated at 20VA
12VAC input to a FWB rated at 50V 4A
12Vrms filtered with a 100uF Cap

How do you work out the max current sourced before the output voltage starts to drop and how can this current be limited to a smaller or larger amount?
 

pwdixon

Joined Oct 11, 2012
488
The output will drop at all currents, the amount of droop (ripple) just depends on the load and the capacitor size.
 

Thread Starter

hunterage2000

Joined May 2, 2010
487
So if I put a voltage regulator at the end, the voltage will be constant up to the rated source current from the regulator?
 

crutschow

Joined Mar 14, 2008
34,452
The RMS current rating of the transformer is 20VA/12V = 1.66A. You should derate that by about 50% for a diode-capacitor filter, so the maximum DC output current would be about 0.8A.

The voltage output will start to drop as soon as you draw current due to the transformer winding resistances. The rated voltage will occur at the rated current.
For a full-wave rectifier the output ripple is approximately Vripple ≈ Iload / (2·f·C), where f is the mains frequency (50 Hz here).

For a 100 μF filter capacitor, the ripple would be 1V for a 10mA load, so you need a much larger capacitor to get a reasonable ripple value for a 0.8A load.
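
As a rough check of those numbers (a sketch, assuming 50 Hz mains and the ~50% derating rule above; all values illustrative):

```python
# Rough sizing check for the 12 VAC / 20 VA supply (assumes 50 Hz mains).
VA, VRMS, F = 20.0, 12.0, 50.0
DERATE = 0.5                        # rule-of-thumb derating for a bridge + reservoir cap

i_rms_rating = VA / VRMS            # transformer RMS current rating, ~1.66 A
i_dc_max = DERATE * i_rms_rating    # usable DC output current, ~0.8 A

def ripple_pp(i_load, cap):
    """Peak-to-peak ripple of a full-wave rectifier: Vripple = Iload / (2*f*C)."""
    return i_load / (2 * F * cap)

print(f"RMS rating              : {i_rms_rating:.2f} A")
print(f"usable DC current       : {i_dc_max:.2f} A")
print(f"ripple, 100 uF at 10 mA : {ripple_pp(0.010, 100e-6):.2f} V")
print(f"cap for ~1 V ripple at 0.8 A : {0.8 / (2 * F * 1.0) * 1e6:.0f} uF")
```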

The current used by the supply is determined by the load connected to the supply. The only current limit you would need is if you want to protect against a short circuit condition.

Edit: A regulator's output voltage will be constant provided its input voltage (minimum voltage including ripple) stays above the output voltage plus the minimum voltage drop (dropout) of the regulator.
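
As a sketch of that headroom check (assumed, illustrative values: 12 Vrms secondary, two ~0.7 V bridge-diode drops, a 5 V regulator with about 2 V dropout, and roughly 1.7 V of ripple at full load):

```python
import math

# Regulator headroom check with assumed, illustrative values.
VRMS = 12.0          # secondary RMS voltage
V_DIODE = 0.7        # drop per diode; two conduct at a time in a bridge
V_RIPPLE = 1.7       # expected peak-to-peak ripple at full load
V_OUT = 5.0          # desired regulated output
V_DROPOUT = 2.0      # minimum regulator input-output drop (from its datasheet)

v_peak = VRMS * math.sqrt(2) - 2 * V_DIODE   # top of the ripple waveform
v_min = v_peak - V_RIPPLE                    # bottom of the ripple waveform

print(f"minimum voltage at regulator input: {v_min:.1f} V")
print(f"needed (Vout + dropout)           : {V_OUT + V_DROPOUT:.1f} V")
print("regulation holds" if v_min >= V_OUT + V_DROPOUT else "not enough headroom")
```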
 

bertus

Joined Apr 5, 2008
22,277
Hello,

Yes and no.
When the capacitor is too small, the ripple can be too large for the regulator to smooth out, as the regulator needs a certain voltage drop across it.

Bertus
 

AnalogKid

Joined Aug 1, 2013
11,045
It doesn't. The diode-cap filter is what you describe in your first post, a FWB and 100 uF. Each half cycle, the peak AC coming from the transformer charges up the capacitor. Continuously, the load draws current out of the capacitor. So there is what is called ripple voltage on the capacitor as it goes from being topped off at the line voltage peaks to being partially discharged by the load between those peaks. Thus, the ripple voltage looks like a sawtooth wave (exponential charge-up, exponential discharge) sitting on top of the average DC voltage at the capacitor. The larger the capacitor, the lower the ripple voltage.

Separate from that is the relationship between the transformer ratings and the load value. You can pull the full 1.66 A from the transformer, but the ripple voltage will be large, the average voltage across the cap will be low, and the negative peaks of the ripple voltage riding on the average DC value will be very low. However, if you need only 3.3V DC, a linear or switching regulator will effectively "clip off" the ripple, giving you clean DC.

So yes, the ripple increases and the output voltage decreases the instant there is any load on the capacitor. But if you know what output you need, you can design a circuit to get there.
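
A small time-step sketch of that top-up/discharge behaviour (assumed, illustrative values: 12 Vrms, 50 Hz, 100 uF, roughly a 15 mA resistive load, ideal diodes that recharge the cap instantly at the peaks):

```python
import numpy as np

# Idealized FWB + reservoir cap: the cap is topped up at the rectified peaks
# and discharged by the load in between, giving the sawtooth ripple described above.
VRMS, F, C, RLOAD = 12.0, 50.0, 100e-6, 1000.0   # 1 kohm load -> roughly 15 mA
VDIODE = 0.7                                     # two diodes conduct in a bridge

dt = 1e-6
t = np.arange(0.0, 0.1, dt)                      # simulate 100 ms
vrect = np.abs(VRMS * np.sqrt(2) * np.sin(2 * np.pi * F * t)) - 2 * VDIODE

vcap = np.zeros_like(t)
for k in range(1, len(t)):
    v = vcap[k - 1]
    v -= (v / RLOAD) * dt / C        # load discharges the cap between peaks
    vcap[k] = max(v, vrect[k])       # diodes top the cap up (charging idealized as instant)

steady = vcap[len(t) // 2:]          # skip the start-up transient
print(f"average DC voltage : {steady.mean():.2f} V")
print(f"peak-peak ripple   : {steady.max() - steady.min():.2f} V")
```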

ak
 

crutschow

Joined Mar 14, 2008
34,452
Separate from that is the relationship between the transformer ratings and the load value. You can pull the full 1.66 A from the transformer ...
Not without likely burning out the transformer in this circuit.
The 1.66 A rating is for a sine output current. A diode-capacitor circuit pulls large peak currents from the transformer, which significantly increases transformer heating through I^2R losses in the winding resistance.
For typical ripple values, the average DC current from the full-wave diode-capacitor circuit should generally be no more than 50-60% of the transformer's rating to avoid overheating the transformer.
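
A sketch of that effect under stated assumptions (about 1 ohm of total winding/source resistance, a 4700 uF reservoir cap, a load near 0.8 A; all values illustrative): the current only flows in short peaks near the top of each half cycle, so the winding RMS current works out well above the DC output current.

```python
import numpy as np

# Estimate winding RMS current vs DC output current for a bridge + reservoir cap.
VRMS, F = 12.0, 50.0
RS = 1.0             # assumed total winding/source resistance (ohms)
C = 4700e-6          # reservoir cap sized for modest ripple at ~0.8 A
RLOAD = 18.0         # draws roughly 0.8 A from ~14 V DC
VDIODE = 0.7         # two diodes conduct at a time in the bridge

dt = 1e-6
t = np.arange(0.0, 0.2, dt)
vrect = np.abs(VRMS * np.sqrt(2) * np.sin(2 * np.pi * F * t)) - 2 * VDIODE

vcap = np.zeros_like(t)
i_sec = np.zeros_like(t)                              # current drawn through the winding
for k in range(1, len(t)):
    i_chg = max((vrect[k] - vcap[k - 1]) / RS, 0.0)   # diodes only conduct forward
    i_load = vcap[k - 1] / RLOAD
    vcap[k] = vcap[k - 1] + (i_chg - i_load) * dt / C
    i_sec[k] = i_chg

steady = t > 0.1                                      # ignore the start-up transient
i_dc = (vcap[steady] / RLOAD).mean()
i_rms = np.sqrt((i_sec[steady] ** 2).mean())
print(f"DC load current     : {i_dc:.2f} A")
print(f"winding RMS current : {i_rms:.2f} A ({i_rms / i_dc:.1f}x the DC current)")
```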
 