Question about source current from a basic DC power supply

Discussion in 'General Electronics Chat' started by hunterage2000, May 6, 2015.

  1. hunterage2000

    Thread Starter Active Member

    May 2, 2010
    400
    0
    If I built a basic DC power supply with:

    230VAC input to a 12VAC output transformer rated at 20VA
    12VAC input to a FWB rated at 50V 4A
    12Vrms filtered with a 100uF Cap

    How do you work out the maximum current that can be sourced before the output voltage starts to drop, and how can that current be limited to a smaller or larger amount?
     
  2. pwdixon

    Member

    Oct 11, 2012
    488
    56
    The output will drop at all currents; the amount of droop (ripple) just depends on the load and the capacitor size.
     
  3. hunterage2000

    Thread Starter Active Member

    May 2, 2010
    400
    0
    So if I put a voltage regulator at the end, the voltage will be constant up to the rated source current from the regulator?
     
  4. crutschow

    Expert

    Mar 14, 2008
    12,977
    3,220
    The RMS current rating of the transformer is 20VA/12V = 1.66A. You should derate that by about 50% for a diode-capacitor filter, so the maximum DC output current would be about 0.8A.

    The output voltage will start to drop as soon as you draw current, due to the transformer winding resistances. The rated voltage occurs at the rated current.
    For a full-wave rectifier the output ripple is approximately
    V_ripple ≈ I_load / (2·f·C),
    where f is the mains frequency (50 Hz here).
    For a 100 μF filter capacitor, the ripple would be 1V for a 10mA load, so you need a much larger capacitor to keep the ripple to a reasonable value for a 0.8A load.
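
    As a rough sketch of those numbers (assuming 50 Hz mains, as implied by the 230VAC input, and the approximate ripple formula above):

    ```python
    # Rough sizing for the supply described in the first post.
    # Assumptions: 50 Hz mains, full-wave ripple approximated as I / (2*f*C).

    VA_RATING = 20.0      # transformer rating, VA
    V_SECONDARY = 12.0    # secondary RMS voltage, V
    F_MAINS = 50.0        # mains frequency, Hz

    i_rms_rating = VA_RATING / V_SECONDARY    # ~1.67 A RMS winding rating
    i_dc_max = 0.5 * i_rms_rating             # ~0.8 A after the 50% derating

    def ripple(i_load, cap):
        """Approximate peak-to-peak ripple of a full-wave rectifier."""
        return i_load / (2 * F_MAINS * cap)

    def cap_for_ripple(i_load, v_ripple):
        """Capacitance needed to hold the ripple to v_ripple at i_load."""
        return i_load / (2 * F_MAINS * v_ripple)

    print(f"Transformer RMS rating:     {i_rms_rating:.2f} A")
    print(f"Derated DC output:          {i_dc_max:.2f} A")
    print(f"Ripple, 100 uF at 10 mA:    {ripple(0.010, 100e-6):.1f} V")            # ~1 V
    print(f"C for 1 V ripple at 0.8 A:  {cap_for_ripple(0.8, 1.0) * 1e6:.0f} uF")  # ~8000 uF
    ```

    So at the derated 0.8A you are looking at something in the several-thousand-microfarad range rather than 100 uF.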

    The current used by the supply is determined by the load connected to the supply. The only current limit you would need is if you want to protect against a short circuit condition.

    Edit: A regulator's output voltage will be constant provided its input voltage (minimum voltage including ripple) stays above the output voltage plus the minimum voltage drop of the regulator.
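
    As a quick check of that condition for this supply (a sketch with assumed values: a 12 V regulator with roughly 2 V dropout, about 0.7 V per conducting bridge diode, and 1 V of ripple; none of these figures are given in the thread):

    ```python
    import math

    # Headroom check for a linear regulator fed from this supply.
    # Assumed values: 12 V output, ~2 V dropout, ~0.7 V per bridge diode, 1 V ripple.

    V_SECONDARY_RMS = 12.0
    V_DIODE = 0.7            # two bridge diodes conduct at any instant
    V_RIPPLE = 1.0           # assumed peak-to-peak ripple at the load current
    V_OUT = 12.0             # assumed regulator output
    V_DROPOUT = 2.0          # assumed minimum regulator drop

    v_peak = V_SECONDARY_RMS * math.sqrt(2) - 2 * V_DIODE   # ~15.6 V on the capacitor
    v_min = v_peak - V_RIPPLE                                # valley of the ripple

    if v_min >= V_OUT + V_DROPOUT:
        print(f"OK: {v_min:.1f} V minimum input >= {V_OUT + V_DROPOUT:.1f} V required")
    else:
        print(f"Not enough headroom: {v_min:.1f} V < {V_OUT + V_DROPOUT:.1f} V required")
    ```

    With those assumed figures the margin is under a volt before allowing for winding resistance and low mains, which is why a 12VAC transformer is more comfortable feeding a lower output voltage.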
     
  5. bertus

    Administrator

    Apr 5, 2008
    15,638
    2,343
    Hello,

    Yes and no.
    When the capacitor is too small, the ripple can be too large for the regulator to smooth out, as the regulator needs a certain voltage drop across it.

    Bertus
     
  6. hunterage2000

    Thread Starter Active Member

    May 2, 2010
    400
    0
    What is a diode-cap filter and how does it limit the current to 50%?
     
  7. AnalogKid

    Distinguished Member

    Aug 1, 2013
    4,515
    1,246
    It doesn't. The diode-cap filter is what you describe in your first post, an FWB and 100 uF. Each half-cycle, the peak AC coming from the transformer charges up the capacitor. Continuously, the load draws current out of the capacitor. So there is what is called ripple voltage on the capacitor as it goes from being topped off at the line voltage peaks to being partially discharged by the load between those peaks. Thus, the ripple voltage looks like a sawtooth wave (exponential charge up, exponential discharge) sitting on top of the average DC voltage at the capacitor. The larger the capacitor, the lower the ripple voltage.
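
    A crude numerical sketch of that charge/discharge behaviour, with assumed values (50 Hz mains, ideal diodes, a resistive load drawing roughly 10 mA):

    ```python
    import math

    # Crude simulation of a full-wave rectifier with a filter capacitor and a
    # resistive load, to show the sawtooth ripple described above.
    # Assumptions: 50 Hz mains, ideal diodes, 100 uF, ~1.2 kohm load.

    F = 50.0                        # mains frequency, Hz
    V_PEAK = 12.0 * math.sqrt(2)    # peak of the 12 Vrms secondary
    C = 100e-6                      # filter capacitor, F
    R_LOAD = 1200.0                 # resistive load (roughly 10 mA), ohms
    DT = 1e-5                       # time step, s

    v_cap = 0.0
    samples = []
    t = 0.0
    while t < 0.1:                  # simulate 100 ms (five mains cycles)
        v_rect = abs(V_PEAK * math.sin(2 * math.pi * F * t))  # full-wave rectified input
        if v_rect > v_cap:
            v_cap = v_rect                        # diode conducts: cap follows the peak
        else:
            v_cap -= (v_cap / R_LOAD) * DT / C    # diode off: cap discharges into load
        samples.append(v_cap)
        t += DT

    settled = samples[5000:]        # ignore the first 50 ms of start-up
    print(f"Max: {max(settled):.2f} V, min: {min(settled):.2f} V")
    print(f"Peak-to-peak ripple: {max(settled) - min(settled):.2f} V")
    ```

    The peak-to-peak value it prints roughly matches the I/(2·f·C) estimate given earlier in the thread.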

    Separate from that is the relationship between the transformer ratings and the load value. You can pull the full 1.66 A from the transformer, but the ripple voltage will be large, the average voltage across the cap will be low, and the negative peaks of the ripple voltage riding on the average DC value will be very low. However, if you need only 3.3V DC, a linear or switching regulator will effectively "clip off" the ripple, giving you clean DC.

    So yes, the ripple increases and the output voltage decreases the instant there is any load on the capacitor. But if you know what output you need, you can design a circuit to get there.

    ak
     
  8. crutschow

    Expert

    Mar 14, 2008
    12,977
    3,220
    Not without likely burning out the transformer in this circuit.
    The 1.66 A rating is for a sine output current. A diode-capacitor circuit pulls large peak currents from the transformer, which significantly increases the transformer heating from the I^2R losses in the winding resistance.
    For typical ripple values, the average DC current from the full-wave diode-capacitor circuit should generally be no more than 50-60% of the transformer's rating to avoid overheating the transformer.
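
    A rough way to see where that figure comes from (a sketch; the 1.8 ratio of RMS to DC current is a typical rule of thumb for capacitor-input filters, assumed here rather than taken from the thread):

    ```python
    # A capacitor-input filter draws narrow, high-peak charging pulses, so the
    # RMS winding current is well above the DC load current. A form factor of
    # ~1.8 (I_rms / I_dc) is a common rule-of-thumb assumption.

    I_RMS_RATING = 20.0 / 12.0     # transformer winding rating, ~1.67 A RMS
    FORM_FACTOR = 1.8              # assumed I_rms / I_dc for a capacitor-input filter

    i_dc_max = I_RMS_RATING / FORM_FACTOR
    print(f"Max DC load for rated winding heating: {i_dc_max:.2f} A "
          f"({100 * i_dc_max / I_RMS_RATING:.0f}% of the RMS rating)")
    ```

    That works out to roughly 0.9 A, i.e. a bit over half the 1.66 A rating, consistent with the 50-60% guideline.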
     
    Last edited: May 6, 2015
  9. hunterage2000

    Thread Starter Active Member

    May 2, 2010
    400
    0
    So say I wanted to limit the current from 1.66A to 1A, would I put a current-limiting resistor just after the FWB?
     
  10. crutschow

    Expert

    Mar 14, 2008
    12,977
    3,220
    You normally don't need to limit the current as that is determined by the load.
    Why do you think you need to limit the current?
     
  11. pwdixon

    Member

    Oct 11, 2012
    488
    56
    Or you could add a fuse.
     
  12. hunterage2000

    Thread Starter Active Member

    May 2, 2010
    400
    0
    This would be for varying the current to suit the load.
     
  13. dl324

    Distinguished Member

    Mar 30, 2015
    3,211
    619
    It would be best to add current limiting to the regulator if it doesn't already have it.
     
  14. crutschow

    Expert

    Mar 14, 2008
    12,977
    3,220
    Normally a load takes only the current it requires as determined by the voltage.
    What type of load are you referring to?
     