If I built a basic DC power supply with:
230 VAC to 12 VAC transformer rated at 20 VA
12 VAC secondary into a full-wave bridge (FWB) rated 50 V / 4 A
Rectified output filtered with a 100 µF capacitor
How do you work out the maximum current that can be drawn before the output voltage starts to sag, and how can that limit be made smaller or larger?
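For reference, the arithmetic involved can be sketched as below. This is a rough estimate only: the 50 Hz line frequency, 0.7 V per-diode drop, and the I_dc ≈ I_rms / 1.8 derating factor for a capacitor-input filter are all assumptions, not values from the question, and the ripple formula is only a first-order approximation that holds while the ripple is small compared to the peak voltage.

```python
import math

# Values from the question
V_RMS = 12.0      # transformer secondary voltage, V rms
VA = 20.0         # transformer power rating, volt-amps
C = 100e-6        # filter capacitor, farads

# Assumed values (not in the question)
F_LINE = 50.0     # mains frequency in Hz, assumed 50 Hz for a 230 V supply
V_DIODE = 0.7     # per-diode forward drop; two diodes conduct at once in a FWB

# Peak DC voltage after the bridge (top of the ripple waveform)
v_peak = V_RMS * math.sqrt(2) - 2 * V_DIODE

# Transformer limit: maximum rms secondary current
i_sec_max = VA / V_RMS

# With a capacitor-input filter the rectifier draws short, high current
# pulses, so the usable DC load current is lower than the rms rating.
# I_dc ~= I_rms / 1.8 is a common rule of thumb (assumption).
i_dc_max = i_sec_max / 1.8

# Peak-to-peak ripple for a full-wave rectifier: dV ~= I / (2 * f * C),
# valid only while dV is small compared to v_peak.
def ripple(i_load):
    return i_load / (2 * F_LINE * C)

print(f"Peak DC voltage: {v_peak:.1f} V")
print(f"Transformer rms current limit: {i_sec_max:.2f} A")
print(f"Usable DC current (rule of thumb): {i_dc_max:.2f} A")
print(f"Ripple at 0.1 A load: {ripple(0.1):.1f} V p-p")
```

Note that even at only 0.1 A the estimated ripple comes out around 10 V peak-to-peak, larger than the approximation can really support, which suggests 100 µF is far too small for any substantial load here.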