So here's the deal: I built a power supply with a 220 V to 24 V step-down transformer (60 Hz, rated 1 A), a full-bridge rectifier made of 1N4001 diodes, a 1000 µF / 50 V filter capacitor, and an LM338T adjustable voltage regulator. I need it to output 2 V to 30 V. It works...
My professor taught us that the filter capacitor's output is given by these formulas:

$$V_{rms} = \frac{I}{4\sqrt{3}\, f\, C}$$

$$V_{DC} = V_{rect} - \frac{I}{4\, f\, C}$$

where $V_{rect}$ is the peak rectified voltage.
As far as I know, the voltage regulator needs a higher input voltage than what it puts out.
But calculating the capacitor's DC output voltage (assuming each diode drops 0.7 V):
$$V_{DC} = \left(24\sqrt{2}\,\text{V} - 0.7\,\text{V} - 0.7\,\text{V}\right) - \frac{1\,\text{A}}{4 \cdot 60\,\text{Hz} \cdot 1000\,\mu\text{F}}$$
I get 28.37 V for the regulator's input (the cap's output), which doesn't make sense when I can easily get 32 V at its output.
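To double-check my arithmetic, here's a quick Python sketch of the same calculation (the 1 A load and 0.7 V per diode are the assumptions from my build above):

```python
from math import sqrt

# Values from my build; the load current and diode drop are assumptions
V_SECONDARY_RMS = 24.0   # transformer secondary voltage, V rms
V_DIODE = 0.7            # drop per 1N4001; two diodes conduct at a time in a bridge
I_LOAD = 1.0             # assumed worst-case load current, A
F_LINE = 60.0            # mains frequency, Hz
C_FILTER = 1000e-6       # filter capacitor, F

# Peak voltage after the bridge rectifier
v_peak = V_SECONDARY_RMS * sqrt(2) - 2 * V_DIODE

# Average DC on the cap for a full-wave rectifier: V_DC = V_peak - I / (4 f C)
v_dc = v_peak - I_LOAD / (4 * F_LINE * C_FILTER)

# RMS ripple, per the formula from class: I / (4 sqrt(3) f C)
v_ripple_rms = I_LOAD / (4 * sqrt(3) * F_LINE * C_FILTER)

print(f"peak after bridge : {v_peak:.2f} V")        # ~32.54 V
print(f"average DC on cap : {v_dc:.2f} V")          # ~28.37 V
print(f"ripple (rms)      : {v_ripple_rms:.2f} V")  # ~2.41 V
```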
Am I doing something wrong here?