I'm looking at using a PicoUPS with my network equipment to back it up with a sealed lead-acid (SLA) battery; right now I have a 12V/16A supply that is adjustable but very tightly regulated. The hitch with the PicoUPS is that its input voltage needs to be at least the battery float voltage (~13.2V) to charge the battery properly, and its output is unregulated... I can adjust the 12V supply to put out 13.5V, but then my output is anywhere from ~11.5V (low battery) to 13.5V (running from the regulated supply).
My devices should be fine with 11.5-12.5V, but 13.5V is more than 10% over their specified input voltage, and I'd really rather not go over 13V (ideally within 5%, i.e. 12.6V). My first thought was to throw 2x 10A10 diodes in series on the output for a 1.4V drop at 10A capacity, but the diode drop is essentially constant regardless of input, so when the circuit switches to battery, 13.2V at the battery instantly becomes 11.8V at the output, and as the battery sags to/below 12V, the output ends up south of 10.6V.
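To make the problem concrete, here's a quick sketch of the numbers (assuming a flat ~0.7V forward drop per 10A10; the real Vf varies a bit with current and temperature, but the point is that it never scales with the input voltage):

```python
# Output-voltage budget with two series diodes on the UPS output.
# Assumes a fixed ~0.7 V forward drop per 10A10 at load current.

DIODE_DROP = 0.7   # approximate Vf of one 10A10 (assumed, not measured)
N_DIODES = 2       # two diodes in series on the output

def output_voltage(source_v: float) -> float:
    """Voltage the equipment sees after the series diode string."""
    return source_v - N_DIODES * DIODE_DROP

for label, v in [("PSU at 13.5 V", 13.5),
                 ("battery at float, 13.2 V", 13.2),
                 ("battery nominal, 12.0 V", 12.0),
                 ("battery low, ~11.5 V", 11.5)]:
    print(f"{label:28s} -> {output_voltage(v):4.1f} V out")

# PSU at 13.5 V                -> 12.1 V out
# battery at float, 13.2 V     -> 11.8 V out
# battery nominal, 12.0 V      -> 10.6 V out
# battery low, ~11.5 V         -> 10.1 V out
```

So the diodes fix the PSU case nicely, but they make the battery case worse: the drop I only need at 13.5V in is still there at 11.5V in.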
What would be the best way to work around this? Is there some sort of regulator that can output close to its full input voltage (so it drops ~1.5V when running from the PSU, but the drop shrinks on battery so the output stays constant as the battery voltage falls)? Or is there some way I could switch the output path through either a regulator or a direct connection, depending on whether there is input voltage from the PSU?
P.S.: The 16A supply is already in use powering all the equipment. I suppose I could use an off-the-shelf UPS to power that, but this strikes me as inefficient (the UPS downconverts to 12VDC -- or maybe 24VDC -- to charge its battery, then on battery power it upconverts to 120VAC only for my supply to immediately downconvert it back to 12VDC). Plus, if I build it myself, I have only myself to blame if something goes wrong, and if something breaks I'm in a better position to fix it.