DC-DC up converter dilemma

Discussion in 'Power Electronics' started by longpole001, Aug 22, 2018.

  1. longpole001

    Thread Starter Member

    Sep 16, 2014
    59
    4
    Hi guys,
    Recently I have needed to design in a DC-DC converter (1 V - 5 V input, 5 V output),
    mainly because the battery has a usable range of 4.2 V down to 2.5 V (3400 mAh Li-ion),
    which will supply a 3.3 V LDO regulator (LDO minimum input = 3.5 V).
    The system also uses USB power as a second input supply to the LDO.
    The average current from the LDO is 130 mA at 3.3 V, with peaks of 500 mA for <100 ms, so a 470 uF cap on the output would help.
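    A rough check on that cap (my own back-of-envelope, assuming the cap alone had to carry the excess): droop is dV = I x t / C, so 470 uF supplying the extra 370 mA (500 mA peak minus 130 mA average) sags about 0.37 x 0.001 / 0.00047 ≈ 0.8 V in just 1 ms. So the cap only takes the edge off the start of a peak; the supply still has to carry most of the 100 ms burst.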

    The problem, as always, is that any DC-DC up converter is very current-inefficient when the input voltage is 1-2 volts lower than the 5 V output, as in my case (see the example table from the manufacturer, attached).
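    Putting rough numbers on it (ideal-converter math, efficiency figure assumed): a boost holding 5 V out needs Iin = (Vout x Iout) / (Vin x eff). At Vin = 2.5 V and an assumed 85% efficiency, that is 5 / (2.5 x 0.85) ≈ 2.35 times the output current drawn from the battery.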

    It seems to me that it would be a much better power design to keep the DC converter disabled until the input voltage falls as near as possible to the LDO's input shutdown level, at which point the DC converter is enabled to boost to 5 V.

    Batteries by their nature are awkward to monitor, since the terminal voltage sags as the load current changes, so a bigger margin for when to switch over to the DC converter must be considered.

    I am contemplating an ADC monitor on the battery with a CPU controlling the cutover, where once activated it won't go back to the battery alone.
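    Something like this is what I have in mind for the monitor (a generic C sketch only; the ADC read and enable-pin functions are made-up placeholders for whatever the PIC provides, and the thresholds are my guesses):

        #include <stdint.h>
        #include <stdbool.h>

        #define VBAT_CUTOVER_MV  3600u  /* switch to boost below this: margin above the 3.5 V LDO minimum */
        #define SAG_SAMPLES      16u    /* average several reads to ride out load-transient sag */

        static bool boost_latched = false;       /* once on the boost, never back to battery alone */

        extern uint16_t adc_read_vbat_mv(void);  /* placeholder: battery voltage in mV from the ADC */
        extern void boost_enable(bool on);       /* placeholder: drives the converter's EN pin */

        void battery_monitor_poll(void)
        {
            if (boost_latched)
                return;                          /* latched: stay on the boost converter */

            uint32_t sum = 0;
            for (uint8_t i = 0; i < SAG_SAMPLES; i++)
                sum += adc_read_vbat_mv();       /* averaging filters out brief sag from current peaks */

            if ((sum / SAG_SAMPLES) < VBAT_CUTOVER_MV) {
                boost_enable(true);              /* battery genuinely low: boost to 5 V from here on */
                boost_latched = true;
            }
        }

    The averaging plus the one-way latch is meant to cover the sag problem: a brief 500 mA peak pulls the average down far less than a genuinely flat battery does.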


    Has anyone done this approach and found it to be effective, or is it just extra work for little gain? I'd welcome views from those who have designed these before.


    Cheers

    Sheldon

    dc dc converter.JPG
     
  2. longpole001

    Thread Starter Member

    Sep 16, 2014
    59
    4
    After some research: there are a few circuits that combine voltage monitoring to switch between an LDO and an up converter using a PIC.

    Mainly because I think that with a fluctuating current load on the battery, chances are your voltage monitor will tell the up converter to cut in early,
    but there is significant battery current saved by not allowing the DC converter to cut in until it is required.
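    To size that margin (my assumed figure for cell resistance): if the cell's internal resistance is around 100 milliohms, the 500 mA peaks only sag the terminal voltage by about 0.5 x 0.1 = 50 mV, so a cutover threshold sitting ~100 mV above the LDO dropout should ride through the peaks without tripping early.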

    Some DC-DC converters have a fixed/adjustable output of, say, 3.3 V, but allow the input to be higher while still ensuring the output stays at 3.3 V. But again, they draw current at a 1.3-1.8x higher level for the given output voltage when the input drops to 1 V below the required output.
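    That ratio is consistent with the ideal math: Iin / Iout = Vout / (Vin x eff), so for 3.3 V out from 2.3 V in at an assumed 85% efficiency it is 3.3 / (2.3 x 0.85) ≈ 1.7, right at the top of that 1.3-1.8 range.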

    The same DC-DC converters appear to be very poor LDO alternatives when the input voltage is higher than the output at low voltages,
    but they are simpler to implement than a voltage monitor and cutover arrangement, from what I am seeing.

    Any other views on this would be welcome.

    cheers
     
  3. ebp

    Well-Known Member

    Feb 8, 2018
    1,622
    553
    I have a great deal of difficulty understanding what you have written.

    Switch mode converters are power converters. If you want 3.3 volts out and have 1 volt in, the input current will be 3.3 times the output current if the converter is 100% efficient. If the input voltage is 2 volts, the input current will be 1.65 times the output current, again for 100% efficiency.
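    In general, Iin = (Vout x Iout) / (Vin x efficiency). A real converter at, say, 85% efficiency just scales those figures up: 3.3 volts out from 1 volt in then draws about 3.9 times the output current rather than 3.3.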

    The "SEPIC" (single-ended primary inductance converter) can produce an output voltage that is higher or lower than the input voltage. It seems to be the most popular topology for low power circuits running from batteries. But like any other, it can't make power, just convert it. There are integrated switchers (controller & required power switch, some with on-chip diode, some not) that are very efficient at low power. If you require an output voltage that is only slightly less than the input voltage, a linear regulator can be as efficient, but such a requirement is rare. A linear reg can be more efficient than a badly designed or inappropriately-chosen switcher, even when the input-output differential is moderate if the current required is low.
     