Supercapacitor discharge question

Discussion in 'General Electronics Chat' started by sunjan18, Jan 18, 2015.

  1. sunjan18

    Thread Starter New Member

    Jan 18, 2015
    7
    0
    I'm looking at a supercapacitor, but I'm not sure what the discharge-characteristic graphs actually mean. My goal is to charge this capacitor and then discharge its energy into a battery.

    The specs are:
    Capacitor: 4.2V (5.5V peak), 1F
    Battery I'm looking to charge: 4V, 1500mAh

    There are two graphs in the attached image that I have a hard time understanding. The first is the constant-current discharge at 25°C from a 5.5V charge: the 1A line falls from 5.5V to 0 in 4.5 seconds, with additional lines for 2A and 4A. The second is the constant-power discharge at 25°C from a 5.5V charge, with lines for 1W, 5W, and 10W; on the 10W line, after 1000 ms the voltage drops to close to 2V.

    How would you discharge this capacitor at 1.5A and 4V to charge a battery? Thanks for reading; I'd appreciate any help.

    Cheers
     
  2. crutschow

    Expert

    Mar 14, 2008
    13,009
    3,233
    Why do you want to use a capacitor to charge a battery? That seems rather inefficient.

    To keep the output above the battery voltage as the capacitor discharges, you could use a boost switching regulator, but that likely wouldn't have an efficiency much above 75% at those voltages.
     
  3. sunjan18

    Thread Starter New Member

    Jan 18, 2015
    7
    0
    I think this capacitor doesn't fit my needs, so I'll use another, higher-capacitance one.

    The question remains: how would I discharge the capacitor at a given current in order to charge the battery? How would I go about doing this?
     
  4. sunjan18

    Thread Starter New Member

    Jan 18, 2015
    7
    0
    For example, if I used two 2.7V 3000F capacitors in series to get 5.4V @ 1500F, would I use a step-up DC-DC boost converter or a voltage regulator to charge my 4V 1500mAh battery? And as the battery charges, wouldn't the capacitor's voltage drop to the point where the converter's required input voltage is no longer reached, so it could no longer provide the desired 4V output?
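    For reference, the series figures quoted here can be checked with a short sketch (identical caps in series: working voltages add, capacitance combines as the reciprocal sum):

```python
# Identical capacitors in series: voltage ratings add, and capacitance
# combines as the reciprocal sum (like resistors in parallel).
def series_capacitance(*caps_f):
    return 1.0 / sum(1.0 / c for c in caps_f)

c_total = series_capacitance(3000.0, 3000.0)  # 1500.0 F
v_total = 2.7 + 2.7                           # 5.4 V
print(c_total, v_total)
```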
     
  5. ronv

    AAC Fanatic!

    Nov 12, 2008
    3,290
    1,255
    What we are trying to understand is why you want to use a capacitor instead of just a bigger battery, or another one. Batteries store much more energy for their size and cost than capacitors do.
     
  6. sunjan18

    Thread Starter New Member

    Jan 18, 2015
    7
    0
    I shall shed more light and details once I can figure out how capacitors actually work. I purchased a step-down regulator to take a 12V, 1A supply down to 5.4V-6V as a way to charge the capacitors. For example, if the capacitors are 5.4V at 1500F, what would be the ideal charging voltage? And how would I calculate how fast they would charge until they're full?
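    One way to ballpark the charge time: if the supply limits the current to roughly 1A, a constant-current estimate t = C·ΔV/I applies. This is a sketch under that assumption only; a real step-down regulator won't hold a constant current all the way up.

```python
# Constant-current charge-time estimate for a capacitor: t = C * dV / I.
def charge_time_s(capacitance_f, delta_v, current_a):
    return capacitance_f * delta_v / current_a

# Values quoted in the post: a 1500 F bank charged from 0 V to 5.4 V
# by a supply limited to about 1 A.
t = charge_time_s(1500.0, 5.4, 1.0)
print(t)           # ~8100 seconds
print(t / 3600.0)  # ~2.25 hours
```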
     
  7. ronv

    AAC Fanatic!

    Nov 12, 2008
    3,290
    1,255
  8. wayneh

    Expert

    Sep 9, 2010
    12,120
    3,046
    Unlike a battery, the voltage across a capacitor is proportional to its level of charge: the voltage drops as it discharges. A battery's voltage is much more nearly constant, sagging just a little as it discharges.

    For a capacitor to drive a charging current through a battery - which makes little sense, as others have noted - the voltage of the capacitor must exceed the voltage of the battery. Current will flow in proportion to the voltage difference and will stop when the voltages are equal. The initial current could be more than the battery can tolerate.

    You could add fancy boost circuitry to allow more energy to be removed from the capacitor and forced into the battery.

    If you have to limit current at the outset and then boost voltage as the cap discharges, you are in the realm of boost converters and battery-charge controllers. These things are not cheap or simple. That's why folks are asking what you are hoping to accomplish - often there are workarounds.
     
    ronv likes this.
  9. sunjan18

    Thread Starter New Member

    Jan 18, 2015
    7
    0
    This is an experimental project, simply to learn about the energy-transfer process - specifically the discharge phase - from a capacitor to a device or battery. This is the capacitor I am looking at, the BCAP3000: http://www.mouser.com/ds/2/257/Maxwell_K2Series_DS_1015370-4-341196.pdf. If I am charging a 4V 1500mAh Li-ion battery with two of these units, how do I determine the current it will discharge at, and the time it takes for the voltage to drop from 5.4V to below a usable charging voltage?
     
  10. wayneh

    Expert

    Sep 9, 2010
    12,120
    3,046
    At 3000 farads, that capacitor will deliver 3000 coulombs per volt (F = coulombs/volt). You are dropping 1.4V, so 4,200 coulombs will transfer to the battery if it sits at its nominal voltage. If that were to happen in, say, 10 seconds, the average current would be 4,200/10 = 420A! (Amps = coulombs/sec = dQ/dt = C·dV/dt, where C is capacitance in farads.)

    Of course the average is not so important. The current will start at a very high level, limited by the combined series resistance of the capacitor and the battery, and will decay quickly as the capacitor voltage drops. Let's suppose the battery and capacitor resistances add up to 100mΩ: the starting current is then I = V/R = 1.4/0.1 = 14A. You can solve the equations given to obtain the discharge curve against time.

    The datasheet shows the ESR of the capacitor is 0.29mΩ; on its own that would allow a current of 1.4/0.00029 ≈ 4,827A, which exceeds the capacitor's maximum current rating.
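    The arithmetic in this post can be replayed in a few lines (figures taken from the post; the 10-second transfer is a hypothetical):

```python
# Replaying the figures above: Q = C * dV, I_avg = Q / t, I = dV / R.
C = 3000.0            # farads, single BCAP3000
dV = 1.4              # volts, 5.4 V cap down to a 4.0 V battery
Q = C * dV            # ~4200 coulombs transferred
I_avg = Q / 10.0      # ~420 A average if the transfer took 10 s
I_100m = dV / 0.1     # ~14 A with 100 mOhm of total series resistance
I_esr = dV / 0.00029  # ~4827 A limited only by the 0.29 mOhm ESR
print(Q, I_avg, I_100m, I_esr)
```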
     
    Last edited: Jan 19, 2015
  11. sunjan18

    Thread Starter New Member

    Jan 18, 2015
    7
    0
    These will be linked in series, making 1500F @ 5.4V. Is there a way to lower the current to a usable charge current at, say, around 4-5V that can charge the battery? I was looking at possibly adding resistors to the circuit during the discharge into the battery; I believe the additional ohms would also reduce the current.
     
  12. wayneh

    Expert

    Sep 9, 2010
    12,120
    3,046
    Yes, a resistor will limit current, but it will also just burn off energy. If you don't care about efficiency, that might be fine.

    Suppose you use a 0.5Ω resistor. This limits the maximum current to I = V/R = 1.4/0.5 = 2.8A (ignoring the resistance of the battery and the capacitor).

    The maximum power dissipated in the resistor will be P = I²·R = 2.8²·0.5 = 3.92W (equivalently V·A = 1.4·2.8). A resistor rated for 10W would be a safe choice.

    While that resistor is dissipating 3.92W, the battery is receiving 2.8A·4V = 11.2W. So the resistor is burning off 35% of what the battery is getting, or 26% of what the capacitor is delivering.

    A really good DC-DC converter might get you 85-90% efficiency. Not that big a gain, actually.
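    The resistor figures above check out numerically (same assumed values as the post):

```python
# Current limiting with a series resistor, figures from the post above.
R = 0.5                          # ohms
dV = 1.4                         # volts across the resistor at the start
V_batt = 4.0                     # volts

I_max = dV / R                   # ~2.8 A peak charge current
P_res = I_max ** 2 * R           # ~3.92 W burned in the resistor
P_batt = I_max * V_batt          # ~11.2 W delivered to the battery
print(P_res / P_batt)            # ~0.35 of what the battery gets
print(P_res / (P_res + P_batt))  # ~0.26 of what the capacitor delivers
```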
     
    Last edited: Jan 19, 2015
  13. sunjan18

    Thread Starter New Member

    Jan 18, 2015
    7
    0
    I'm concerned about the capacitor's voltage dropping to a point where it no longer meets the step-down's minimum input. For example: a 5.4V capacitor stepped down to 4V - as it discharges into the battery, its voltage falls, possibly dropping below the regulator's minimum input voltage.
     
  14. ronv

    AAC Fanatic!

    Nov 12, 2008
    3,290
    1,255
    This is not exact, but maybe close enough for understanding.
    In this simulation the voltage source on the left charges the capacitors (and, briefly, the battery) to 5.4 volts.
    On the right is a model of a battery discharged to 3 volts.
    The rest is labeled.
    So a 1500mAh battery at 4V holds 6 watt-hours. The energy provided by the caps is about 4 watt-hours.
    It is complex because the battery voltage rises while the capacitor voltage falls.
    The 1.5Ω resistor limits the maximum charge current.
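    The watt-hour figures can be checked with E = ½CV², using 1500 F for the two 3000 F caps in series and taking 3 V as the discharge cutoff described above:

```python
# Energy bookkeeping behind the simulation: E = 0.5 * C * V^2.
def cap_energy_wh(c_farads, volts):
    return 0.5 * c_farads * volts ** 2 / 3600.0  # joules -> watt-hours

C = 1500.0  # farads, two 3000 F caps in series
e_usable = cap_energy_wh(C, 5.4) - cap_energy_wh(C, 3.0)
e_batt = 1.5 * 4.0  # 1500 mAh at 4 V = 6 Wh

print(e_usable)  # ~4.2 Wh available between 5.4 V and 3 V
print(e_batt)    # 6.0 Wh
```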
     
  15. wayneh

    Expert

    Sep 9, 2010
    12,120
    3,046
    You don't need a step-down (buck) converter at all. What you need is a boost circuit that can run from voltages well below the battery voltage and still pump current into the battery - that's the only way to get more energy out of the capacitor. The boost circuit needs a current limiter suited to your battery's maximum charge current, and it should include battery-charging smarts so it knows when to drop to a trickle charge or stop charging altogether.
     
  16. Convex

    New Member

    Apr 12, 2015
    4
    1
    Just FYI: when an ultracapacitor reaches half of its rated voltage, only 25% of the stored energy remains. For that reason, it's rare for users to go to the effort of designing a system that can benefit from discharging the capacitor below about 40-50% of its maximum voltage.

    If you are basically putting the fully charged ultracapacitor in parallel with the battery, it will indeed charge the battery until the ultracapacitor falls to the voltage of the battery (4V in your case). If you didn't limit the current going into the battery somehow, the current would be so high that it would likely damage the battery. If the capacitor was too big, then the battery would be over-charged. For a lithium battery, that can cause metallic lithium to form in the cell, which can be quite dangerous. Anyway, assuming you didn't blow up the battery, once the ultracap and battery reach the same voltage, they should stay at the same voltage so long as they remain connected. The ultracapacitor might experience some 'self-discharge' but in that case the battery would then charge the ultracapacitor, keeping it at the same voltage as the battery.

    E = 0.5*C*V^2 gives the energy in a capacitor, so you can see why the voltage has a big effect.
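    The half-voltage rule above follows directly from that formula; a quick check (the capacitance cancels out of the ratio, so any value works):

```python
# E = 0.5 * C * V^2: at half the rated voltage, a quarter of the energy remains.
def cap_energy_j(c_farads, volts):
    return 0.5 * c_farads * volts ** 2

v_rated = 5.4  # any voltage gives the same ratio
ratio = cap_energy_j(1.0, v_rated / 2) / cap_energy_j(1.0, v_rated)
print(ratio)  # ~0.25
```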
     