newbie - trying to modify a power supply

Discussion in 'The Projects Forum' started by c0de3, May 1, 2009.

  1. c0de3

    Thread Starter Active Member

    May 1, 2009
    50
    1
    Hi all!

    I'm looking forward to learning a lot on this forum! I loved the e-book.

    What I'm doing: I have a 12 VDC 500 mA power supply. The problem is that the battery I'm charging is a 9.6 volt battery. So I'd like to drop the power supply output down to ~9.6 volts or so, but I'd still like to deliver 500 mA to the battery for charging. (That is the maximum it is rated to take for a charge.)

    I've applied Ohm's law and think I need a 4.8 ohm resistor in the circuit. From what I learned on the site, I'm thinking the resistor and the battery pack should be in series. My thought is: in series the current is constant through the circuit and the voltage divides, so the battery pack will get 9.6 volts and the resistor will get 2.4.

    My power calculation shows the resistor will eat about 1.2 watts.
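    The arithmetic behind those numbers can be sketched like this (a quick Python check; the 12 V, 9.6 V and 500 mA figures are the ones from above):

```python
# Series dropping-resistor sizing for the values in the post above.
V_SUPPLY = 12.0    # adapter output, volts
V_BATTERY = 9.6    # nominal pack voltage, volts
I_CHARGE = 0.5     # desired charge current, amps

v_drop = V_SUPPLY - V_BATTERY    # voltage the resistor must drop (2.4 V)
r_series = v_drop / I_CHARGE     # Ohm's law: R = V / I  -> 4.8 ohms
p_resistor = v_drop * I_CHARGE   # power in the resistor: P = V * I -> 1.2 W

print(f"R = {r_series:.1f} ohms, P = {p_resistor:.1f} W")
```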

    My issue is, when I do the math on the battery pack it comes out with a -500 mA current, while the resistor has a +500 mA current. I guess it makes sense that they sum to zero... but wouldn't I want a positive current to flow into the battery I'm charging, thus charging the battery?

    This part confuses me. I may be completely off track, so any assistance would be great!

    Thanks for any assistance!
     
  2. bertus

    Administrator

    Apr 5, 2008
    15,649
    2,348
    Last edited: May 2, 2009
  3. balisong

    Member

    Feb 26, 2008
    27
    0
    You're on the right track. That resistor value will limit the current to a safe level, but it doesn't limit the voltage as the pack charges. You could use this for a quick charge, but not for a long one. A fully charged pack would have to dissipate all that power as heat (9.6 V x 500 mA = ~4.8 W) and might get damaged. A good battery charger will start with a fast charge and finish with a trickle charge. Look into using diodes to drop most of the voltage, then calculate a new resistor value from the remaining voltage difference.
    Good luck, and ask if you need help/clarification.
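    A rough sizing of that diode-plus-resistor idea might look like this (a sketch only; the ~0.7 V silicon forward drop is an assumed typical value, and the supply/pack numbers are from the original post):

```python
# Rough sizing for the "diodes drop most of the voltage" approach.
# Assumption: ordinary silicon diodes at ~0.7 V forward drop each.
V_SUPPLY = 12.0
V_BATTERY = 9.6
I_CHARGE = 0.5
V_DIODE = 0.7                                 # assumed typical forward drop

v_excess = V_SUPPLY - V_BATTERY               # total excess to drop (~2.4 V)
n_diodes = int(v_excess // V_DIODE)           # whole diodes that fit: 3
v_remaining = v_excess - n_diodes * V_DIODE   # left for the resistor (~0.3 V)
r_series = v_remaining / I_CHARGE             # much smaller resistor (~0.6 ohm)

print(f"{n_diodes} diodes, R = {r_series:.2f} ohms")
```

    Note the real forward drop varies with current and diode type, so measure before committing to a resistor value.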
     
  4. studiot

    AAC Fanatic!

    Nov 9, 2007
    5,005
    513
    First and foremost you must have a greater voltage output from the charger than the battery terminal voltage or you will never charge it.

    Secondly the output voltage of most mains adapters varies dramatically with current. The rating written on them is the nominal voltage at rated current. Off load they will display several volts higher.

    So you need to check at what current the adapter gives 12 volts.


    When it comes to controlling this excess voltage, any device which drops it, whether resistor, diode or transistor, will dissipate the same amount of heat, given by the current times the voltage drop.

    The problem with simply dropping the supplied voltage from an adapter is that the supply varies dramatically with current, and in particular rises as the current decreases. The supply then tries to continue charging a fully charged battery, thereby damaging it.

    The solution to this is to pass the charger output through a small constant current unit. This can be a simple pass transistor with suitable biasing. Batteries are designed to be charged like this at what is known as the 0.1C rate. This is 0.1 times the amp hour or milliamp hour rating of the battery. The battery should be able to accommodate this indefinitely if left on charge. In practice nothing is perfect and charging takes 12 to 14 hours, not 10.
    Faster charging is possible but reduces the number of charge/discharge cycles before the battery fails (battery life).
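    The 0.1C arithmetic works out like this (a sketch; the 1500 mAh capacity is a made-up example, since the original post gives only the pack's 500 mA maximum charge rating, not its capacity):

```python
# 0.1C trickle-charge rate for a hypothetical pack.
# CAPACITY_MAH is an assumed example value, not from the original post.
CAPACITY_MAH = 1500.0

i_trickle = 0.1 * CAPACITY_MAH            # 0.1C rate in mA -> 150 mA
t_ideal = CAPACITY_MAH / i_trickle        # ideal charge time: 10 hours
# Charging is imperfect, so a full charge takes roughly 1.2-1.4x that:
t_real_lo, t_real_hi = 1.2 * t_ideal, 1.4 * t_ideal   # 12 to 14 hours

print(f"{i_trickle:.0f} mA, {t_real_lo:.0f}-{t_real_hi:.0f} hours")
```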
     