A power supply, a battery, wrong amps?

Thread Starter

Marc0

Joined Nov 28, 2011
42
Hi,
I have a battery whose charger is faulty. The charger is a power supply that outputs 12 V DC @ 0.4 A. I have found another power supply that also outputs 12 V DC, but @ 1 A.
I know that when a power supply can output more current than an appliance needs, there's usually no problem, because the appliance draws only the current it requires.
But my problem is that the battery gets really hot while charging... someone told me that's because the new power supply is more powerful and the battery charges too fast...
How can I limit my power supply to output 0.4 A instead of 1 A?
Should I add a resistor?
 

thatoneguy

Joined Feb 19, 2009
6,359
Your old power supply was unregulated, and charged the battery by limiting the current through the transformer to a maximum of about 400 mA. You are now doing the same, but with roughly 2.5x the current, so it'll get hot.

Neither method is the best way to keep a battery healthy (just the cheapest). What type of battery is being charged? This sort of setup is typically used on cheap cordless drills with non-removable battery packs.
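A rough numeric illustration of why 2.5x the current runs so much hotter, assuming the charge current is limited mainly by a fixed series resistance (transformer winding plus battery internal resistance; the 2 ohm value below is an assumption, not a measurement from this thread):

```python
# Rough illustration: with a fixed series resistance limiting the
# charge current, the heat dissipated scales with the current squared.
R_SERIES = 2.0  # ohms, an assumed lumped series resistance

for amps in (0.4, 1.0):
    watts = amps ** 2 * R_SERIES  # P = I^2 * R
    print(f"{amps:.1f} A -> {watts:.2f} W of heat")

# 2.5x the current means (2.5)^2 = 6.25x the heating.
```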
 

Thread Starter

Marc0

Joined Nov 28, 2011
42
It's a battery from a Black & Decker drill/screwdriver tool... the battery pack is removable, but I can't say what type it is for now... maybe Ni-Cd.
So would applying a resistor be useless? I just want to avoid excessive battery heating, because I know that can damage it.
 

Audioguru

Joined Dec 20, 2007
11,248
A series resistor is the simplest way to reduce the charging current. Calculate the power rating of the resistor and buy one big enough that it doesn't overheat.
A "12V" 10-cell Ni-Cd or Ni-MH battery is fully charged at 14 V to 15 V. It is almost dead at 10 V. Your 12 V supply is overloaded when it tries to charge the 10 V battery, so of course the supply and the battery get hot.

We can't calculate the value of the resistor because we don't know the nominal voltage (12 V?) of the battery. We also don't know how high the output of your 12 V/1 A supply rises when its load is less than 1 A, as it will be when the battery is almost fully charged.
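Just to show the arithmetic involved, here is a minimal sketch with assumed numbers (the 15 V supply voltage and the 10 V flat-pack voltage are illustrative guesses, not measurements from this thread):

```python
# Illustrative numbers only (assumed, not measured in this thread):
V_SUPPLY = 15.0   # volts, supply voltage while charging (assumed)
V_BATTERY = 10.0  # volts, a nearly flat 10-cell pack (1.0 V/cell)
I_TARGET = 0.4    # amps, matching the original charger's rating

r = (V_SUPPLY - V_BATTERY) / I_TARGET  # Ohm's law across the resistor
p = I_TARGET ** 2 * r                  # dissipation in the resistor

print(f"R = {r:.1f} ohm, dissipating {p:.1f} W")
# -> R = 12.5 ohm, dissipating 2.0 W; a 5 W part would give headroom.
```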
 

Thread Starter

Marc0

Joined Nov 28, 2011
42
OK,
so I should first measure how many (milli)amps the battery draws from my new power supply. Then (given that this value will be higher than the original 400 mA) I have to calculate the resistor value that brings the power supply's output down to 400 mA, and the resistor's power rating... right?
 

Adjuster

Joined Dec 26, 2010
2,148
Those numbers would give a clue, but not the whole story. The power supply's output voltage and the battery voltage both depend on the current actually flowing. What you actually need to know is the power supply voltage when it is delivering 400 mA. To find this you could substitute different values of resistor in series with the supply, making sure they are all rated for an amp or so.

You might start by measuring the unloaded voltage of the supply, to get an upper limit on the resistor needed.

R = (Vsupply - Vflatbattery*) / 0.4 A

*To be safe, use the voltage of a "tired" (nearly flat) battery.

The dissipation will be 0.4 A times (Vsupply - Vflatbattery), so the resistor needs to be rated for more than that.

This will (probably) give too high a resistance, because the supply voltage actually falls under load, so measure the supply voltage with this resistor and the battery connected, and recalculate as necessary.

If all this sounds too difficult, you might try using a filament lamp. A 12 V 6 W type might come out about right, but this would require trial and error. Unfortunately, lamps don't come in exact sizes the way resistors do*, but they are easily available, and the way their resistance rises with voltage actually helps stabilise the charging current.

*But you could put a higher-value resistor in parallel with a lamp that did not quite give you enough current.
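For anyone who wants to script this procedure, here is a minimal sketch (the function name and the example voltages are placeholders of mine, not measurements from the thread):

```python
def series_resistor(v_supply, v_flat_battery, i_target=0.4):
    """First-pass series resistor for a crude constant-current charger.

    Returns (resistance in ohms, resistor dissipation in watts).
    Re-measure the supply voltage with the resistor and battery
    connected, then recalculate, since the supply sags under load.
    """
    r = (v_supply - v_flat_battery) / i_target
    watts = i_target * (v_supply - v_flat_battery)  # = I^2 * R
    return r, watts

# Placeholder inputs, to be replaced with your own measurements:
r, watts = series_resistor(v_supply=18.0, v_flat_battery=10.0)
print(f"Start near {r:.0f} ohm, rated comfortably above {watts:.1f} W")
```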
 

studiot

Joined Nov 9, 2007
4,998
Just get a proper B&D replacement charger before you damage the battery.

A quick check on eBay lists 12 V chargers at about one third the cost of a new battery.

The proper charger usually has status indicator lights and appropriate control circuitry.
 

Thread Starter

Marc0

Joined Nov 28, 2011
42
Did some current measurements and found something strange...
While charging, the battery draws 3.5 A from the power supply! This value steadily decreased, but at 2.8 A I had to disconnect the battery because the power supply was almost melting (it's designed to provide 1 A!).
How is this possible? The original charger was rated 400 mA... the new one is rated 1 A, so why does the battery draw so much more?
 

Audioguru

Joined Dec 20, 2007
11,248
Maybe the original was a current-limited charger and the new power supply is not a charger because it does not have any current limiting.
 

Adjuster

Joined Dec 26, 2010
2,148
At this point, I really have to agree with studiot: you would be better off just getting a new charger if it is not clear to you that an ordinary power supply is different from a battery charger. Many power supplies are designed to provide a more or less constant voltage. They are designed to power things like (say) radios, where the current drawn does not vary too sharply with voltage.

Battery chargers have to push a safe amount of current into a battery, whose characteristic at any one time is a very steep variation of current with voltage, and the voltage for a given current changes over time as the battery charges. Chargers therefore generally use some form of current limiting, at the very least a tailored output resistance.

Connecting a standard voltage power supply directly to a battery is a recipe for disaster. There is nothing very complicated about this; it's just a question of voltages and internal resistances. If the power supply's open-circuit voltage is sufficiently above the battery voltage, and its internal resistance is low enough, the current will get too big!

That's why I suggested measuring the open-circuit voltage first, and putting a resistor in the way.
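A minimal numeric sketch of that last point (every value below is an assumption for illustration, not a measurement from this thread, though the result lands near the 3.5 A Marc0 reported):

```python
# Why a stiff supply overcharges: I = (Voc - Vbat) / (R_supply + R_battery).
# All four values are illustrative assumptions.
V_OPEN_CIRCUIT = 18.0  # volts, supply with no load
V_BATTERY = 10.5       # volts, a partly discharged pack
R_SUPPLY = 2.0         # ohms, supply internal resistance (assumed)
R_BATTERY = 0.2        # ohms, battery internal resistance (assumed)

current = (V_OPEN_CIRCUIT - V_BATTERY) / (R_SUPPLY + R_BATTERY)
print(f"charge current ~ {current:.1f} A")  # ~3.4 A, far above 0.4 A
```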
 

Thread Starter

Marc0

Joined Nov 28, 2011
42
Hmm, I understand.
By the way, just to clarify: I had a look at the internals of both the old 0.4 A and the new 1 A charger/power supply... well, they are quite similar, and there's no sign of current limiting: they both have just a simple transformer and 4 diodes for rectification. The only difference is that the new one also has a big capacitor right before the output.

Another measurement: the open-circuit voltage of the new power supply is 18 V... not 12 as written on top of it... I'm starting to think the battery is faulty.

Another thing to clarify: of course I know the best thing is to buy a new charger, and I'll probably do that. I was just curious about what was going on, and wanted to investigate :)
 

studiot

Joined Nov 9, 2007
4,998
Hello Marco.

You wrote: "I had a look at the internals of both the old 0.4 A and the new 1 A charger/power supply... they are quite similar, and there's no sign of current limiting: they both have just a simple transformer and 4 diodes for rectification. The only difference is that the new one also has a big capacitor right before the output. Another measurement: the open-circuit voltage of the new power supply is 18 V... not 12 as written on top of it..."

This explains a lot.

Battery chargers are generally different from mains power supplies for electronic equipment.

Batteries are best charged via the raw output of a rectifier. This is not AC, but it is not smooth DC either. It is a unidirectional current (and voltage) that pulses periodically but does not change polarity the way AC does.

I have shown this in Figs 1 and 2.
There are two types of rectifier, half-wave and full-wave, as shown.
It is important to note that the current or voltage you measure will be the average or effective value, which is rather less than the peak.

This effective voltage should be no more than a couple of volts greater than the nominal battery voltage.

The Black & Decker charger is of this type.

On the other hand, power supplies do not want this pulsing effect; they want steady DC.
To achieve this we add a (large) capacitor across the output of the rectifier, as shown in Fig 3.
This produces a (nearly) steady output voltage.
Since the capacitor charges to the peak voltage, the output is higher than without the capacitor, as shown in Fig 4.

Your second power supply is of this type.

Hopefully you can now see the difference, and why the second power supply drives more (too much?) current through the battery on recharge. Its output voltage is too high.
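As a numeric illustration of the peak-versus-average point (taking the "12 V" marking as the transformer's RMS output is my assumption; diode drops and the off-load rise in transformer voltage, which explain the measured 18 V, are ignored here):

```python
import math

# Peak vs average of a full-wave rectified sine wave.
V_RMS = 12.0  # assumed: the "12 V" marking as the transformer RMS output

v_peak = V_RMS * math.sqrt(2)  # ~17.0 V: a big smoothing capacitor
                               # charges up towards this peak
v_avg = 2 * v_peak / math.pi   # ~10.8 V: the average of the raw
                               # full-wave rectified output

print(f"peak ~ {v_peak:.1f} V, average ~ {v_avg:.1f} V")
```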
 

Attachments: Figs 1-4 (half-wave and full-wave rectifier outputs, without and with a smoothing capacitor)
