Power supply overheating

Discussion in 'The Projects Forum' started by dataworx, Nov 27, 2013.

1. dataworx Thread Starter New Member

Nov 27, 2013
8
0
Pardon my noobness, I'm sure there is a really fundamental blunder here but I cannot think what it may be.

I'm trying to charge/maintain 12 volt Sealed Lead Acid (SLA) batteries. These apparently require at least 12.9 volts to get them charging, and around 13.5 volts as a float or standby voltage.

I don't have a dedicated multi-stage charger, so I'm using a regulated DC power supply that outputs a little under 12 volts with a rated output current of 1.5 A. I'm feeding this through a DC-DC boost converter (TK0259 or LM2577 based) to take the voltage up to 13.6 volts. Prior to charging, the battery has a voltage of 12.01 volts.

When I connect and switch on the power supply I measure about 12.5 volts at the battery terminals, but about 3.25 volts, and falling, at the power supply output. The power supply starts heating up, so I've switched everything off while I seek advice. I guess I've violated some law of physics; can someone explain what's happening and how I can resolve this?

2. ronv AAC Fanatic!

Nov 12, 2008
3,401
1,471
The battery may be taking more than 1.5 amps of charge current so your 12 volt supply gets hot. You need to limit the charge current somewhere in the circuit.
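A quick back-of-the-envelope shows why a discharged battery can swamp the supply. This is only a sketch: the internal resistance value below is an assumption for illustration, since real values vary with battery size and state of charge.

```python
# Rough estimate of the inrush charge current a discharged SLA battery
# can draw from a stiff (non-current-limited) source.
v_source = 13.6    # boosted charger output, volts (from the thread)
v_battery = 12.0   # resting voltage of the discharged battery, volts
r_internal = 0.05  # ASSUMED battery internal resistance, ohms

i_charge = (v_source - v_battery) / r_internal
print(f"Estimated initial charge current: {i_charge:.0f} A")
```

Even with a generous 50 mΩ assumed, that works out to tens of amps, far beyond a 1.5 A supply.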

3. dataworx Thread Starter New Member

Nov 27, 2013
8
0
Thanks ronv, I'm investigating how to limit the current drawn. Does this also explain the voltage drop seen at the power supply?

4. MikeML AAC Fanatic!

Oct 2, 2009
5,451
1,066
A lead-acid 12 V battery whose resting voltage is 12.0 V is mostly discharged. If you connect a power supply with an open-circuit voltage of >13 V to it, it will try to hog many tens of amps of charging current. A proper charger needs to be current-limited during this phase of charging, to protect both the battery and the charger.

The ideal charger is current-limited during the initial phase of charging, and only after the battery terminal voltage reaches ~13.6+V should the charger act like a constant-voltage 13.6V power supply.

An adjustable supply like one of these makes a good lead-acid battery charger because you can independently adjust the initial charging current limit and the final voltage limit.
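The two-phase behaviour described above can be sketched as simple decision logic. The threshold values follow the thread; the function itself is hypothetical, just to make the CC/CV handoff concrete:

```python
# CC/CV charger logic sketch: current-limited until the terminal
# voltage reaches the float setpoint, constant-voltage after that.
I_LIMIT = 0.7    # constant-current (bulk) limit, amps
V_FLOAT = 13.6   # constant-voltage (float) setpoint, volts

def charger_mode(v_battery):
    """Return (mode, setpoint) for a given battery terminal voltage."""
    if v_battery < V_FLOAT:
        return ("CC", I_LIMIT)   # bulk phase: regulate current
    return ("CV", V_FLOAT)       # float phase: regulate voltage
```

A discharged battery at 12.0 V lands in CC mode; once the terminals reach ~13.6 V the charger switches to holding the voltage.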

dataworx likes this.
5. t06afre AAC Fanatic!

May 11, 2009
5,939
1,222
Do you have a digital multimeter (DMM) capable of measuring currents in the 10 ampere range? Depending on the battery's capacity and state of discharge, it may draw more current than your power supply can deliver, or at least is designed to deliver. If you have a 10 ampere DMM, try to measure the charge current. If you are not 110% sure how to do this, ask for help before trying.

dataworx likes this.
6. dataworx Thread Starter New Member

Nov 27, 2013
8
0
Thanks very much for this info, guys; it's starting to make sense to me now. I can't justify getting a decent charger for this project. Is there a way to limit the charge current with my present setup?

7. takao21203 Distinguished Member

Apr 28, 2012
3,578
463
You need an op-amp to amplify the voltage from a current-sense shunt, and then you need to apply negative feedback to a transistor/MOSFET, or even to the DC-DC converter itself.

You can also modulate the output (meaning, change the ON/OFF duty cycle). It is not really a good solution, but you can set it so that the average current for the specific battery does not overload your power supply.

You can even add a temperature sensor to the DC-DC converter and, if it overheats, simply turn it off until the temperature falls again.

The result will be almost the same, but a really professional battery-charge circuit would measure the current and use negative feedback.

Simple car battery chargers don't have regulation, but I think the transformer is designed so that the voltage sags when the current gets too high, and there is also a bimetal thermal cutout for overtemperature.
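The ON/OFF averaging idea above reduces to simple arithmetic. As a sketch (the 1.8 A and 1.5 A figures are borrowed from elsewhere in the thread for illustration):

```python
# Average-current estimate for ON/OFF (duty-cycle) gating of the
# converter: the supply sees the full current only while enabled.
i_on = 1.8          # current drawn while the converter is enabled, amps
i_supply_max = 1.5  # what the supply can safely deliver, amps

duty_max = i_supply_max / i_on  # largest safe ON fraction
i_avg = duty_max * i_on         # resulting average supply current
print(f"Max duty cycle: {duty_max:.2f}, average current: {i_avg:.2f} A")
```

Note the average only protects the supply thermally; the peak current during the ON time is still 1.8 A, which is the weakness of this approach compared with true current feedback.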

8. ronv AAC Fanatic!

Nov 12, 2008
3,401
1,471
Something like this should work. It should limit the current to about 700 mA. If you want more current, make the 1 ohm resistor smaller. It should be at least a 2 watt resistor.

[Attached: schematic, 111.5 KB]
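The 700 mA figure falls straight out of Ohm's law. A sketch with the resistor value from the post (the battery voltage during bulk charge is an assumption):

```python
# Worked numbers for a series-resistor current limit.
v_charger = 13.6  # boosted output, volts (from the thread)
v_battery = 12.9  # ASSUMED battery terminal voltage during bulk charge
r_limit = 1.0     # series resistor, ohms (from the post)

i_charge = (v_charger - v_battery) / r_limit        # charge current
p_resistor = i_charge ** 2 * r_limit                # resistor dissipation
print(f"Charge current: {i_charge:.2f} A, "
      f"resistor dissipation: {p_resistor:.2f} W")
```

At a deeper discharge (say 12.0 V at the terminals) the same resistor passes about 1.6 A and dissipates about 2.6 W, which is why a 2 W or larger rating is prudent.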
9. dataworx Thread Starter New Member

Nov 27, 2013
8
0
I have a Fluke 73; it's marked 10 A unfused, so I guess it will do. To measure the current I believe I need to plug the red lead of the DMM into the 10 A socket, leave the common lead in the common socket, and select DC amps on the meter. Then I need to put the meter inline between the power supply and the battery, I think in the negative lead. I'm not sure, though, whether I should put it inline before or after the voltage booster, or if that will make any difference.

10. bountyhunter Well-Known Member

Sep 7, 2009
2,498
507
You need to design a charger which is both CC (constant current) and CV (constant voltage) once the battery voltage comes up to the final target value. I designed one such circuit using an LM2576 switcher for a 6 V battery, but it could be adjusted for a 12 V battery. Note that there is a feedback loop to limit the current to a fixed value, and when the battery voltage rises high enough, a voltage divider provides voltage feedback to the FB terminal to lock it in there.

[Attached: schematic, 103.8 KB]
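For the voltage side of such a circuit, the float setpoint is set by the feedback divider. A sketch using the LM2576's 1.23 V feedback reference (the resistor values below are illustrative, not taken from the attached schematic):

```python
# Feedback-divider check for an LM2576 adjustable regulator:
# Vout = Vref * (1 + R2/R1), with R1 from FB to ground and
# R2 from the output to FB.
v_ref = 1.23   # LM2576 feedback reference, volts (datasheet)
r1 = 1_000     # FB pin to ground, ohms (illustrative)
r2 = 10_000    # output to FB pin, ohms (illustrative)

v_out = v_ref * (1 + r2 / r1)
print(f"Regulated output: {v_out:.2f} V")
```

These illustrative values land close to a 13.5 V float voltage; in practice one leg is usually a trimmer so the setpoint can be dialled in exactly.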
11. MikeML AAC Fanatic!

Oct 2, 2009
5,451
1,066
Yes, most old transformer wall-warts are intrinsically current-limited to meet their certification requirements. I have built dozens of lead-acid battery maintainers starting with a 12 to 15 V DC, 1 A wall-wart followed by a voltage regulator (LM7805, 7812 or PB137). Core saturation in the transformer takes care of the current limiting during the initial charge-up phase. The voltage regulator takes care of the float voltage after the battery comes up to ~13.5+ V.

12. dataworx Thread Starter New Member

Nov 27, 2013
8
0
I metered the current drawn by the battery that I was trying to charge. The resting voltage was 11.9 V; I set the voltage booster/regulator to 13.5 V and observed a current draw starting at about 1.8 amps and falling rapidly to about 850 mA, at which point I switched the power supply off, fearing the heat that was being generated.
Since the power supply is rated at 2 amps maximum I was a little puzzled by the heat, so I opened it up. It turns out the heat is generated by a component (see pic) bolted to the back of the power supply enclosure. It seems heat is not the problem; it's the poor dissipation thereof.

So theoretically, if I ignore the heat issue, I can carry on with what I'm doing? Ideally I guess I could mount whatever the offending component is (a triac? a transistor?) on a good heatsink.

13. dataworx Thread Starter New Member

Nov 27, 2013
8
0
Sorry, forgot to attach the photo.

[Attached: photo, 19.9 KB]
14. bountyhunter Well-Known Member

Sep 7, 2009
2,498
507
Looks like a voltage reg or power transistor. That's where the heat is.

15. bountyhunter Well-Known Member

Sep 7, 2009
2,498
507
That might improve things, but a TO-220 device has a thermal resistance of about 4 °C/W from junction to heatsink, so its power dissipation is limited to about 15-20 W maximum.

16. dataworx Thread Starter New Member

Nov 27, 2013
8
0
Thanks bountyhunter, I'll try a heatsink for good measure. Is the heat generation detrimental to this component or is it a normal state?

17. bountyhunter Well-Known Member

Sep 7, 2009
2,498
507
As long as the junction temp stays below about 110 °C, the semiconductor's life will probably still exceed 50,000 hours.

If the device is an IC regulator like an LM317, it will have internal temp limiting.

dataworx likes this.
