Battery Charging

Thread Starter

LostTime77

Joined Dec 9, 2010
30
Hello everyone:

I am new to these forums; however, I have been using the All About Circuits website for a long time. It's a great site. I just finally decided to join the forums to ask an important question :p
I apologize if this has been asked before, or if this is in the wrong forum. My preliminary searches have not turned up any results.

I am building a battery charger. The charging specifics are already figured out; my main concern is the charging voltage. If you read online, many sites say to regulate battery current when charging. For example, the battery can't charge at a rate faster than C/2. Fine. Easy. This means the current must be limited. However, there is no mention of charging voltage.

Every charger I have come across has a charging voltage that is a little higher than the battery to be charged. So a 12V auto battery requires, say, a 14V charging voltage. Why?

What I gather from 'forcing current' into a battery is this. Current is forced into the battery because the charging voltage is higher than the battery voltage. In other words: I = (Vcharge - Vbat) / Rbat. This means if I hook a higher voltage source to a lower voltage battery, a current will flow according to that equation. Now, internal battery resistance is very low, so even with a modest voltage difference the current can skyrocket past 5, 10, or 20 amps very quickly. This is why battery charging current needs to be limited.
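
To put rough numbers on that (a quick sketch of the equation above; the 50 milliohm internal resistance is just an assumed figure for illustration):

Code:
# Unregulated current from I = (Vcharge - Vbat) / Rbat.
# r_bat = 0.05 ohm is an assumed internal resistance, not from any datasheet.
v_charge = 24.0   # charging source voltage (V)
v_bat = 4.0       # battery/cell voltage (V)
r_bat = 0.05      # assumed internal resistance (ohms)

i = (v_charge - v_bat) / r_bat
print(f"Unregulated current: {i:.0f} A")  # 400 A -- hence the need for a limit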

In simplest terms, I want to charge any battery from, say, a 24V source. I have several battery chemistries to charge. Some are cell based, so I want to charge the battery per cell. This would entail charging, say, a 4V cell from 24V. Using the above equation, (24 - 4) / Rbat gives me a current, and in this case it will be very high. However, it's the current I am interested in, right? If I limit the current to something low enough for the C charge rate, will I be alright?
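
For instance, a C/2 limit works out like this (sketch; the 2 Ah capacity is an assumption for illustration):

Code:
# C-rate current limit for the hypothetical cell above.
capacity_ah = 2.0              # assumed cell capacity (Ah)
i_limit = capacity_ah / 2.0    # C/2 charge rate
print(f"Charge current limited to {i_limit:.1f} A")  # 1.0 A, regardless of source voltage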

Judging from the equation, the only thing that really changes when you charge from a higher voltage source is the charge current: higher voltage gives you a higher charge rate. One reason I think battery chargers charge from the voltage they want the battery to end up at is that once the battery reaches that level, the charging current cuts off because the two voltages are now equalized. I want to use a different approach: charge from a much higher source and just cut the charging current manually with a transistor or something.

The only other way I can see this not working is if a given battery simply isn't built to withstand, say, 24 volts on its terminals. Does that make sense? I can limit the charging current all I want, but if the voltage is too high, maybe the battery won't accept it or isn't designed to handle it.

Any information would be appreciated.

Thanks
 

Kermit2

Joined Feb 5, 2010
4,162
A current-limiting charger would work for multiple battery configurations. Adjust the supply to deliver, say, 1/2 amp, and the voltage will adjust itself to whatever is required to deliver that current. So putting your 24 volt constant-current charger on a twelve volt battery would result in the charger lowering its voltage to match the current limit for that battery.
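
In rough numbers (a sketch only; the 50 milliohm internal resistance is an assumed figure):

Code:
# Terminal voltage under constant-current regulation (illustrative values).
i_limit = 0.5    # regulated charge current (A)
v_bat = 12.6     # battery open-circuit voltage (V)
r_bat = 0.05     # assumed internal resistance (ohms)

v_terminal = v_bat + i_limit * r_bat
print(f"Terminal voltage: {v_terminal:.2f} V")  # 12.63 V, nowhere near 24 V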

This would be hard (expensive) to implement for large batteries, but for small ones it will work just fine.
 

Thread Starter

LostTime77

Joined Dec 9, 2010
30
Thanks for the response.

This site gives information that I already know. It states that a battery cell should be charged at such and such a voltage; for lead acid, 2.4V per cell. What I want to know is the technical reason behind that charging voltage. What is stopping me from charging a dead cell, say 2V, from a 24V source and just monitoring until that cell reaches its full charge voltage (2.36V), as long as the charge rate is current limited?
 

SgtWookie

Joined Jul 17, 2007
22,230
Charging and discharging batteries involves chemical reactions. It takes time for those reactions to take place, and charging slowly keeps temperatures at a safe level.

If you attempt to charge or discharge a battery faster than its design limits, it will overheat and possibly rupture forcefully.

Heat causes increased chemical activity, and also tends to decrease the service life of a battery.
 

Thread Starter

LostTime77

Joined Dec 9, 2010
30
In response to Kermit:

I am not sure what you mean by the charger lowering its voltage to accommodate the battery. Essentially, the charging voltage would remain the same at 24V. The circuit takes 24VDC into a constant-current regulator to limit the maximum charging current. Then I use a BJT to limit the actual charging current by only allowing a certain amount of current to flow to the battery being charged. The voltage basically stays the same. I terminate the charge current via the BJT after the battery voltage reaches the proper level.

In reality I am not implementing just a straight BJT like this; it is PWMed via a microcontroller. The above explanation is just to illustrate my thinking.
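
Something like this, in spirit (a Python sketch of the termination logic only, with a crude assumed battery model; not my actual firmware):

Code:
# Simplified simulation of the current-limit / voltage-cutoff scheme.
# The linear cell model and the C/2 figure are assumptions for illustration.
V_FULL = 2.36        # per-cell termination voltage (lead acid example above)
I_LIMIT = 1.0        # regulated charge current (A), C/2 for an assumed 2 Ah cell
DT = 1.0             # simulation step (s)

v_cell = 2.00        # starting (depleted) cell voltage
charge_ah = 0.0

while v_cell < V_FULL:
    # PWM duty sets the average current, pegged at I_LIMIT here
    charge_ah += I_LIMIT * DT / 3600.0
    v_cell = 2.00 + 0.18 * charge_ah  # crude assumed linear cell model
    # (real firmware would read an ADC here and adjust the duty cycle)

print(f"Cutoff after {charge_ah:.2f} Ah; PWM gated off at {v_cell:.2f} V")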

Thanks
 

Thread Starter

LostTime77

Joined Dec 9, 2010
30
In response to Wookie:

I am not attempting to charge or discharge the battery faster than it is designed for. I am simply charging from a much higher voltage source and limiting the current. I do realize that charging from a higher voltage source would increase the charging current by a lot, per I = (Vcharge - Vbat) / Rbat, if the current were unregulated, but I am regulating the current.
 

SgtWookie

Joined Jul 17, 2007
22,230
An ideal current source has infinite impedance, just as an ideal voltage source has zero impedance.

If you use a linear regulator to charge a 2V battery from a 24V source, most of the power will be dissipated in the regulator rather than being used to charge the battery.

A switching "buck-type" regulator can effectively limit current fairly efficiently over a wide range of input voltages.

However, the charging technique varies widely depending upon battery chemistry and construction, as you probably already know.

There are also proper charge rates that vary with temperature, which many sources don't bother to cover.

Take a look at this post: http://forum.allaboutcircuits.com/showpost.php?p=262143&postcount=38

Have a look at the table generated by the spreadsheet. If you have MS Excel installed, you can download the spreadsheet and experiment with it.
 

Thread Starter

LostTime77

Joined Dec 9, 2010
30
Thanks for the info, SgtWookie. I will experiment with that Excel sheet.

So what you are saying is that it is indeed possible to charge a 2V cell from, say, 24V as long as it is current limited, because batteries essentially just care about charge/discharge rate, correct? Regardless of the type of regulator I use to limit the maximum current, whether buck or linear, I believe I can deal with the power losses. I don't care a whole lot about efficiency either. I just want to make sure the high voltage won't somehow damage the batteries.

The reason I am using the current-limit model over the voltage model is that in my design it is very hard to regulate the charging voltage. I can, however, regulate the current very simply. Additionally, I can measure the battery SOC (current battery voltage) very simply. I realize that charging lead acids using just the current-limit model is probably not a good idea; for one, you usually use a three-stage charger for that. While I might be able to jury-rig something to work correctly for lead acids, I don't plan on charging many of them. I was just using lead acid as an example, because many people are familiar with it.

Thanks
 

SgtWookie

Joined Jul 17, 2007
22,230
Sure, it's possible to charge a low voltage battery from a high voltage source.

But let's say you want to charge a hypothetical 4V NiCd battery at a 100mA rate from a 24V source using a linear regulator.
Power dissipation in the battery will be ~400mW, and power dissipated in the regulator will be 2W. That's doable with something like an LM317 and a good-sized heat sink. Besides, at this time of year it's nice to have some extra heat in the room.
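
The arithmetic, if you want to check it:

Code:
# Power split for the 4 V / 100 mA / 24 V linear-regulator example above.
v_in, v_bat, i_chg = 24.0, 4.0, 0.100

p_battery = v_bat * i_chg              # power delivered to the battery
p_regulator = (v_in - v_bat) * i_chg   # power burned in the regulator

print(f"Battery: {p_battery * 1000:.0f} mW, regulator: {p_regulator:.1f} W")
# -> Battery: 400 mW, regulator: 2.0 W; only about 1/6 of the input reaches the battery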

You need to limit the current until the termination voltage is reached.

In most chemistries, you also need to watch the temperature.
 

Kermit2

Joined Feb 5, 2010
4,162
LostTime77 said:
"I am not sure what you mean by the charger lowering its voltage to accommodate the battery. [...]"

I think you'll find I'm right. Hook your charger up to the power source and your meter reads 24 volts (current-limited charger, no load). Connect it to a depleted 12 volt battery and current will flow up to the maximum set by the limit. Now check that battery with your meter and see whether it has 24 volts on it or something more like 13 or 14 volts. :) If the charger goes into current limit, the output voltage will be LOWER than 24 volts.

I'll wait while you check it out. I'm patient that way.
 

Thread Starter

LostTime77

Joined Dec 9, 2010
30
No, no, I get what you are saying Kermit.

I just needed that clarification. I believe you. Overall, my question has been answered, so I am happy.

Thanks for the help Kermit and SgtWookie.
 