Hello everyone:
I am new to these forums; however, I have been using the All About Circuits website for a long time. It's a great site. I finally decided to join the forums to ask an important question.
I apologize if this has been asked before, or if this is in the wrong forum. My preliminary searches have not turned up any results.
I am building a battery charger. The charging specifics are already figured out; however, my main concern is the charging voltage. If you read online, many sites say to regulate battery current when charging. For example, the battery can't charge at a rate faster than C/2. Fine. Easy. This means the current must be limited. However, there is no mention of charging voltage.
Every charger I have come across has a charging voltage that is a little higher than the battery to be charged. So a 12V auto battery requires, say, a 14V charging voltage. Why?
What I gather from 'forcing current' into a battery is this: current is forced into the battery because your charging voltage is higher than the battery voltage. In other words, I = (Vcharge - Vbat) / Rbat. This means that if I hook a higher voltage source to a lower voltage battery, a current will flow according to the above equation. Now, internal battery resistance is very low, so the current can skyrocket to, say, 5/10/20 amps very fast, even with a small voltage difference. This is why battery charging current needs to be limited.
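To put rough numbers on that equation (the 0.05 ohm internal resistance here is just a round number I assumed, not a measured value):

```python
# Quick numeric check of I = (Vcharge - Vbat) / Rbat. The internal
# resistance value is an assumption for illustration only.
def charge_current(v_charge, v_bat, r_bat):
    # Ideal-source model: current set by the voltage difference
    # across the battery's internal resistance.
    return (v_charge - v_bat) / r_bat

r_bat = 0.05  # ohms (assumed)
print(charge_current(14.0, 12.0, r_bat))  # 2 V difference -> roughly 40 A
print(charge_current(24.0, 12.0, r_bat))  # 12 V difference -> roughly 240 A
```

So even a modest 2 V difference drives tens of amps, which matches why every source says the current has to be limited.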
In simplest terms, I want to charge any battery from, say, a 24V source. I have several types of battery chemistries to charge. Some are cell based, so I want to charge the battery per cell. This would entail 24V charging, say, a 4V cell. Using the above equation, (24 - 4) / Rbat gives me a current, which in this case will be very high. However, it's the current I am interested in, right? If I limit the current to something low enough for the charge C rate, will I be alright?
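Here is the arithmetic for that case with a simple series resistor as the current limiter. The 2 Ah capacity is purely an assumed figure for the example:

```python
# Hypothetical 4 V cell with an assumed 2 Ah capacity, charged from 24 V
# through a series resistor sized to hold the current at the C/2 rate.
def series_resistance_for_limit(v_source, v_cell, i_limit):
    # Solve (Vsource - Vcell) / R = Ilimit for R
    return (v_source - v_cell) / i_limit

capacity_ah = 2.0            # assumed capacity
i_limit = capacity_ah / 2.0  # C/2 rate -> 1 A
r = series_resistance_for_limit(24.0, 4.0, i_limit)
print(r)              # 20.0 ohms
print(i_limit**2 * r) # power dissipated in the resistor: 20.0 W
```

That 20 W of waste heat in the limiter is one practical downside I can already see with charging a low-voltage cell from a much higher source.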
Judging from the equation, the only thing that really changes when you charge from a higher voltage source is the charge current: a higher voltage gives you a higher charge rate. One reason I think battery chargers charge from the voltage they want the battery to end up at is that, once the battery reaches that level, the charging current cuts off on its own because the two voltages have equalized. I want to use a different approach: charge from a much higher source and cut the charging current manually with a transistor or something.
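Here is a toy sketch of the scheme I have in mind: hold a fixed current from the high source, watch the cell voltage, and cut off at a target. The linear voltage-vs-charge model and every number in it are made up for illustration, not a real cell model:

```python
# Toy constant-current charge loop: deliver a fixed current and stop
# when the cell reaches a target voltage. The linear voltage model and
# all numbers are assumptions for illustration only.
def hours_to_charge(v_start, v_target, i_charge, capacity_ah, dt_h=0.1):
    v = v_start
    hours = 0.0
    while v < v_target:
        # crude model: voltage rises in proportion to charge delivered
        v += i_charge * dt_h * (v_target - v_start) / capacity_ah
        hours += dt_h
    return hours  # a 2 Ah cell at 1 A should take about 2 hours

print(hours_to_charge(v_start=3.0, v_target=4.2, i_charge=1.0, capacity_ah=2.0))
```

In a real charger the "cut off" would be the transistor I mentioned, driven by a comparator or microcontroller watching the cell voltage.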
The only other reason I can think of that this wouldn't work is that a given battery may not be built to withstand, say, 24 volts on its terminals. Does that make sense? I can limit the charging current all I want, but if the voltage is too high, maybe the battery won't accept it, or isn't designed to handle it.
Any information would be appreciated.
Thanks