Charging battery with slightly higher voltage than normal

Thread Starter


Joined Oct 10, 2014
I know charging a battery with too high a voltage isn't good for it, but what I'm proposing would only be for a short time and at a slightly higher voltage than normal. The battery I'm looking to charge is a standard lead-acid car battery, and the voltage I'm looking at running would be 17.5 V max, which is about 20% higher than the 14.4 V charging/running voltage an alternator puts out.

I would expect the charging time at this voltage to be only about 5-10 minutes, 15 minutes at most.

A voltage converter could be used to drop the source's normal voltage down to ~14.4 V, but that would waste a fair amount of energy as heat, and in this situation wasting energy is not desirable. Keeping it at about 17.5 V would eliminate the need for the converter, and I'm guessing it would charge slightly faster. A low-voltage cut-off circuit is used on the power source.

Suppose the battery does charge faster at the higher voltage. Consider the energy that goes into the battery: charging at 17.5 V @ 10 A is 175 W, whereas charging at 13.8 V @ 10 A would give 138 W. If the battery is very low in charge, will it store that extra 37 W, or would the excess be lost as heat? I'm not sure how batteries behave when they get a higher voltage than the cells can store. I suspect it would store the energy in this case, but I don't know.
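As a sanity check on the arithmetic above, here is a short script (the function names are mine; the fixed 10 A is the post's assumption, which a real battery won't necessarily honour):

```python
def charge_power(voltage_v, current_a):
    """Electrical power delivered at the battery terminals, in watts."""
    return voltage_v * current_a

def charge_energy_wh(voltage_v, current_a, minutes):
    """Energy delivered over a charge window, in watt-hours."""
    return charge_power(voltage_v, current_a) * minutes / 60.0

print(charge_power(17.5, 10))   # 175.0 W
print(charge_power(13.8, 10))   # 138.0 W, a 37 W difference
print(charge_energy_wh(17.5, 10, 10))  # ~29.2 Wh in a 10-minute charge
```

Note that this only accounts for energy delivered to the terminals; how much of it ends up stored versus dissipated is the question the replies below address.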

Thread Starter


Joined Oct 10, 2014
The battery will draw a lot more current at 17.5V than at 14.4V.
This is what I don't get. Is it because the internal resistance of the battery isn't high enough to limit the current at that voltage? Does that just mean the source battery will drain much faster?

What would happen if a constant-current/constant-voltage converter were used to limit the current to 10 A and the voltage to 17.5 V? I would think that would protect the source battery, but how would it affect the destination battery, the one being charged?


Joined Jun 4, 2014
The battery being charged looks like a 'fixed' voltage in series with a resistor. If it is, say, half charged, that resistor has a fairly low value. Effectively, any voltage above that 'fixed' voltage appears across that low resistance and determines the current.

Let's say the 'fixed' voltage is 13.5 V and the resistance is 0.1 Ω. At 14.4 V the current would be (14.4 V - 13.5 V) / 0.1 Ω, which is 9 A.
At 17.5 V the current would be (17.5 V - 13.5 V) / 0.1 Ω, which is 40 A.

If you connect a charger that limits the voltage to a maximum of 17.5 V and the current to a maximum of 10 A, the battery voltage would be a little over 14.4 V (about 14.5 V) and the current would be 10 A.
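The model in this reply can be sketched numerically. The 13.5 V 'fixed' voltage and 0.1 Ω resistance are the example figures from above; the current limit mimics a CC/CV charger:

```python
def charge_current(v_charger, v_emf=13.5, r_int=0.1, i_limit=None):
    """Current into a battery modelled as a fixed EMF in series with a
    resistor. With a current-limited (CC) charger, the charger's output
    voltage sags so the limit is respected."""
    i = (v_charger - v_emf) / r_int
    if i_limit is not None and i > i_limit:
        i = i_limit
    return i

def terminal_voltage(i, v_emf=13.5, r_int=0.1):
    """Battery terminal voltage while charging at current i."""
    return v_emf + i * r_int

print(charge_current(14.4))           # ~9 A, as in the reply
print(charge_current(17.5))           # ~40 A with no current limit
i = charge_current(17.5, i_limit=10)  # 10 A once the CC limit kicks in
print(terminal_voltage(i))            # ~14.5 V at the battery terminals
```

This is why a 17.5 V source with a 10 A limit ends up behaving much like an ordinary charger: the battery terminals never actually see 17.5 V while the current limit is active.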


Joined Oct 8, 2011
Charging at elevated voltages is OK for very short periods, but a lot depends on the temperature of the battery. That is why many modern vehicle charging systems use a temperature sensor on the battery. This allows the alternator to charge at a higher voltage when the battery is cooler, e.g. on LIN-based charging systems. Very often AGM lead-acid batteries are also used in these applications.
A lot depends on the capacity of the battery, the amount of current the charger can supply, and how far you deviate from the ideal charge rate of 10% of the Ah capacity of the battery.
Trying to force-charge a battery at an elevated voltage will generate a lot more heat (due to internal resistance and the increased current), and you are better off creating that heat outside the battery by putting an appropriately rated 1 Ω resistor in series. This will limit the current to about 17 A at most and give you a nicely tapered charge current as the cell voltages rise. Otherwise you may create excessive heat inside the individual cells, where it will only serve to increase the internal resistance and create even more heat in a never-ending cycle.
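The tapering behaviour described above can be sketched with the same EMF-plus-internal-resistance model used earlier in the thread. The 0.1 Ω internal resistance and the list of battery voltages are illustrative assumptions, not measured values:

```python
R_SERIES = 1.0   # ohm, external current-limiting resistor
R_INT = 0.1      # ohm, assumed battery internal resistance
V_SUPPLY = 17.5  # volt, the proposed charging source

def taper_current(v_batt):
    """Charge current with the series resistor in place."""
    return (V_SUPPLY - v_batt) / (R_SERIES + R_INT)

for v_batt in (11.0, 12.5, 13.5, 14.5, 16.0):
    i = taper_current(v_batt)
    p_ext = i ** 2 * R_SERIES  # heat dropped in the external resistor
    p_int = i ** 2 * R_INT     # heat generated inside the battery
    print(f"{v_batt:4.1f} V battery -> {i:4.2f} A "
          f"(resistor: {p_ext:5.1f} W, inside battery: {p_int:4.2f} W)")
```

As the battery voltage rises, the current falls off naturally, and at every point roughly ten times as much resistive heat is dissipated in the external resistor as inside the battery, which is the point of the suggestion.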
You will not get to store much more charge in a battery of a given Ah capacity by using a higher voltage; the battery is an electrochemical cell, not a capacitor! A slightly higher voltage does have one major advantage, in that it will help to equalise the charge level of the individual cells.