# Which is more important when charging a 12 V 100 Ah lithium battery: more current or more voltage?

#### mrel

Joined Jan 20, 2009
185
When charging a 100 Ah lithium battery:
I have two solar panels, 100 W each, 12 V each.
Do I need more current or more voltage? Hooking the solar panels up in parallel would give more current; connecting them in series would give more voltage. Which should I do when charging the lithium battery?

#### Ian0

Joined Aug 7, 2020
6,667
You must charge the battery at the specified voltage. If you exceed that voltage it will catch fire. If your charger voltage is lower, it will not charge.

#### Externet

Joined Nov 29, 2005
1,957
What is the data sheet for that battery? How many cells are in each solar panel, or where is its data sheet?

Without that information: panels in parallel will recharge the battery in less time. Panels in series can damage the battery, which should receive about 14 V for recharging, unless there is a management circuit built in.


#### BobTPH

Joined Jun 5, 2013
6,065
For charging a battery of that capacity, you absolutely need a charge controller to charge it in a reasonable time safely.

#### Tonyr1084

Joined Sep 24, 2015
7,191
You ask which is more important. BOTH are important. Proper voltage AND proper current.

Lithium batteries have a specific charge profile that must be followed. Increasing one or the other is a sure way to increase failure. Decreasing one or the other may mean a much longer charge time or even a failure to charge. To charge them properly they need to be charged - um - properly.

#### Irving

Joined Jan 30, 2016
3,181
As said, you must use an MPPT solar charger; they can be had relatively cheaply on eBay or Amazon. The charge controller will usually want the panels in series to give a nominal 24 V at 8 A in maximum sunlight. It will track the panels' maximum power point to get the best out of them. It will typically output around 13.5 V at a roughly constant 12 A until the battery voltage reaches approximately 12.6 V (3 × 4.2 V for Li-ion), then hold a fixed 12.6 V until the current drops to around 1 A, and then shut off until the voltage drops below 12.3 V or so. The battery will take approximately 9.5 hours to charge from empty.
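Irving's 9.5-hour figure can be sanity-checked with rough arithmetic. The numbers below (12 A bulk current, roughly 80% of capacity delivered during the constant-current phase, half-rate average current during the constant-voltage tail) are illustrative assumptions, not datasheet values:

```python
# Rough charge-time estimate for a 100 Ah battery under CC/CV charging.

CAPACITY_AH = 100.0
BULK_CURRENT_A = 12.0
CC_FRACTION = 0.8   # assume ~80% of the charge goes in during the CC phase

cc_hours = CAPACITY_AH * CC_FRACTION / BULK_CURRENT_A              # ~6.7 h
# The constant-voltage tail is slower; as a crude model, assume the
# average current during CV is half the bulk current.
cv_hours = CAPACITY_AH * (1 - CC_FRACTION) / (BULK_CURRENT_A / 2)  # ~3.3 h

total_hours = cc_hours + cv_hours
print(f"Estimated charge time: {total_hours:.1f} h")  # ~10 h, in line with 9.5 h
```

This is only a back-of-envelope model; a real controller's taper behaviour depends on the battery's internal resistance and the controller's cutoff current.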

#### bassbindevil

Joined Jan 23, 2014
628
What Irving said, although I'll add that "12 volt" panels are usually around 20V open-circuit, and about 18V when loaded to maximum power point. So you're best off running the panels in parallel to get more current. MPPT charge controllers are much more expensive than PWM chargers, and the efficiency improvement isn't huge (roughly 10%). The most important thing is to find a charge controller that matches your battery chemistry.

#### MrSalts

Joined Apr 2, 2020
2,597
Most important is voltage but controlling voltage is the key, not maximizing it.

#### Audioguru again

Joined Oct 21, 2019
5,410
The very high Ah battery might be a newer lower-voltage LiFePO4 type.
Its charging details are probably printed on it.


#### kaindub

Joined Oct 28, 2019
112
Here's my opinion. It differs from some of the other replies, but I have been using lithium for a long time.
Batteries are charged initially at constant current. You set the charge current and away it goes. Once the battery reaches its maximum charge voltage, the current is reduced (automatically by the charge controller), and this continues for some 30 minutes to an hour (it's called the top-up charge). Any more charge and the cells start to degrade.
You can build your own controller, or they are available cheaply online.
If you connect the panels in parallel, use a simple PWM charge controller. There is little difference between PWM and MPPT controllers in this range in terms of charge time.
With panels in series, use an MPPT controller. The controller will pull the most power available from the panels and convert it to a lower voltage and higher current. Series panels and an MPPT controller will charge the battery the quickest.
Also, series panels will start charging earlier in the day.
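To put rough numbers on the PWM-vs-MPPT question, here is a sketch comparing the power one "12 V" 100 W panel can deliver under each controller type. The panel figures (Vmp = 18 V) and the bulk-charge battery voltage (13.5 V) are typical assumptions, not measured values, and this shows the theoretical ceiling; real-world gains are usually smaller, closer to the ~10% mentioned earlier in the thread:

```python
# Illustrative PWM vs MPPT harvest from one "12 V" 100 W panel.

VMP = 18.0           # assumed panel voltage at maximum power point
IMP = 100.0 / VMP    # panel current at maximum power point (~5.6 A)
V_BATT = 13.5        # assumed battery voltage during bulk charging

# A PWM controller effectively clamps the panel to battery voltage,
# so the panel delivers roughly Imp at V_batt rather than at Vmp.
p_pwm = IMP * V_BATT          # ~75 W

# An MPPT controller converts Vmp * Imp down to battery voltage,
# minus converter losses (assume ~95% efficiency).
p_mppt = 0.95 * VMP * IMP     # ~95 W

print(f"PWM:  {p_pwm:.0f} W")
print(f"MPPT: {p_mppt:.0f} W")
```

In practice, panel temperature, partial shading, and converter quality all move these numbers around.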

#### Ya’akov

Joined Jan 27, 2019
6,853
It might help to consider the relationship between voltage and current described by Ohm’s Law.

$$I = \frac{E}{R}$$

This means that current (I) is directly proportional to voltage (E), and inversely proportional to resistance (R).
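The point in numbers: with $I = E/R$, the same applied voltage drives very different currents depending on the cell's effective resistance. The resistance values below are purely illustrative:

```python
# Ohm's law applied to charging: I = E / R.

def current_amps(voltage_v: float, resistance_ohm: float) -> float:
    """Ohm's law: I = E / R."""
    return voltage_v / resistance_ohm

# A depleted cell has a low effective resistance at the charge voltage,
# so applying the full 4.2 V immediately would push a huge current:
print(f"{current_amps(4.2, 0.1):.0f} A")  # 42 A -- far beyond a safe charge rate
# A nearly full cell presents a much higher effective resistance:
print(f"{current_amps(4.2, 4.2):.0f} A")  # 1 A
```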

When you first start to charge a depleted lithium-ion cell (or any secondary, that is, rechargeable, cell), it has a very low resistance at the charge voltage (4.2 V, generally). So, if you were to apply 4.2 V from the start, the current would be much more than the battery can handle; it would heat up and bad things would happen.

So, the charging strategy for Li-ion cells is called constant current / constant voltage. That is, first the charger applies just enough voltage to reach the charging current limit. This is usually 1 A, but it can vary depending on the design of the battery.

At first, this voltage will be relatively low, something just over the current OTV (Open Terminal Voltage) of the cell or battery. For a Li-ion cell, this can be no less than about 2.5 V. Below this there is a danger of the cell forming internal shorts, and attempting to charge such a cell can lead to a really good YouTube video if you have your phone handy (though I would grab an extinguisher to put out whatever was lit on fire by the flamethrower formerly known as a battery; but hey, choose your priorities).

So at 2.5V it is only going to take a few more tenths of a volt to get 1A of current. That’s why the start of charging is about managing current—hence, constant current, that is, keeping the current to 1A by adjusting the voltage. But, as the cell/battery approaches the 4.2V per cell charging limit, maintaining 1A of current would mean overcharging. You would have to use a much higher voltage than the cell can handle to maintain that 1A.

So, at some point in the charge cycle (i.e., when 4.2 V produces less than 1 A into the cell), the charger switches strategy and starts to control the voltage instead. It will hold 4.2 V across the cell, and the current will diminish as the cell's resistance increases. At some point, 4.2 V will result in 0 A (or as near to it as makes no difference) and the charging stops.
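The constant-current / constant-voltage strategy described above can be sketched as a simple control step. The cell model and the 1 A / 4.2 V / 50 mA figures are illustrative assumptions for a single Li-ion cell, not a real charger design:

```python
# Minimal sketch of a CC/CV charge-control decision for one Li-ion cell.

CHARGE_CURRENT_A = 1.0    # CC-phase current limit
CHARGE_VOLTAGE_V = 4.2    # CV-phase voltage limit
CUTOFF_CURRENT_A = 0.05   # stop when the CV current tapers to ~50 mA

def charger_step(cell_ocv_v, cell_resistance_ohm):
    """Return (mode, applied_volts, current_amps) for one control step."""
    # Voltage the charger would need to push the full CC current into the cell:
    v_for_cc = cell_ocv_v + CHARGE_CURRENT_A * cell_resistance_ohm
    if v_for_cc < CHARGE_VOLTAGE_V:
        # CC phase: adjust the voltage to hold the current at the limit.
        return ("CC", v_for_cc, CHARGE_CURRENT_A)
    # CV phase: hold 4.2 V and let the current taper off on its own.
    i = (CHARGE_VOLTAGE_V - cell_ocv_v) / cell_resistance_ohm
    if i <= CUTOFF_CURRENT_A:
        return ("DONE", 0.0, 0.0)
    return ("CV", CHARGE_VOLTAGE_V, i)

print(charger_step(3.0, 0.1))    # depleted cell  -> CC at 3.1 V, 1 A
print(charger_step(4.15, 0.1))   # nearly full    -> CV at 4.2 V, ~0.5 A
print(charger_step(4.199, 0.1))  # below cutoff   -> DONE
```

A real charger also handles the pre-charge (trickle) phase below ~2.5 V, temperature limits, and safety timers, none of which are modelled here.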

The fully charged cell won’t stay at 4.2V for long; that’s just the charge voltage. Over the life of a charge, the average voltage will be about 3.8V; that’s why they call it a 3.8V cell even though it is charged to 4.2V.

So, to answer your question, “which is more important, voltage or current?”, the answer is “yes”. That is, voltage and current are inextricably linked. The formula $$P = I \times E$$ says that the product of current and voltage is power (P), expressed in watts. The amount of power you can put into the charging cell at any one time will vary, so the relevant parameter (voltage or current) is determined by the state of the cell; it isn’t fixed.

As an aside, Li-ion cells will have a longer useful life if you don’t charge them fully or discharge them fully. So, if you specify a cell with a capacity, say, 20% more than your application requires, never charge it past that 20%-lower state, and don’t discharge it fully so that the charge cycle is less “deep”, the cell will have a considerably longer life.

Unlike other chemistries, there is no downside to charging a LiIon cell as often as you can manage. There is no “memory effect”, and shallower charge cycles mean much longer service life.

#### Audioguru again

Joined Oct 21, 2019
5,410
A "12V" Li-ion battery has 3 cells and is 11.1V at half charge, 12.6V when fully charged.
A "12V" LiFePO4 battery has 4 cells and is 12.8V at half charge, 14.4V when fully charged.
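These two chemistries can be captured in a small lookup that also recovers the per-cell charge voltage. The figures come from the post above; always confirm against the battery's own label or datasheet:

```python
# "12 V" pack voltages by chemistry: (cells, half_charge_v, full_charge_v).

PACK_12V = {
    "Li-ion":  (3, 11.1, 12.6),
    "LiFePO4": (4, 12.8, 14.4),
}

def per_cell_full_v(chemistry: str) -> float:
    """Full-charge voltage divided across the series cells."""
    cells, _, full = PACK_12V[chemistry]
    return full / cells

print(per_cell_full_v("Li-ion"))   # 4.2 V per cell
print(per_cell_full_v("LiFePO4"))  # 3.6 V per cell
```

This is why a charger set for the wrong chemistry is dangerous: a 14.4 V LiFePO4 setpoint applied to a 3-cell Li-ion pack would push each cell well past 4.2 V.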

#### MisterBill2

Joined Jan 23, 2018
13,704
IF you know the exact charging voltage required by the battery, and IF you know the maximum allowable charging current, and IF maximum efficiency is not a primary requirement, THEN you can use a voltage regulator to set the charge voltage and a series resistor to limit the current. The circuit will work, but will not be nearly as efficient as the more complex systems.
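Sizing that series resistor is a one-line worst-case calculation. The numbers below (14.4 V regulator setpoint, 10 A current limit, 12.0 V depleted pack voltage) are illustrative assumptions for a "12 V" LiFePO4 pack, not a worked design:

```python
# Series-resistor sizing for a regulator + resistor charger sketch.

V_REG = 14.4        # regulator output, set to the pack's charge voltage
V_BATT_MIN = 12.0   # lowest pack voltage expected at the start of charge
I_MAX = 10.0        # maximum allowable charge current

# Worst-case current flows when the battery is at its lowest voltage,
# so size the resistor for that case:
r_limit = (V_REG - V_BATT_MIN) / I_MAX   # 0.24 ohm
p_resistor = I_MAX ** 2 * r_limit        # 24 W dissipated at full current

print(f"R = {r_limit:.2f} ohm, power rating >= {p_resistor:.0f} W")
```

The 24 W burned in the resistor at full current is exactly the inefficiency the post warns about; it also means the charge current falls off as the battery voltage rises, stretching the charge time.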