Balancing the charging current between a solar panel and an electric battery charger.

Thread Starter

Mussawar

Joined Oct 17, 2011
95
Hi,
I have
A solar panel (with a charge controller) that can provide 6 Amp DC.
An electric battery charger that can provide 10 Amp DC from 220 V AC. Its charging current is adjustable through a potentiometer.
Both have a “battery full” cut-off adjustment that prevents overcharging the battery.
A 12 V, 100 Ah lead-acid battery that is intended to be charged from both the charger and the solar panel.
In general I need 10 A of charging current, and obviously I want to utilize the solar panel as much as possible. The battery should therefore preferably get the maximum available current from the solar panel and the remainder from the electric charger. For example, if the solar panel provides 6 A in bright sunshine, the remaining 4 A should come from the electric charger (10 A total). In the evening, if the solar panel provides 3 A, the remaining 7 A should come from the electric charger.

Somebody please give me an idea of how I should design this circuit.
 

NorthGuy

Joined Jun 28, 2014
611
You don't need to charge your battery 24x7, do you?

The standard way of dealing with this is to let the battery charge from solar during the day and, if that is not enough, charge it with the charger during the night.
 

Thread Starter

Mussawar

Joined Oct 17, 2011
95
You don't need to charge your battery 24x7, do you?

The standard way of dealing with this is to let the battery charge from solar during the day and, if that is not enough, charge it with the charger during the night.
Actually I use this battery through an inverter when mains power is not available, so the battery gets discharged. I want to keep the battery fully charged according to the criteria given above.
 

Lestraveled

Joined May 19, 2014
1,946
- If you used "OR-ing" diodes to somehow sum the currents from the charge controller and the battery charger, the voltage sensing for "end of charge" would be offset by the diode voltage drop, and the battery would only reach about 70% of charge.
- The other way is to make your charger act like a solar panel and diode-OR it together with the solar panel before the charge controller. But your charger is probably not in the same voltage range as your panel, and your charge controller would have to be able to handle 10 A of output.

Regardless of the topology, you would have to build a feedback circuit to control the battery charger. It may be more trouble than it is worth: two active, smart battery chargers do not play well together charging the same battery. Look for a way to have two power (current) sources and only one battery charger.

I suggest that you consider a more conventional method: when AC is on, use the charger alone; when AC is off, use the solar. One AC relay will accomplish this for you.
 

wayneh

Joined Sep 9, 2010
17,498
What I don't get is why you would run off the battery and inverter when AC is available. It's very inefficient to suffer all the losses through the charger, battery, and inverter just to end up back at AC. If you have AC, use it directly and use the solar charger for charging the battery backup.
 

NorthGuy

Joined Jun 28, 2014
611
It is very hard to drive 10 A into a fully charged 100 Ah battery. This would require an excessive voltage and would most likely destroy the battery very quickly.

How big is your load?
 

MikeML

Joined Oct 2, 2009
5,444
Assuming that you can parallel the solar charger with the AC charger with no isolation diodes between them, set the Solar charger voltage limit (open circuit) to 14.4V, and set the AC charger voltage limit to 13.8V (open circuit). If the battery is initially badly discharged, its terminal voltage will start out near 12V. This will cause both chargers to drive their maximum current into the battery. A 100AH battery should take a charge rate of 6A+10A with no problem.

When the battery voltage reaches 13.8V, it is about 90% charged. At that point, the AC charger stops supplying current, but the Solar charger keeps driving its 6A into the battery until the battery voltage reaches 14.4V (fully charged). If the sun stops shining before that happens, the battery is floated all night at 13.8V by the AC charger at low current, much less than 10A, during which time the battery will accumulate most of the missing 10% charge.... Battery should have between 95% and 100% charge by next morning.

If the two chargers require the addition of diodes to prevent one backfeeding into the other, you can still do as I say, but set the voltages higher to overcome the forward drops of the diodes...

Here is the essence of the idea. Look at the simulated battery voltage V(b+), and the currents delivered by the two chargers I(D3) and I(D4). The horizontal axis (time) is arbitrary....
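MikeML's two-setpoint idea can be sanity-checked numerically. The sketch below is not his simulation (the attachment is not available here); it is a rough charge-integration model with illustrative constants: the AC charger supplies up to 10 A and tapers to zero at 13.8 V, the solar charger supplies up to 6 A and tapers to zero at 14.4 V, and the battery voltage is a crude linear function of state of charge. The taper width and voltage curve are assumptions for illustration only.

```python
# Rough numerical sketch of the two-setpoint scheme: two voltage-limited
# chargers in parallel on one battery. All constants are illustrative.

CAPACITY_AH = 100.0
DT_H = 0.1  # simulation time step, hours

def charger_current(v_batt, i_max, v_limit, taper=0.4):
    """Current from a voltage-limited charger: full current well below
    the limit, tapering linearly to zero over the last `taper` volts."""
    if v_batt >= v_limit:
        return 0.0
    headroom = v_limit - v_batt
    return i_max * min(1.0, headroom / taper)

def battery_voltage(soc):
    """Very crude open-circuit voltage vs. state of charge (0..1)."""
    return 11.9 + 2.6 * soc

soc = 0.3  # start at 30% charge (badly discharged battery)
for step in range(200):  # about 20 simulated hours
    v = battery_voltage(soc)
    i_ac = charger_current(v, i_max=10.0, v_limit=13.8)    # AC charger
    i_solar = charger_current(v, i_max=6.0, v_limit=14.4)  # solar
    soc = min(1.0, soc + (i_ac + i_solar) * DT_H / CAPACITY_AH)

print(round(battery_voltage(soc), 2))  # settles near the 14.4 V solar limit
```

The qualitative behavior matches the post: both chargers deliver full current while the battery is low, the AC charger drops out first at 13.8 V, and the solar charger finishes the charge up to 14.4 V.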
 



Thread Starter

Mussawar

Joined Oct 17, 2011
95
It is very hard to drive 10 A into a fully charged 100 Ah battery.
What I meant was that a 100 Ah battery takes a maximum charging current of 10 A at the start (10% of total capacity). Currently my load draws 10 to 50 A from the battery, but in future it might be 40 to 50 A from a 200 Ah battery with two or more solar panels. What I intend is to formulate this generally for a medium power/load system.
 

Thread Starter

Mussawar

Joined Oct 17, 2011
95
Assuming that you can parallel the solar charger with the AC charger with no isolation diodes between them, set the Solar charger voltage limit (open circuit) to 14.4V, and set the AC charger voltage limit to 13.8V (open circuit). If the battery is initially badly discharged, its terminal voltage will start out near 12V. This will cause both chargers to drive their maximum current into the battery. A 100AH battery should take a charge rate of 6A+10A with no problem.

When the battery voltage reaches 13.8V, it is about 90% charged. At that point, the AC charger stops supplying current, but the Solar charger keeps driving its 6A into the battery until the battery voltage reaches 14.4V (fully charged). If the sun stops shining before that happens, the battery is floated all night at 13.8V by the AC charger at low current, much less than 10A, during which time the battery will accumulate most of the missing 10% charge.... Battery should have between 95% and 100% charge by next morning.

If the two chargers require the addition of diodes to prevent one backfeeding into the other, you can still do as I say, but set the voltages higher to overcome the forward drops of the diodes...

Here is the essence of the idea. Look at the simulated battery voltage V(b+), and the currents delivered by the two chargers I(D3) and I(D4). The horizontal axis (time) is arbitrary....
Thanks, you gave a good idea: the AC charger will charge up to 13.8 V and stop, while solar keeps charging up to 14.4 V (fully charged) and then stops. I think that for a 100 Ah battery the charging current should not exceed 10 A, as the usual recommendation is a maximum charging current of 10% of the battery's total capacity. I prefer to add diodes for isolation.
Please give me an opinion: what if I sense the solar current by some means and use it as feedback to the AC charger to limit its current to 4 A? When the sun is absent (no solar current), there is no feedback, so the AC charger operates normally and charges the battery at full current.
 

Thread Starter

Mussawar

Joined Oct 17, 2011
95
- If you used "OR-ing" diodes to somehow sum the currents from the charge controller and the battery charger, the voltage sensing for "end of charge" would be offset by the diode voltage drop, and the battery would only reach about 70% of charge.
- The other way is to make your charger act like a solar panel and diode-OR it together with the solar panel before the charge controller. But your charger is probably not in the same voltage range as your panel, and your charge controller would have to be able to handle 10 A of output.

Regardless of the topology, you would have to build a feedback circuit to control the battery charger. It may be more trouble than it is worth: two active, smart battery chargers do not play well together charging the same battery. Look for a way to have two power (current) sources and only one battery charger.

I suggest that you consider a more conventional method: when AC is on, use the charger alone; when AC is off, use the solar. One AC relay will accomplish this for you.
Thanks. Your last suggestion is the simplest and most convenient, but I don't want the solar panel to take even a little rest all day; that's why I bought it. :)
OR-ing both chargers through diodes is essential because they have different voltage and current capabilities. As I asked MikeML in my reply, I'm thinking about the feedback strategy. Please have a look and tell me whether it is a good approach.
Regards.
 

MikeML

Joined Oct 2, 2009
5,444
You should read up on the method for charging sealed lead-acid batteries, and make sure that each of your chargers complies with that method when used alone.

If each is a proper charger per the description above, paralleling the two chargers (with external diodes, if needed) will work. The only caveat is that, per the battery maker's data sheet, it will tolerate a maximum charge rate of 16 A.
 

NorthGuy

Joined Jun 28, 2014
611
What I meant was that a 100 Ah battery takes a maximum charging current of 10 A at the start (10% of total capacity). Currently my load draws 10 to 50 A from the battery, but in future it might be 40 to 50 A from a 200 Ah battery with two or more solar panels. What I intend is to formulate this generally for a medium power/load system.
That's a big load for the battery. What is the nature of this load? Does it last a few minutes, an hour, longer? How long are the periods when the load is off?
 

ErnieM

Joined Apr 24, 2011
8,377
A 50 A load off a 200 Ah battery will drain it quickly. Unless you have an unlimited supply of cheap replacement batteries, you should limit the draw to at most 1/4 of the capacity; less is better. The USCG pegs their daily usage at 1/30th.

So don't run your 50 A load for more than an hour between charges.
 

MikeML

Joined Oct 2, 2009
5,444
Here is a PDF for a typical 100 Ah SLA. It states right on its data sheet that the maximum allowable charging current is 20 A. It also shows discharge times versus various types of loads. The OP should read and understand this data sheet as a reality check on his intended application.
 

Thread Starter

Mussawar

Joined Oct 17, 2011
95
If each is a proper charger per the description above, paralleling the two chargers (with external diodes, if needed) will work. The only caveat is that, per the battery maker's data sheet, it will tolerate a maximum charge rate of 16 A.
The AC charger is microcontroller based and behaves as required above. The solar side has no specific arrangement, but its current decreases naturally as the battery voltage rises.
 

MikeML

Joined Oct 2, 2009
5,444
The AC charger is microcontroller based and behaves as required above. The solar side has no specific arrangement, but its current decreases naturally as the battery voltage rises.
So the solar charger has no regulator at all? Or does it have a constant-voltage regulator with no step-down of voltage after the battery current decreases? If you were going to use the solar charger alone, what final voltage would it raise the battery to?
 

Thread Starter

Mussawar

Joined Oct 17, 2011
95
Here is a PDF for a typical 100 Ah SLA. It states right on its data sheet that the maximum allowable charging current is 20 A.
I'm sorry, but the PDF page was not found. I'm not a battery specialist, but I've generally read in many articles that a lead-acid battery should have an average charging current of 10% of its total capacity. Anyway, what I actually want is a maximum combined charging current of 10 A from both sources.
 

Thread Starter

Mussawar

Joined Oct 17, 2011
95
So the solar charger has no regulator at all? Or does it have a constant-voltage regulator with no step-down of voltage after the battery current decreases? If you were going to use the solar charger alone, what final voltage would it raise the battery to?
The solar setup has a charge controller that only disconnects charging when the battery voltage reaches a specified value (say 14.4 V; it is adjustable).
 

Thread Starter

Mussawar

Joined Oct 17, 2011
95
A 50 A load off a 200 Ah battery will drain it quickly. Unless you have an unlimited supply of cheap replacement batteries, you should limit the draw to at most 1/4 of the capacity; less is better. The USCG pegs their daily usage at 1/30th.

So don't run your 50 A load for more than an hour between charges.
50 A is the peak load current and may exist only for a short period (2 to 10 minutes). Average load current is approximately 20 ± 5 A. The battery won't drain as quickly because solar is providing a constant 6 A of charging, so if the load draws 16 A, the battery's net discharge rate is just 10 A. Won't it?
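The arithmetic in that claim is worth checking explicitly: while the solar panel is delivering current, the battery only supplies the difference between the load current and the solar contribution. A minimal sketch using the numbers from the posts above:

```python
# Net battery drain while solar is contributing: the battery supplies
# only load current minus solar current. Numbers from the thread above.

solar_a = 6.0          # constant solar charging current
peak_load_a = 16.0     # example load from the post above
net_discharge_a = peak_load_a - solar_a
print(net_discharge_a)  # 10.0 A drawn from the battery itself
```

Note this only holds while the sun is shining; at night the battery carries the full load current.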
 

MikeML

Joined Oct 2, 2009
5,444
I'm sorry, but the PDF page was not found. I'm not a battery specialist, but I've generally read in many articles that a lead-acid battery should have an average charging current of 10% of its total capacity. Anyway, what I actually want is a maximum combined charging current of 10 A from both sources.
Try this link.
What you read on the internet and what you read on the manufacturer's data sheet are usually two different things...
 