Solar Panel Dilemma

Thread Starter

Wingsy

Joined Dec 18, 2016
86
[Attachment: SolarPaneltest.png (solar panel test graph)]
Ok, here's my dilemma. (This is my lighthouse project, if anyone is keeping up with that.)

I have a solar panel charging a battery. At night the processor drives 16 white LEDs, only 3 on at a time. I have searched the entire world for a solar panel that will fit in the available space and have found only one that supplies anywhere near the power I need. It's rated at 5 V, 200 mA (possibly true at noon in the Sahara, but not here). In my project the panel must lie flat (red line on the graph). I've tested this solar panel quite thoroughly to see just what I can get out of it. It seems well suited for charging a lithium cell (4.2 V peak) but not so good for charging 4 NiMH AA cells (5.8 V peak). Trouble is, I can't drive white LEDs with a lithium cell - not quite enough voltage. The 4 AA cells would work fine, but I can't charge them with this panel.

So, I can use a DC-DC converter to bump the lithium voltage up to, say, 5 V to run the LEDs, OR I can use a converter to bump the panel voltage up to 6 V to charge the 4 AA cells. Right now I'm inclined to use a capacitive charge pump to boost the panel to 6 V and use NiMH batteries rather than a lithium. (Lithiums are oh so finicky about what you feed them.)

What would you do, and why?
 

wayneh

Joined Sep 9, 2010
17,498
Every solar light I've ever looked into, cheap or not, had a panel that gave an open-circuit voltage roughly double the battery voltage. They all use a boost after the battery to light the LEDs. I've never seen one set up to boost voltage to charge the battery. So I'd be tempted to follow that general layout scheme by using a lithium or 2 AAs and then use the DC-DC converter to supply the LEDs.

Nice data, by the way. It's rare we see real data obtained under relevant conditions.
 

Thread Starter

Wingsy

Joined Dec 18, 2016
86
I think I'd rather use lithium than 2 AAs.

The LEDs will pull 40 mA total at maximum brightness (and I don't know yet how much, if any, I can reduce that brightness until I see it working). That's going to suck a lot of current from 2 AA cells, and I was shooting for a run time of something like three nights' operation from a full charge (1350 mAh for the 4 AAs). If I boost the panel to 6 V with a converter and use 4 AAs, then I don't think I would get a charge current above 100 mA on a good day. 100 mA times 8 hrs of sun is only 800 mAh, good for around 20 hrs of LED time - almost 2 nights, and that's assuming the battery was depleted prior to that 8 hrs of charge. I would be disappointed if I got much less than this.
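To keep that arithmetic straight as the numbers change, here's the NiMH budget as a quick throwaway calculation (all inputs are my estimates above, nothing measured):

```c
/* NiMH energy budget - back-of-envelope check.
   All numbers are estimates from the post, not measurements. */
#include <stdio.h>

int main(void)
{
    double charge_ma   = 100.0;  /* est. charge current, panel boosted to 6 V */
    double sun_hours   = 8.0;    /* hours of usable sun per day */
    double led_load_ma = 40.0;   /* LED load at max brightness */

    double charged_mah = charge_ma * sun_hours;      /* 800 mAh per day */
    double led_hours   = charged_mah / led_load_ma;  /* ~20 h of LED time */

    printf("Daily charge: %.0f mAh -> %.0f h of LED run time\n",
           charged_mah, led_hours);
    return 0;
}
```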

So if I use a lithium I can charge directly off the solar panel, with a cutoff circuit to keep the lithium from going over 4.2 V. To charge a lithium properly I would need to maintain a certain level of current after reaching 4.2 V, which means monitoring the charge current in addition to the voltage. More parts, more s/w. So maybe I'll charge it improperly. I'm going to test a lithium to see just how much capacity it has if charged to 4.2 V and then disconnected from the charger. It should never get to a full charge, but that may be OK if I use a 2.5 Ah cell, and I have several of those. But using a lithium means I'll need more than 40 mA to drive the LEDs, since the DC-DC converter isn't 100% efficient, BUT... offsetting this is the fact that I'll get more than 100 mA charge current from the panel when charging the lithium.
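Roughly what I have in mind for the cutoff, as a sketch only - read_cell_mv() and set_charge_switch() are placeholders for whatever ADC channel and charge-disconnect FET the final design ends up with:

```c
/* Sketch of the "charge to 4.2 V, then disconnect" idea.
   The two extern functions are hypothetical stand-ins. */
#include <stdbool.h>
#include <stdint.h>

#define CUTOFF_MV 4200  /* stop charging at 4.2 V */
#define RESUME_MV 4050  /* hysteresis so the switch doesn't chatter */

extern uint16_t read_cell_mv(void);          /* hypothetical ADC read */
extern void set_charge_switch(bool closed);  /* hypothetical charge FET */

void charge_supervisor_poll(void)
{
    static bool charging = true;
    uint16_t mv = read_cell_mv();

    if (charging && mv >= CUTOFF_MV) {
        charging = false;  /* cell reached 4.2 V: disconnect panel */
    } else if (!charging && mv <= RESUME_MV) {
        charging = true;   /* cell sagged: reconnect and top up */
    }
    set_charge_switch(charging);
}
```

The hysteresis band is just there so the switch doesn't sit right at the 4.2 V threshold and chatter.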

I think my head may explode.

Place your bets... how much of its full capacity would I get from a lithium charged to 4.2 V and then taken off the charger?
 

wayneh

Joined Sep 9, 2010
17,498
Unfortunately the "charge to a voltage" strategy isn't so great for lithium. That said, I bet you'll get a decent result. It may overcharge itself eventually. You could use a battery tending IC but I'm not sure the cost and efficiency loss would justify it, compared to just replacing the battery a little sooner.
 

Thread Starter

Wingsy

Joined Dec 18, 2016
86
[Attachment: LithiumCharge.png (lithium charge curve)]
Looks like a lithium cell would charge to about 75% capacity if charged to 4.2 V and then disconnected, omitting the constant-voltage top-off charge at the end. For my 2.5 Ah cell it would charge to a max of 1.8 Ah. (That graph is for a 2 Ah cell.)

Assuming a nominal cell voltage of 3.6 V and a 5 V converter, a 40 mA load at 5 V would draw 0.2 W from the cell, or 56 mA. At 80% efficiency of the converter that increases the draw to 70 mA. Using a 10-hr night (summertime), 70 mA for 10 hours would pull 700 mAh per night. At my full charge of 1.8 Ah it would run for about 2-1/2 nights. If I can get 8 hrs of good sun for charging (~135 mA) I should be putting around 1 Ah into the cell per day, enough for about 1-1/2 nights. So for 2 days of good sun I save up enough unused energy to run for 1 night without sun on that 3rd day.
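Here's that lithium budget as the same sort of quick calculation, so I can re-run it as the numbers firm up (converter efficiency and charge current are still guesses):

```c
/* Lithium energy budget - reproduces the arithmetic above.
   Efficiency and charge current are assumptions, not measurements. */
#include <stdio.h>

int main(void)
{
    double cell_v      = 3.6;    /* nominal lithium cell voltage */
    double led_v       = 5.0;    /* DC-DC converter output */
    double led_ma      = 40.0;   /* LED load at the converter output */
    double efficiency  = 0.80;   /* assumed converter efficiency */
    double night_hours = 10.0;   /* summertime night */
    double usable_mah  = 1800.0; /* ~75% of the 2.5 Ah cell */
    double charge_ma   = 135.0;  /* est. panel current into the lithium */
    double sun_hours   = 8.0;    /* hours of good sun per day */

    double load_w    = led_v * led_ma / 1000.0;               /* 0.2 W */
    double draw_ma   = load_w / cell_v / efficiency * 1000.0; /* ~70 mA */
    double per_night = draw_ma * night_hours;                 /* ~700 mAh */

    printf("Draw from cell: %.0f mA -> %.0f mAh per night\n",
           draw_ma, per_night);
    printf("Full charge lasts %.1f nights\n", usable_mah / per_night);
    printf("Daily charge: %.0f mAh (~%.1f nights' worth)\n",
           charge_ma * sun_hours, charge_ma * sun_hours / per_night);
    return 0;
}
```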

Now complicate all this by the fact that many days I would get partial sun, meaning there is such a huge variable in play that I have no good feel for how well this is gonna work. If I only had a larger area for the solar panel I would use a bigger one and just overkill it. Mmmmm.

I believe the choice between a lithium (converter between the battery and the LEDs, direct charging from the solar panel) and NiMH (battery direct to the LEDs, converter between the solar panel and the battery) is a toss-up. Especially after seeing that I can get 75% capacity from the lithium by charging to 4.2 V and stopping, without the need to watch for overcharging.

Edit: Reading that graph a little more closely, it appears the lithium would charge to 85%, not 75%.
 