9V battery charger

wayneh

Joined Sep 9, 2010
17,496
The op-amp (which should be replaced with a comparator) watches the battery voltage and turns off charging (and turns on the red LED) when the voltage exceeds the setpoint. That's not the greatest strategy for every battery chemistry. What type of battery are you charging?
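For what it's worth, here's a minimal Python sketch of that voltage-cutoff logic; the setpoint value is an assumption for illustration, not taken from the schematic:

```python
# Sketch of simple voltage-cutoff termination, as the op-amp/comparator
# stage does in hardware. V_SETPOINT is an assumed example value, not
# a value from the schematic.
V_SETPOINT = 9.6  # volts, assumed cutoff for a "9V" pack

def charger_state(v_batt):
    """Return 'DONE' (charging off, red LED on) above the setpoint."""
    return "DONE" if v_batt >= V_SETPOINT else "CHARGING"

for v in (8.9, 9.3, 9.7):
    print(f"{v:.1f} V -> {charger_state(v)}")
```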

What is IC2?
 

Thread Starter

emilj726

Joined Oct 1, 2010
56
What supply voltage should I use? 9V?
Is 230 mA an OK charge current for a 230 mAh battery?
The charge current depends on R10, correct?
 

t06afre

Joined May 11, 2009
5,934
It would perhaps be simpler to use an LM317 as a constant-current source, plus a timer switch. I am not sure whether your battery can tolerate fast charging, and I could not find any info about it. As a rule of thumb, you should be safe if the charge current is 1/10th the amp-hour rating of the cell. At that charging current your battery should be fully charged after 12 to 14 hours (depending on how deeply it was discharged). Letting it charge longer may damage the battery. The best thing would probably be to use a dedicated charger IC; it would probably cost you less than building the schematic you showed us.
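To put numbers on that, here's a quick Python sketch using the standard LM317 constant-current relationship (I = 1.25 V / R_set) and the C/10 rule; the 230 mAh figure comes from the thread, the rest are rules of thumb:

```python
# Sketch of the C/10 rule and the LM317 constant-current math from the
# post above. The 230 mAh capacity comes from the thread; the LM317
# relationship I = 1.25 V / R_set is the standard one from its datasheet.

CAPACITY_MAH = 230.0                   # battery capacity from the thread
I_CHARGE_MA = CAPACITY_MAH / 10        # C/10 rule of thumb -> 23 mA

# LM317 as a current source: the regulator holds 1.25 V across R_set.
V_REF = 1.25                           # volts
r_set = V_REF / (I_CHARGE_MA / 1000)   # ohms

# The 12-14 hour figure is capacity/current times a ~1.2-1.4
# charge-inefficiency factor.
t_min = CAPACITY_MAH / I_CHARGE_MA * 1.2
t_max = CAPACITY_MAH / I_CHARGE_MA * 1.4

print(f"Charge current: {I_CHARGE_MA:.0f} mA")
print(f"LM317 set resistor: {r_set:.0f} ohms (nearest standard value ~56)")
print(f"Expected charge time: {t_min:.0f} to {t_max:.0f} hours")
```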
 

Thread Starter

emilj726

Joined Oct 1, 2010
56
Yeah, using the LM317 would be easier, but I had all of the components except Q1 (I will use something else for that) in my junk box, so I started building it.
Any suggestions for Q1?

Plus I have never designed anything with the CD4011, so I figured this would be a great chance for that.
 

Thread Starter

emilj726

Joined Oct 1, 2010
56
So when the output of the comparator is high, IC2B acts as a one-shot, and Q2 is biased on via R12 and R13.
R11 is then connected to ground, so it monitors the voltage across the battery? Am I understanding that correctly?
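If it helps, here's a rough Python sketch of the one-shot timing; the R and C values are placeholders since the schematic's actual values aren't given here, and it assumes the usual CMOS-gate monostable rule of thumb t ≈ 0.7·R·C:

```python
# Rough timing sketch for a CD4011 NAND-gate one-shot like IC2B.
# R and C are placeholder values (not taken from the schematic).
# With the gate threshold near VDD/2 the pulse width is roughly
# t = R * C * ln(2), i.e. about 0.7 * R * C.
import math

R = 100e3   # ohms, assumed timing resistor
C = 1e-6    # farads, assumed timing capacitor

t_pulse = R * C * math.log(2)
print(f"Approximate one-shot pulse width: {t_pulse * 1000:.0f} ms")
```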

What is the purpose of R8?

If anyone can verify this it would be greatly appreciated.
 

wayneh

Joined Sep 9, 2010
17,496
Charging NiMH by voltage control is not a good strategy, since the voltage peaks and then begins to drop during overcharge. The better commercial NiMH chargers use −∆V, I believe, meaning they watch the slope of the voltage versus time; when that becomes flat or negative, charging is stopped.
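A minimal Python sketch of that −∆V idea, with an assumed sensitivity threshold and a made-up charge curve:

```python
# Sketch of -dV termination as described: sample the battery voltage
# periodically and stop when the slope goes flat or negative. The
# threshold and the charge curve below are illustrative assumptions.
DV_TERM = -0.005  # volts per sample; assumed sensitivity

def should_stop(samples):
    """Terminate once the latest voltage step is at or below DV_TERM."""
    return len(samples) >= 2 and (samples[-1] - samples[-2]) <= DV_TERM

log = []
# Synthetic curve: voltage rises, peaks, then sags into overcharge.
for v in (8.90, 9.10, 9.30, 9.45, 9.50, 9.48):
    log.append(v)
    if should_stop(log):
        print(f"Stop charging at {v} V (negative slope detected)")
        break
```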
 

bertus

Joined Apr 5, 2008
22,270

Attachments

thatoneguy

Joined Feb 19, 2009
6,359
If you don't want to kill your battery by improper charging, I'd suggest building a circuit around a charge-controller IC that monitors ΔV and ΔT.

The Maxim DS2711 would give you the features you want while also simplifying your circuit. See Figure 3 on page 6 of the PDF datasheet just linked. The DS2711 can work with NiMH or NiCd batteries. If you plan on charging Li-ion batteries, you will need a much more advanced charge controller and protection circuitry.

Optimum charging monitors both voltage and temperature. The most common method is "Delta V", which detects when the voltage-versus-time curve reverses slope. For faster response, "Delta T" is added as well, which detects when the temperature starts to rise rapidly over a given interval, indicating full charge.

If you use only ΔV and a cell is weak, it can be killed while the charger waits for the slope to go negative; a ΔT circuit will charge the battery to maximum capacity without overheating it.

Batteries should NOT get too hot to hold your finger on for 5 seconds during a charge cycle if you expect to get the maximum lifetime from them. Some warming is always present, since the inefficiency of the battery chemistry is emitted as heat.
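As a rough illustration of combining the two cutoffs (a part like the DS2711 does this in hardware with its own fixed parameters; the thresholds below are assumptions):

```python
# Sketch of dual dV/dT termination as described above. Both thresholds
# are illustrative assumptions; a dedicated charge-controller IC
# implements this in hardware with its own fixed parameters.
DV_TERM = -0.005   # volts per sample; assumed negative-slope cutoff
DTDT_TERM = 1.0    # deg C per minute; assumed temperature-rise cutoff

def charge_complete(dv, dt_per_min):
    """Stop on EITHER a voltage-slope reversal or a rapid temperature
    rise, whichever trips first (dT/dt usually wins on a weak cell)."""
    return dv <= DV_TERM or dt_per_min >= DTDT_TERM

print(charge_complete(dv=0.010, dt_per_min=0.2))  # False: still charging
print(charge_complete(dv=0.020, dt_per_min=1.5))  # True: dT/dt tripped
```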

--ETA: The above, less the charge control IC, is mentioned in the PDF linked a few posts up.
 
Last edited: