Charging a 12V battery and using it

Thread Starter

ShockBoy

Joined Oct 27, 2009
186
I'm curious, ultra basic question for most of you out there, but when putting 14.4 volts on a 12 volt battery; and also connected to the battery is let's say a string of 12 volt led lights. I would think that the 14.4 volts would be sent to the light string and destroy it. Is this not the case, and if not, would someone explain this to me?
Thanks All.
 

retched

Joined Dec 5, 2009
5,207
The LEDs are CURRENT-controlled devices, not VOLTAGE-controlled.

So the 14.4 V going to the battery won't do much harm without current behind it; that's why you use CURRENT-limiting resistors to keep the LEDs from going 'POP' ;)

If you design your LED circuit to operate over the whole range of voltages the battery sees in use and while charging, for instance by using a 7812 voltage regulator, only 12 V will reach the rest of the circuit, and the resistors will keep the LEDs from drawing too much CURRENT.

Basically, what you're doing is making an unregulated power supply.

You need to regulate it before it reaches your LED circuit (or any circuit, in this case).
 

Thread Starter

ShockBoy

Joined Oct 27, 2009
186
I'm not necessarily building a circuit. I have LED strings that usually run off a wall wart, hooked up for emergency backup lighting. So a 12 V TV, or a DC-to-AC inverter, has the voltage regulator built in to account for the charging cycle? If I want to connect something directly to the battery, I will need to run it through the 7812 first?
 

retched

Joined Dec 5, 2009
5,207
If you read the specs on the inverter, it will probably say something along the lines of:
INPUT: 10-16 V DC
OUTPUT: 110-125 V AC

So they can handle the swing.

Many are designed to be used with solar systems, so they charge whenever there is sun; when a cloud (or night) comes by, the charging voltage and current drops or stops.

This allows the inverter to be used day or night, regardless of whether the battery is being charged.
 

_Maxi

Joined Oct 21, 2009
4
I think he is asking this:

Why don't the LEDs get burned if they are supposed to work with 12 V instead of 14.4 V? Does the 14.4 V reach the LEDs? If not, why not?

I have the same doubt, hehe. Bye!!
 

Bosparra

Joined Feb 17, 2010
79
The forward voltage across a diode stays roughly constant. For LEDs it is anywhere from about 2.2 V to 3.9 V, depending on the type of LED. Because the voltage across the diode stays nearly constant, the current increases sharply with any increase in supply voltage, and that increase in current is what eventually destroys an LED. That's why a current-limiting resistor should be used to limit the current. The LED never actually 'sees' the full input voltage.

A more sophisticated LED driver circuit will use a current regulator, which keeps the current constant over a very wide voltage range.
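To put numbers on that, here is a quick sketch (plain Python; the 3.2 V forward voltage and 440 Ω series resistor are assumed values for illustration) of how the LED current shifts when the supply rises from 12 V to a 14.4 V charging voltage:

```python
# Current through an LED + series resistor at different supply voltages.
# Assumes a fixed forward voltage (Vf); real Vf rises slightly with
# current, but this is close enough to show the effect.

V_F = 3.2   # assumed LED forward voltage, volts
R = 440.0   # assumed series resistor, ohms (sized for ~20 mA at 12 V)

def led_current_ma(v_supply):
    """LED current in mA: the resistor drops everything above Vf."""
    return (v_supply - V_F) / R * 1000.0

for v in (12.0, 14.4):
    print(f"{v:4.1f} V supply -> {led_current_ma(v):.1f} mA")
# -> 12.0 V supply -> 20.0 mA
# -> 14.4 V supply -> 25.5 mA
```

With the resistor, the 2.4 V supply bump only raises the current from 20 mA to about 25 mA; without it, the same bump would push the current far past the LED's rating.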
 

Thread Starter

ShockBoy

Joined Oct 27, 2009
186
Basically, how do you charge a battery and use that same battery at the same time? Can someone explain the process? Thanks.
 

Bosparra

Joined Feb 17, 2010
79
The same way it happens in a car; there's no real issue with it. Of course, you have to put more current in than you're taking out.

I enjoy camping a lot, and I've got a 12 V battery with a charger in my off-road trailer. The battery powers a freezer and lights while it is being charged. The charger can supply 12 A, while the freezer draws 8 A at full tilt.

I suspect your real question is about the 14.7 V present while the battery is being charged? The secret lies in the voltage regulator inside whatever equipment you're powering off the battery. If the device is sensitive to an accurate voltage, it is guaranteed to have a voltage regulator built in.
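As a quick sanity check on those numbers (12 A from the charger, 8 A to the freezer, per the figures above), the battery still sees a net charging current:

```python
# Net current into the battery while charging and discharging at once.
charger_out_a = 12.0    # charger output, amps
freezer_draw_a = 8.0    # freezer draw at full tilt, amps

net_into_battery_a = charger_out_a - freezer_draw_a
print(f"Net charging current: {net_into_battery_a:.1f} A")
# -> Net charging current: 4.0 A
```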
 

retched

Joined Dec 5, 2009
5,207
Regulate the circuit that is using the power.

I explained it earlier in post #2.

A regulator like a 7812 will work, so even if it sees some of the charging voltage, it will still output 12 V to the circuit.

All car circuits run off a battery that is being charged while the car is running. There is no 'trick' to it.

Regulate.

You basically build the circuit so it will work off the charger voltage and/or the battery voltage.

So it will operate from ~12vDC to ~15vDC. The battery acts as a buffer.
 

Thread Starter

ShockBoy

Joined Oct 27, 2009
186
That explains it. Thank you all very much. Regulate the voltage, and assume that pre-manufactured items made to run on 12 V already have that regulator built in. I would also assume that 12 V chargers/controllers all charge at roughly the same rate.
 

retched

Joined Dec 5, 2009
5,207
The charge rate doesn't matter. The charger will shut off when the battery is charged.

If you are using a 10 A charge rate on a car battery, you can still run a single LED and resistor off the battery.

When the battery reaches a full charge, the charger will drop to a top-off, or trickle, charge of an amp or so. The LED won't see a difference.

The circuit connected to the battery will only DRAW the current it needs. The charger won't force-feed it.

That's how you can take a 3 V, 1 Ah battery and a resistor and light an LED, then take that same LED and resistor, connect them to a 3 V, 500 Ah battery, and it will glow the same.

A running car almost NEVER has 12.0 V at the battery. If it does, there is a problem.
When you buy those 12 V LED blinkers for a car, they actually run from 9 V to 15 V.

The voltage from the alternator changes with engine RPM, so devices made for a car are designed to handle that varying voltage.

If you had a fuse designed to blow at exactly 12 V and attached it to a car circuit, it WILL blow.

And since a 12 V battery will not charge from 12 V, the alternator HAS to output ~14.4 V DC. That means every circuit in the car receives ~12 V with the car off, and ~14 V with the car running.

So to sum it up: the charge rate of the charger will ONLY affect the charge time of the battery, not your circuit.
 

Thread Starter

ShockBoy

Joined Oct 27, 2009
186
I'm sorry. When I said rate, I was referring to the voltage. But it sounds like I need either a regulator or a resistor for my LED lighting (in my garage). It is connected directly to the battery at this time, and I have yet to hook up the solar panels through the controller.
 

retched

Joined Dec 5, 2009
5,207
YES, yes, three times yes.

LEDs are CURRENT-controlled devices, not voltage-controlled.

Therefore, as they heat up, they start drawing more current, which makes them heat up even more... and so on until >POP<.

This is called thermal runaway. The resistor is a 'current-limiting resistor'.

So regardless of changes in temperature, and the current the LED WANTS to draw, the resistor will only let it draw the value you choose.

There is a point in every LED's life where brightness and reliability meet: you could push a little more current and get 1% more light, but kill the LED 50% faster.

The datasheets usually give a 'happy' range for the LEDs. Typical is 20 mA.

Using Ohm's law, with a 12 V battery and an LED with a forward voltage of 3.2 V:

12 V battery - 3.2 V Vf (forward voltage) = 8.8 V
8.8 V / 0.02 A (20 mA) = 440 Ω

440 is not a standard resistor value, so rounding up to the next standard (E24) value gives 470.

So a 470 Ω resistor in series with your LED, on a 12 V battery, will let the LED draw around 19 mA. (The math actually gives 18.7 mA with 470 Ω.)
;)
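For reference, the resistor sizing can be sketched in a few lines of Python (R = V / I, so 8.8 V at 20 mA works out to 440 Ω; the E24 table and the round-up-to-standard step are my additions for illustration):

```python
# Series resistor sizing for an LED: R = (V_supply - Vf) / I_target.
import bisect

# E24 standard resistor values, ohms (one decade plus 1k, enough here)
E24 = [100, 110, 120, 130, 150, 160, 180, 200, 220, 240, 270, 300,
       330, 360, 390, 430, 470, 510, 560, 620, 680, 750, 820, 910, 1000]

def led_resistor(v_supply, v_f, i_target_a):
    """Return (exact_ohms, next_standard_ohms, actual_current_ma)."""
    exact = (v_supply - v_f) / i_target_a
    # Round UP to the next standard value so the current stays at or
    # below the target (a slightly dimmer LED beats a dead one).
    std = E24[bisect.bisect_left(E24, exact)]
    return exact, std, (v_supply - v_f) / std * 1000.0

exact, std, i_ma = led_resistor(12.0, 3.2, 0.020)
print(f"exact: {exact:.0f} ohm, standard: {std} ohm, actual: {i_ma:.1f} mA")
# -> exact: 440 ohm, standard: 470 ohm, actual: 18.7 mA
```

Rounding up rather than down is the conservative choice: 470 Ω gives about 18.7 mA, comfortably inside a typical 20 mA rating even when the charger pushes the supply above 12 V.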
 

Thread Starter

ShockBoy

Joined Oct 27, 2009
186
Thank you so much! :) That helps a lot. I will have to figure out the current needs of my LEDs; they come in 15' rolls, 85 bucks wholesale, but bright as hell. They light up a room nicely. There are micro resistors built into the strand, but the wall wart is 12 V at 500 mA for the whole strand, and I have pieces of the strand... oh dear. I will experiment with a 10 Ω and a 100 Ω resistor as soon as I hook up the panels to the controller to the battery.
 