Why do three 5mm 3.2v 20mA LEDs connected in parallel to 2 AAA drain them in just a few hours?

Thread Starter

xbonet

Joined Nov 19, 2018
10
I have 3 warm white LEDs running off 2 AAA batteries. I know I should always use a resistor for LEDs, but I'm rather new to electronics and I wanted to understand what would happen in practice. I mean, each of these LEDs has a listed forward voltage of 3.2-3.5 V, so the 3 V from the 2 AAA batteries would be fine, and no resistance would be needed to drop any remaining voltage. But, of course, I'm not accounting for current. I guess having a resistor of small value, even if it means losing some voltage and, therefore, light intensity, would be recommended in order to control the current flow. But I still don't understand why. If each LED draws 20mA and AAA batteries are listed as having a typical drain of 10mA (which I'm reading as being somehow regulated, meaning they can't just offer whatever current), shouldn't this circuit already be OK without any resistors? Wouldn't the resistors just waste energy?

In practice, after testing this empirically for some days, I notice the batteries get warm (not too much but, still, enough to notice), so I'm assuming it's because the LEDs are drawing a lot of current, perhaps even the 20mA they need (can they draw more?). But something weird is happening (at least something I think is weird, though you may see it as obvious because you understand more about this than I do): my new 2 AAA batteries last less than 4 hours lighting up the 3 LEDs before they get really, really dim... (N.B. The LEDs aren't burning out, just to be clear: I pop in a couple of new batteries and it's all back to business as usual!) According to my calculations the batteries should last around 16 hours (AAA batteries have a listed capacity of normally 1000mAh or more, and 1000mAh/60mA gives about 16 hours), which is at least 4 times what they're lasting now.
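For reference, here is my back-of-the-envelope math as a quick Python sketch (it assumes the listed 1000mAh capacity holds at a 60mA drain, which is exactly the assumption I'm asking about):

```python
# Naive runtime estimate: rated capacity divided by total draw.
capacity_mah = 1000            # listed AAA capacity
current_per_led_ma = 20        # nominal LED current
num_leds = 3                   # wired in parallel

total_draw_ma = current_per_led_ma * num_leds       # 60 mA
print(f"{capacity_mah / total_draw_ma:.1f} h")      # ~16.7 h, vs. < 4 h observed
```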

So my general question is: what am I missing? (Aside from the obvious: much more knowledge and experience!) And would adding resistors fix this and give my batteries a longer life, or would it just be the same?

Looking forward to learning, so thanks in advance for your time and input!
 

franktronics

Joined Nov 27, 2018
1
Try limiting the current with a resistor.

"(can they draw more?)"

— Yes. You are probably looking at the typical ratings on the datasheet. Keep an eye on the maximum ratings as well.
 

dl324

Joined Mar 30, 2015
16,918
Consider yourself lucky. The only thing limiting the current in the LEDs, which I assume you have wired in parallel, was their intrinsic resistance.

Here's an IV curve for some white LEDs I have:
[attached image: measured I-V curve for the white LEDs, current in mA on a log scale vs. forward voltage]

If one of these was connected across a 3V battery, a typical LED would draw around 5mA.

Yours were obviously drawing more because the batteries got warm. The typical forward voltage of the above LEDs is 3.5V, max is 4.1V.
 

AlbertHall

Joined Jun 4, 2014
12,346
That 1000mAh rating will be at some specific current drain and for a specific battery type. As the current drain increases, the total deliverable capacity drops, and AAA batteries vary greatly in their ability to supply high currents.

There is test data here for a large variety of battery types in the list further down the page. See if you can find the actual battery you were using.
https://rightbattery.com/328-1-5v-aaa-varta-superlife-carbon-zinc-battery-tests
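As a sketch of how you might use such a derating curve, here is a small Python example. The numbers in it are purely illustrative, not taken from that page; substitute measured data for a real cell:

```python
import numpy as np

# Hypothetical capacity-vs-drain curve for an alkaline AAA cell.
# Illustrative numbers only, not from the linked tests.
drain_ma     = [10, 25, 50, 100, 200]
capacity_mah = [1000, 900, 700, 500, 300]

def runtime_hours(i_ma):
    """Estimated runtime at a continuous drain, using the derated capacity."""
    return np.interp(i_ma, drain_ma, capacity_mah) / i_ma

print(f"At 10 mA: {runtime_hours(10):.0f} h")   # 100 h
print(f"At 60 mA: {runtime_hours(60):.0f} h")   # ~11 h, well below 1000/60
```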
 

oz93666

Joined Sep 7, 2010
739
"I know I should always use a resistor for LEDs ... I notice the batteries get warm (not too much but, still, enough to notice)!"

The design aim must be NOT to use resistors .... resistors waste power!!! ... Unfortunately this is rarely done because the source voltage varies too much ...

Your AAA cells started off at 1.62 Volts each when new ... then slowly went down to below 1.5V as they were used

The forward voltage of an LED varies according to current, the one in this chart going from 2.6 to 3.3 V before being overloaded. So it's very easy to put the right number of cells in series with the right number of LEDs such that they operate well with no resistors, but the light will drop off as the batteries fade ....

You need to get a multimeter to know what current you have ... the LEDs can draw more than 20mA, but they won't last many minutes at 30mA.

AAA cells have a 1Ah capacity when the current is 10mA ... your current is higher and will give a lower capacity. I can't imagine you could detect a warming of the cells ....

Measure the current, then you can adjust the wiring arrangement so you need no resistors. If two AAAs in series gives too much voltage, then try 3 AAAs in series driving 2 LEDs in series, as long as the max current in the LEDs doesn't exceed 20mA.

A multimeter will not give a perfectly accurate current reading, because at such a low voltage the meter's own resistance will significantly reduce the current, but it will give you an idea of it.
 

crutschow

Joined Mar 14, 2008
34,431
"AAA batteries are listed as having a typical drain of 10mA (which I'm reading as being somehow regulated, meaning they can't just offer whatever current)"
No, they are definitely not current regulated.
10mA is just a typical measurement current.
The AAA batteries will provide whatever current they can up to the limit of their internal resistance (which is likely less than an ohm for new batteries), thus the need for a series resistor to drive LEDs.
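A minimal sketch of that model: an ideal source behind an internal resistance, with the LED treated as a fixed forward drop. All values below are assumed for illustration; a real LED's forward voltage also rises with current, which softens the no-resistor case somewhat:

```python
def led_current_ma(v_batt, v_f, r_internal, r_series=0.0):
    """Simple linear model: everything above the LED's forward drop
    appears across the internal plus series resistance."""
    if v_batt <= v_f:
        return 0.0
    return (v_batt - v_f) / (r_internal + r_series) * 1000.0

# Fresh pair of AAAs (~3.2 V total, ~0.6 ohm internal) and a 2.9 V LED:
print(led_current_ma(3.2, 2.9, r_internal=0.6))               # 500 mA: nothing sane limits it
print(led_current_ma(3.2, 2.9, r_internal=0.6, r_series=47))  # ~6.3 mA: resistor sets it
```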
 

ebp

Joined Feb 8, 2018
2,332
... If one of these was connected across a 3V battery, a typical LED would draw around 5mA. ...
0.5 mA (the curve crosses the 3 V line below 10^0 = 1 mA)

===
But note that at about 3.3 V the current is up to 5 mA: the current has increased by a factor of 10 for a change in voltage by a factor of only 1.1. The shape of the curve depends on the specific LED. In practice, a LED behaves as a semiconductor junction, which has a "forward voltage" with a logarithmic relationship to the current through it, in series with a resistance which has a linear relationship between voltage and current. In dl324's graph, the logarithmic behavior holds up fairly well below about 10 mA, but the resistance becomes more dominant at higher current (the resistance is still there at low current, it just contributes a less significant portion of the overall voltage across the LED). Generally, a LED that is made for high power has lower resistance than one made for low power. Note that this resistance isn't something added deliberately, but an unavoidable product of the way LEDs are made. With small LEDs like you are using, the resistance can actually prevent gross overcurrent with moderately excessive applied voltage, and keep the LEDs from burning up. This is how you got away with connecting the LEDs directly to the battery.
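To make that junction-plus-resistance picture concrete, here is a rough Python sketch. The saturation current, ideality factor, and bulk resistance are invented illustrative values, not fitted to dl324's chart:

```python
import math

def led_voltage(i_a, i_s=1e-30, n=2.0, v_t=0.02585, r_s=2.0):
    """Forward voltage: logarithmic junction drop plus a linear (bulk
    resistance) term, as described above. All parameters illustrative."""
    return n * v_t * math.log(i_a / i_s + 1.0) + i_a * r_s

def led_current(v, i_max=1.0, iters=60):
    """Bisect for the current at a given applied voltage (voltage is
    monotonically increasing in current, so bisection converges)."""
    lo, hi = 0.0, i_max
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if led_voltage(mid) < v:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Small voltage steps produce large current steps near the knee:
for v in (3.0, 3.2, 3.4, 3.6):
    print(f"{v:.1f} V -> {led_current(v) * 1000:7.2f} mA")
```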

The light emitted by a LED is more or less directly proportional to the current through it, not the voltage across it.

Given sufficient voltage, the current through a LED will rise to a level that will cause it to be destroyed by excessive temperature. Paralleled LEDs don't share current very well. There is always some difference in forward voltage and the forward voltage decreases with rising temperature. The LED that conducts at a lower voltage will "hog" more than its share of current, heat up more, hog more current, and possibly be destroyed. Many high-power LED modules consist of several paralleled strings of several LEDs in series. This generally works OK because differences tend to "balance" and the temperature of all the LED chips is kept about the same because of an aluminum substrate for the assembly.
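A tiny numerical illustration of that current hogging, using only the exponential region of the junction; the 50 mV mismatch and the thermal-voltage product are assumed values:

```python
import math

# Two paralleled LEDs share one node voltage, but junction current is
# exponential in voltage, so a slight forward-voltage mismatch splits
# the current very unevenly. Values are illustrative.
n_vt = 0.0517   # ideality factor times thermal voltage, in volts

def relative_current(v_node, v_f_offset=0.0):
    """Relative current of an LED whose I-V curve is shifted by v_f_offset."""
    return math.exp((v_node + v_f_offset) / n_vt)

i_a = relative_current(3.0)          # nominal LED
i_b = relative_current(3.0, 0.05)    # a unit with 50 mV lower forward voltage
print(f"The 'hotter' LED carries {i_b / i_a:.1f}x the current")  # ~2.6x
```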

More sophisticated circuits for driving LEDs from batteries use "switch mode" power converters to directly regulate the current through the LED(s). Depending on the design, a "switcher" can produce an output voltage less than or greater than the input voltage, so you can do things like operate a high-power LED that runs at about 3.5 V from 2 alkaline or carbon-zinc cells in series, or (usually for better efficiency and longer running life) from 4 or more cells in series, while keeping the current through the LEDs, and hence the brightness, constant.
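For a feel of what such a converter does, here is a toy calculation for ideal (lossless, continuous-conduction) boost and buck stages; the voltages are examples, not a design:

```python
def boost_duty_cycle(v_in, v_out):
    """Ideal boost converter in continuous conduction: Vout = Vin / (1 - D)."""
    return 1.0 - v_in / v_out

def buck_duty_cycle(v_in, v_out):
    """Ideal buck converter: Vout = D * Vin."""
    return v_out / v_in

# Boosting two sagging cells (2.4 V) up to a 3.5 V LED:
print(f"boost D = {boost_duty_cycle(2.4, 3.5):.2f}")   # ~0.31
# Bucking four fresh cells (6.4 V) down to the same LED:
print(f"buck  D = {buck_duty_cycle(6.4, 3.5):.2f}")    # ~0.55
```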
 

MisterBill2

Joined Jan 23, 2018
18,502
If the batteries are able to provide the rated voltage, perhaps up to 1.6 volts each, then each of the three LEDs will be drawing the nominal 20mA at 3.2 volts, for a total of 60mA, which is a whole lot of current to demand from an AAA cell. Yes, the rated capacity may be 1000mAh, but that is at a much lower current. At the higher current the life will be much shorter. And as the voltage drops the current will fall sharply, so the light will drop quite rapidly.
 

MisterBill2

Joined Jan 23, 2018
18,502
Without an explanation, the green curve is not quite clear. BUT the red curve shows that current rises sharply as the LEDs go into conduction, just as predicted. And so, as the battery current approaches 60mA, the battery charge is rather quickly consumed. That is exactly what theory predicts.
 

Bordodynov

Joined May 20, 2015
3,179
The x-axis shows the voltage per cell; the green line is the voltage across the battery of two cells. I put per-cell voltage on the x-axis so you can see how sharply the LED current drops while the cell is still not fully discharged.
My previous circuit allows you to discharge the battery completely. If the author of the post wants to extend the lamp's run time and can accept lower brightness from a smaller (below-nominal) current, then let him say what current is needed and I will change the component values.
 

Thread Starter

xbonet

Joined Nov 19, 2018
10
Wow! Thanks a lot for all your replies! I didn't expect to get so many, or so full of interesting information! I will need some time to wrap my head around all the data and calculations you guys make, as a lot of it is way over my head at this time. But I do get the gist of it: LEDs can draw more than 20mA, and even if they couldn't, 20mA x 3 LEDs is enough to drain the batteries much more quickly, and as the voltage goes down, so does the current, so it's to be expected that the light dims.

It seems that my only two solutions here are either to wire a resistor into the circuit for each LED, of a value high enough to restrict the current but low enough not to drop too much voltage and therefore lose (and waste) power; OR to hook the circuit up to a USB cable, add resistors to drop the excess voltage, and drive it from a source whose power won't ever fluctuate. The second option is more practical in the long run, I feel. The first will help me continue experimenting.

Would connecting the LEDs in series help at all? What difference, if any, would that make in this particular scenario?

P.S. I've noticed that each time I turn on the circuit after not having used it for about 24 hours, the light is much brighter, almost as bright as when the batteries are new. As the minutes pass, it starts dimming back down. I'm guessing this has to do with what one of you mentioned, that forward voltage decreases with rising temperature. So if I'm drawing so much current from the batteries that they begin to get warm, it seems natural that the warmer they get, the more the forward voltage decreases, and so the dimmer the LEDs get... I hope I'm right in this inference, because it would mean I'm understanding something of what you guys are telling me!

Again, thanks a million for all your help and good info you've given me to further my learning!
 

MisterBill2

Joined Jan 23, 2018
18,502
Connecting LEDs in series does two things. First, it assures that the current in each one is the same, and second, it increases the forward voltage drop to the sum of the individual drops at that current. So with two of the 3.2 volt LEDs in series you could safely drive them with the 5 volts from a USB connection. The light will be less, but no power would be wasted in a resistor. And you could put 2 sets of two in parallel, and the light may be bright enough for your application. The LED signs that I have repaired all have series strings of LEDs with only one resistor in series, all running off of 12 volt supplies. So the series connection can be quite useful.
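A quick check of that series arithmetic; the helper below is just a sketch, with the supply and forward voltages taken from the examples above:

```python
def series_resistor_ohms(v_supply, v_f, n_leds, i_a):
    """Resistor for a series LED string: drop the leftover voltage at the
    target current. Zero headroom means no resistor is needed (the string
    simply runs below full current)."""
    headroom = v_supply - n_leds * v_f
    return max(headroom, 0.0) / i_a

print(series_resistor_ohms(5.0, 3.2, 2, 0.020))  # 0.0: two in series on USB, underdriven
print(series_resistor_ohms(5.0, 3.2, 1, 0.020))  # 90.0: one LED on USB needs ~90 ohms
```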
 

k7elp60

Joined Nov 4, 2008
562
I have been connecting LEDs in parallel for years. One way I do it is to determine the current I want for each LED. Then, with a constant current source set to that current, I match the LEDs on forward voltage to within about 50 millivolts. Then I determine the power source voltage and the number of LEDs I plan to connect in parallel, and use Ohm's law to calculate the resistor. For example, with a 3.0V source, 4 LEDs @ 5mA each, and an LED voltage of 2.9V, the resistor must drop 0.1V at the combined 20mA, so R = 0.1V / 0.02A = 5 ohms.
A typical 20mA max current LED will have half the intensity at 5mA. I have found through experimentation that a lot of LEDs are clearly visible at currents as low as 1mA or less.
Here are pictures of some of my projects with LEDs in parallel. [attached images: front LEDs on, k7elp badge, christmas display]
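That shared-resistor calculation as a one-liner (the 3.0 V source value is inferred from the 0.1 V drop in the example, not stated outright):

```python
def shared_resistor_ohms(v_source, v_led, n_leds, i_each_a):
    """One resistor shared by n matched parallel LEDs: drop the excess
    voltage at the combined current (only safe if the LEDs are matched)."""
    return (v_source - v_led) / (n_leds * i_each_a)

# The example above: 3.0 V source, four matched 2.9 V LEDs at 5 mA each.
print(shared_resistor_ohms(3.0, 2.9, 4, 0.005))   # 5.0 ohms
```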
 

oz93666

Joined Sep 7, 2010
739
Since you are powering from batteries, it seems the main consideration is to get the maximum light possible,

I would wire them in series and drive them at perhaps 2 to 5mA (no resistor), not at the max of 20mA; at this low power they are 35% more efficient than running at 20mA .. if one series string is not enough light, add more ... LEDs usually come in a bag of 100, they're very cheap, and it will cut your battery bill.
 

MisterBill2

Joined Jan 23, 2018
18,502
A switch-mode controller is definitely another option, but quite a bit more complex than a single resistor or a series connection. If you choose a lower peak LED current the battery life may be improved a bit.
 