Calculating resistor value for LEDs

Thread Starter

propmaker

Joined Jun 6, 2019
1
Hi; I feel like a twad for asking as this seems a ridiculously simple question...

I started on a prop which has 5 LEDs in parallel. They are blue LEDs, so they drop 3.3V and each draws 20mA of current.

Power is coming from a battery pack which on paper outputs 3.7V, but I've measured it and it outputs 4V.

Naively I thought it'd work without a resistor, and well... for five minutes I was right; then one of the LEDs started flickering. So, time for a resistor.

The formula for this is:

R = (Vinput - Vused) / Idrawn
so R = (4 - 3.3) / (5 * 0.020)
R = 0.7 / 0.1 = 7 Ω

7 ohms seems like such a small number that I start to doubt whether I'm correct. The prop itself is fairly small, and I'm not that practiced when it comes to soldering, so I was hoping to use only one resistor instead of putting one in series in front of each LED (hence the 5 * 0.02).
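
For reference, here's the same arithmetic as a quick Python sketch for the two options I'm weighing (nominal values only, and the variable names are just mine):

# Rough check of the resistor math, using the nominal values above.
V_BATT = 4.0     # measured battery voltage (V)
V_LED = 3.3      # nominal LED forward voltage (V)
I_LED = 0.020    # target current per LED (A)
N_LEDS = 5

# Option 1: one shared resistor in front of all five parallel LEDs
r_shared = (V_BATT - V_LED) / (N_LEDS * I_LED)

# Option 2: one resistor in series with each LED
r_per_led = (V_BATT - V_LED) / I_LED

print(f"shared resistor:  {r_shared:.1f} ohm")   # 7.0
print(f"per-LED resistor: {r_per_led:.1f} ohm")  # 35.0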

Am I actually correct that I need one resistor of 7 ohms to balance it all out? What will happen when the battery pack drains and actually outputs the value it should (3.7V or even lower) rather than the measured 4V?

All the best!
 

crutschow

Joined Mar 14, 2008
34,464
Yes, you always need a resistor to limit the current to an LED.
The 3.3V rating of the LED is the voltage it drops at its rated current, not the supply voltage you use.

But using a 4V battery for a 3.3V rated LED does not leave much margin for the resistor, as you noted.
And operating LEDs in parallel is problematic due to the differences in voltage drop from unit to unit (which is especially noticeable when the supply voltage is close to the LED voltage).

So the resistor value for each LED for 20mA would be 0.7V/20mA = 35Ω.

However, if the battery drops to 3.7V, then most or all of the LEDs will stop glowing.
You may have to consider a higher battery voltage or a switching boost regulator to increase the voltage to the LEDs.
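
A rough illustration (assuming a fixed 3.3V drop and the 35Ω per-LED resistor above; a real LED's drop also changes somewhat with current):

# LED current vs. battery voltage with a 35 ohm series resistor per LED.
R = 35.0
V_LED = 3.3   # nominal forward drop (V)
for v_batt in (4.2, 4.0, 3.7, 3.5):
    i_ma = max(v_batt - V_LED, 0) / R * 1000
    print(f"{v_batt:.1f} V battery -> {i_ma:4.1f} mA")
# prints roughly 25.7, 20.0, 11.4 and 5.7 mA

An LED whose forward drop happens to be 3.7V or more would get essentially no current at all once the battery sits at 3.7V.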

For best efficiency you could use a constant-current switching boost regulator and drive all the LEDs in series.
That requires no added resistors.
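
For five LEDs at a nominal 3.3V each, the series string would need roughly 5 × 3.3V ≈ 16.5V, so the boost regulator would have to supply about that voltage at 20mA.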
 

KeithWalker

Joined Jul 10, 2017
3,097
You have calculated correctly, but if you use a single resistor the LEDs will probably have noticeably different intensities unless they are a matched set, because the current/voltage curve is slightly different for each. As long as you are not driving them anywhere near the continuous maximum rating, you can get away with that if you don't mind the different intensities.
LEDs are current-driven devices, so you will get much more even results if you use a 33Ω resistor (the nearest standard value to 35Ω) in series with each LED.
As the battery voltage drops, the intensity of the LEDs will decrease. With a single resistor, the difference in LED intensities will become much more apparent.
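
To put rough numbers on the single-resistor case, here is a crude sketch that models each LED as a threshold voltage plus a small dynamic resistance; the thresholds and the 10 ohm dynamic resistance are made-up illustrative values, not datasheet figures:

# Crude model: each LED conducts nothing below its threshold, then behaves
# like a 10 ohm resistor above it. Values are illustrative only.
V_BATT = 4.0
R_SHARED = 7.0                                 # the single shared resistor
R_DYN = 10.0                                   # assumed LED dynamic resistance
THRESHOLDS = [3.25, 3.30, 3.30, 3.32, 3.38]    # slightly mismatched LEDs (V)

def led_current_sum(v_node):
    return sum(max(v_node - vt, 0.0) / R_DYN for vt in THRESHOLDS)

# Bisect for the node voltage where the resistor current equals the LED total.
lo, hi = 0.0, V_BATT
for _ in range(50):
    mid = (lo + hi) / 2
    if (V_BATT - mid) / R_SHARED > led_current_sum(mid):
        lo = mid
    else:
        hi = mid

for vt in THRESHOLDS:
    print(f"LED with ~{vt:.2f} V knee: {max(hi - vt, 0.0) / R_DYN * 1000:4.1f} mA")
# roughly 21, 16, 16, 14 and 8 mA - visibly different brightness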
 

djsfantasi

Joined Apr 11, 2010
9,163
3.7V is the rated battery voltage.
3.3V is the LED forward voltage.
4.0V is the observed battery voltage.
20mA is the maximum LED current.

First, use a lower current rating. Like 15mA.
Second, use the battery’s rated voltage.

R = (3.7 - 3.3) / 0.015
R = 0.4 / 0.015
R ≈ 26.7Ω

Using standard values, 22Ω gives us 18mA. 30Ω gives us 13mA. I'd try the 22Ω first.
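
As a quick check of nearby standard values at the rated 3.7V (still assuming a 3.3V forward drop):

# Per-LED current for a few common resistor values, 3.7V battery, 3.3V LED.
V_BATT, V_LED = 3.7, 3.3
for r in (22, 27, 30, 33):
    print(f"{r} ohm -> {(V_BATT - V_LED) / r * 1000:4.1f} mA")
# about 18.2, 14.8, 13.3 and 12.1 mA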
 

Audioguru

Joined Dec 20, 2007
11,248
The battery IS NOT rated at 3.7V. Since it is probably a rechargeable lithium cell, its voltage is about 3.2V when it should be disconnected and 4.20V when fully charged. The 3.7V figure is its average voltage over a full discharge.

Guess what? The LED's forward voltage IS NOT exactly 3.3V. An LED part number covers a range of forward voltages, which is shown in its datasheet. It might be 2.8V for some units and 3.8V for others, with 3.3V perhaps being the average. With such a low battery voltage you must customize the resistor value for each LED and throw away the higher-voltage LEDs.
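
To see why, here is a rough sketch using the 35Ω per-LED value from earlier in the thread and the measured 4V battery; the forward voltages are just points across the range mentioned above:

# Current spread across the LED forward-voltage range with one fixed resistor.
V_BATT = 4.0
R = 35.0
for vf in (2.8, 3.3, 3.8):
    print(f"Vf = {vf:.1f} V -> {max(V_BATT - vf, 0) / R * 1000:4.1f} mA")
# about 34.3, 20.0 and 5.7 mA - the low-Vf unit is well over its 20mA rating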
 