# Newb LED Question

Joined Aug 2, 2011
8
I have read 100 tutorials on LEDs and looked over many schematics, but none cover my question.

Do I need to use a resistor if the total forward voltage equals the supply voltage?

4x LEDs
Forward voltage: 3.0 V
Forward current: 20 mA
Power source: 12 V

If I were using one of these LEDs, I could see needing a 470 ohm resistor, but since I am using 4 LEDs, their total forward voltage equals the supply voltage. If I do use a resistor, I would not have the required voltage to power all 4 LEDs. So I just don't use a resistor, right? But EVERY tutorial says "ALWAYS USE A RESISTOR"... Is there something I am missing?

*These LEDs will be hooked up in series, NOT in parallel.

If you know the answer, can you please explain it in detail? I am a quick learner; I promise I will understand.

#### SgtWookie

Joined Jul 17, 2007
22,210
No, you can't just connect them directly to a voltage-regulated supply, even though it SEEMS like it should work.

The Vf (forward voltage) of diodes decreases as their temperature increases. So as the LEDs are heated by the current flowing through them, their Vf decreases, which lets more current flow, which makes them hotter still and decreases their Vf further, and so on, until the LEDs melt down or otherwise burn up. This is called thermal runaway.

If the supply is current-regulated to suit your LEDs, then you can omit the resistors.

If the supply is voltage regulated, then you should subtract 1v or ~10% from the input voltage before you attempt to figure out how many you can operate in series.

If the voltage supply is not very well regulated, you need to find out what the extremes are, and check to see if the current will be acceptable at both the highest and lowest extremes.
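That last check is easy to script. Here's a quick Python sketch of it (my own illustration; the component values below are just examples, not a recommendation for your build):

```python
# Check the current through a series LED chain at the extremes of a
# poorly regulated supply. All component values here are examples.

def led_current(v_supply, vf_led, n_leds, r_limit):
    """Current through a series LED chain with a limiting resistor (amps)."""
    v_across_r = v_supply - vf_led * n_leds
    if v_across_r <= 0:
        return 0.0  # not enough voltage to forward-bias the whole chain
    return v_across_r / r_limit

# Example: 3 LEDs (Vf = 3.0 V) with a 150 ohm resistor, on a supply
# that sags to 11 V and peaks at 13 V.
i_low = led_current(11.0, 3.0, 3, 150)   # low extreme
i_high = led_current(13.0, 3.0, 3, 150)  # high extreme
print(f"{i_low*1000:.1f} mA to {i_high*1000:.1f} mA")
```

At the 13 V extreme the chain would draw about a third more than a 20 mA rating, which is exactly why you check both ends of the range.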

Joined Aug 2, 2011
8
Excellent answer. So, since I am running 4x 3.0 V LEDs on a 12 V source, you would recommend running them with a 50 ohm resistor, correct?

My math: 12 V source; assume 13 V for a protective margin. That leaves a remainder of 1 V, so 1 V / 0.020 A = 50 ohms.

Might run a bit dim until warmer, correct?

#### SgtWookie

Joined Jul 17, 2007
22,210

MaxNumberOfLEDsInSeries = INT((Vsupply - (1 or 10%)) / Vf_LED)
So, INT((12-1)/3) = 3.
Then:
Rlimit >= (Vsupply - (Vf_LED * MaxNumberOfLEDsInSeries)) / DesiredCurrent
Substituting:
(12v - (3.0v * 3)) / 20mA = (12 - 9)/0.02 = 3/0.02 = 150 Ohms.
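In Python, those two formulas look like this (just a direct transcription of the equations above; nothing new):

```python
# Direct transcription of the two formulas above.
import math

def max_leds_in_series(v_supply, vf_led, headroom=1.0):
    """INT((Vsupply - headroom) / Vf_LED)."""
    return math.floor((v_supply - headroom) / vf_led)

def r_limit(v_supply, vf_led, n_leds, i_desired):
    """Minimum limiting resistance in ohms."""
    return (v_supply - vf_led * n_leds) / i_desired

n = max_leds_in_series(12.0, 3.0)   # 3 LEDs
r = r_limit(12.0, 3.0, n, 0.020)    # 150 ohms
print(n, r)
```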

150 Ohms is a standard value of resistance. Now if it came up 450 Ohms, you would need to look at a table of standard resistor values:
Use the E12 (yellow) and E24 (green) columns. 450 isn't shown, but 430 and 470 are the closest values shown. If you look again at the Rlimit equation, you'll see that Rlimit has to be greater than or equal to 450 Ohms, so 470 Ohms it is.

If for some reason you want to get closer than 470 Ohms, you can see what combinations of resistors you might use in series or parallel to achieve that using the link below.

If you go ahead and try it, you'll see 470 listed at the bottom; it's only 4.44% too high.
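If you'd rather not squint at the table, the rounding-up step can be scripted too. The E24 values below are the standard 5% series; the lookup logic is just my own sketch:

```python
# Round a target resistance up to the nearest standard E24 value.
# The E24 list is the standard 5% resistor series.
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def next_e24_at_or_above(r_target):
    """Smallest E24 resistor value >= r_target (ohms)."""
    # find the decade (1, 10, 100, ...) that can reach r_target
    decade = 1.0
    while decade * E24[-1] < r_target:
        decade *= 10.0
    # search this decade and the one below it
    candidates = [v * d for d in (decade / 10.0, decade) for v in E24]
    return min(v for v in candidates if v >= r_target)

print(next_e24_at_or_above(450))  # the 470 ohm case from above
print(next_e24_at_or_above(150))  # already a standard value
```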

Next, you need to calculate the wattage required:
Rwattsreqd >= (Vsupply - (Vf_LED * MaxNumberOfLEDsInSeries)) * Desired_Current * 1.6

The 1.6 gives us a safety factor of 60% over the exact wattage requirements. This is for reliability, and to help keep the resistors from getting warm/hot.
The result is 96 mW. You can use a 1/10 W (100 mW) or higher wattage rated resistor; e.g. 1/8 W, 1/4 W, etc.
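Scripted, that wattage check (same numbers and safety factor as above) is:

```python
# Minimum resistor power rating, with the 60% safety margin from above.

def resistor_watts_required(v_supply, vf_led, n_leds, i_desired, margin=1.6):
    """Power dissipated in the limiting resistor times a safety factor (watts)."""
    v_across_r = v_supply - vf_led * n_leds
    return v_across_r * i_desired * margin

w = resistor_watts_required(12.0, 3.0, 3, 0.020)  # 3 V * 20 mA * 1.6
print(f"{w*1000:.0f} mW")
```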

Last edited:

Joined Aug 2, 2011
8
I trust your opinion, but am not sure it's correct.

If I were running them in parallel, or a single LED, 470 ohms would make sense to me. But since I am running them in series, if I were using 3 LEDs, 150 ohms would make sense. Please let me know where I am confused.

#### SgtWookie

Joined Jul 17, 2007
22,210
You are correct! I was not paying enough attention and typed 9 instead of 3 and kept calculating based on that error; I have since revised my post.