Hello, all!
I've just started teaching myself about electronics (I'm currently in Volume 1, Chapter 12 on this site). As a project, I want to make an LED "sign" to put in my game room. The sign will consist of 185 10mm LEDs (105 green, 80 blue). The specs for the LEDs that I'm considering purchasing can be found here (green) and here (blue).
Both colors have a MIN Vf of 3.00 V, a TYP Vf of 3.15 V and a MAX Vf of 3.30 V.
Here's the If vs. Vf graph for both colors:
Like I said, I'm a beginner, so please correct me if I'm wrong here...
According to that graph, if I supply approximately 5 mA of current to each LED, the voltage dropped across each one will be the "typical" 3.15 V. Or, if I supply 20 mA of current to each LED, the voltage dropped across each one will be the "maximum" 3.30 V. Is this correct?
To begin planning my project, I wanted to find a suitable DC power supply. I've got a spare DC "wall wart" that outputs a maximum of 300 mA at a range of different voltages (it's adjustable from 1.5 V to 12 V). When I tested it with my voltmeter last night, I found that it actually outputs roughly 3 V more than the number shown on the adjustment. For example, when I set it to the 7.5 V setting, my voltmeter read 10.23 V directly across the leads. That's no big deal, though...
As for the layout of my circuit (basically just an LED array), I plan on connecting several LEDs in series with a resistor, then connecting these groups in parallel with each other.
First off, I want the LEDs to be as bright as possible without overdriving them and shortening their lifespan. So, would it be okay for me to supply them with 20 mA, resulting in each one dropping 3.30 V (the "maximum")? Or should I go with the "typical" voltage drop of 3.15 V and only supply them with 5 mA? If I do that, will the LEDs be significantly dimmer than they would be at 20 mA and 3.30 V?
Assuming I choose to use the DC "wall wart" at 10.23 V, and I supply each LED with 5 mA (3.15 V across each one), then I could construct my array from groups of three LEDs in series with a resistor, all connected in parallel with each other (I'd have only two LEDs in one of the series groups, since I'm using a total of 185 LEDs). Now I need to choose my resistor. This is where I get confused.
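Just to double-check my layout math before going further, here's how I'm counting the branches (this is purely my own bookkeeping; the names are mine):

```python
# Counting the branches for my planned array (my own bookkeeping).
TOTAL_LEDS = 185
LEDS_PER_BRANCH = 3  # three LEDs in series with one resistor per branch

full_branches = TOTAL_LEDS // LEDS_PER_BRANCH   # 61 branches of three LEDs
leftover_leds = TOTAL_LEDS % LEDS_PER_BRANCH    # 2 LEDs left over for one short branch
total_branches = full_branches + (1 if leftover_leds else 0)

print(full_branches, leftover_leds, total_branches)  # 61 2 62
```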
Here's my math (using a 10.23 V source, 5 mA current and 3.15 V drop for each LED):
[Total voltage for all LEDs in a branch]
3.15 * 3 = 9.45 V
[Desired voltage drop for resistor]
10.23 - 9.45 = 0.78 V
[Resistor voltage drop divided by desired branch current]
0.78 / 0.005 = 156 Ω
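In code form, that calculation is just this (all the numbers are the ones above; the variable names are mine):

```python
# Ideal series resistor for one branch, using the numbers above.
V_SUPPLY = 10.23        # measured wall-wart output, volts
VF_TYPICAL = 3.15       # "typical" forward voltage per LED, volts
LEDS_IN_SERIES = 3
I_TARGET = 0.005        # desired branch current, amps (5 mA)

v_leds = VF_TYPICAL * LEDS_IN_SERIES   # 9.45 V across the three LEDs
v_resistor = V_SUPPLY - v_leds         # 0.78 V left over for the resistor
r_ideal = v_resistor / I_TARGET        # ~156 ohms

print(round(v_leds, 2), round(v_resistor, 2), round(r_ideal, 1))  # 9.45 0.78 156.0
```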
Since 156 Ω isn't a standard resistor value, I'll have to use the next closest value of 160 Ω. Recalculating with that figure...
[Find the voltage dropped by the 160 Ω resistor]
160 * 0.005 = 0.8 V
[Find the total voltage drop of all three LEDs]
10.23 - 0.8 = 9.43 V
[Find the voltage drop of one LED]
9.43 / 3 = 3.14 V
These numbers are still close to my target of 3.15 V and 5 mA for each LED, but I'm really confused about why the current in the circuit stays the same even though the voltage dropped across each LED changed with my new resistor value. If each LED's voltage drop has changed (3.14 V, down from 3.15 V), wouldn't logic suggest that the current drawn by the LEDs would also change? It's very possible that I'm interpreting the math wrong, but according to my Ohm's Law equations above, the current would remain at 5 mA with either resistor. What am I missing?
Just for argument's sake, let's say that I used a 300 Ω resistor...
[Find the voltage dropped by the 300 Ω resistor]
300 * 0.005 = 1.5 V
[Find the total voltage drop of all three LEDs]
10.23 - 1.5 = 8.73 V
[Find the voltage drop of one LED]
8.73 / 3 = 2.91 V
See what I mean? By using a 300 Ω resistor, each LED's voltage drop has changed to 2.91 V, but the current remains at 5 mA. I know this is not right, but I don't understand what I'm doing wrong. If anyone could help me out, I'd REALLY appreciate it!
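Just so it's completely clear what I'm doing, here's that back-calculation for both resistor values. Note that it bakes in the assumption that the branch current stays at 5 mA no matter which resistor I use, and that's exactly the assumption I'm unsure about:

```python
# The back-calculation from above, for both resistor values.
# NOTE: this assumes the branch current is still 5 mA regardless of the
# resistor, which is exactly the assumption I'm asking about.
V_SUPPLY = 10.23       # volts
LEDS_IN_SERIES = 3
I_ASSUMED = 0.005      # 5 mA, assumed fixed (the questionable part)

for r_ohms in (160, 300):
    v_resistor = r_ohms * I_ASSUMED                        # drop across the resistor
    v_per_led = (V_SUPPLY - v_resistor) / LEDS_IN_SERIES   # what's left, split three ways
    print(r_ohms, round(v_resistor, 2), round(v_per_led, 2))
# 160 -> 0.8 V across the resistor, ~3.14 V per LED
# 300 -> 1.5 V across the resistor, ~2.91 V per LED
```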
Back to my project...
So, at this point I've figured that I need a 160 Ω resistor in series with three LEDs (using a 10.23 V source) to get my desired current of 5 mA. Assuming that's correct (PLEASE correct me if I'm wrong), then why does the "LED series parallel array wizard" (found here) tell me that I should be using 180 Ω resistors instead?
If you go to the wizard and plug in the same numbers that I've been using here, you'll see what I mean:
Source voltage = 10.23 V
Diode Vf = 3.15 V
Diode If = 5 mA
Number of LEDs = 185
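For comparison, here's what those same inputs give with my own formula from above (I have no idea what the wizard does internally; this is only my calculation):

```python
# The same inputs I gave the wizard, run through my own formula from above.
# (I don't know what the wizard does internally; this is just my math.)
v_supply = 10.23       # volts
vf = 3.15              # volts per LED
i_f = 5 / 1000         # 5 mA, in amps
leds_per_branch = 3    # the 185 total only sets how many branches there are

r_ideal = (v_supply - leds_per_branch * vf) / i_f
print(round(r_ideal))  # 156 -> I'd round to 160, but the wizard says 180
```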
As you can see, I've got the basic concepts down, but I'm getting confused in the math/theory somewhere... I suspect that it has to do with the fact that the relationship between the LED's Vf and If is not linear, and I'm just not grasping the concept correctly. So, if anyone can help straighten me out, I'd REALLY appreciate it. I'm trying to learn as much as I can here, so feel free to go into as much detail as you'd like!
Thanks in advance!
-Lee