# Calculating Forward Current

Discussion in 'General Electronics Chat' started by Markopoelo, Jun 1, 2013.

1. ### Markopoelo Thread Starter New Member

I have a red 10 mm LED with a maximum forward voltage rating of 2.4 V and a forward current rating of 20 mA. I want to calculate the series resistance needed to ensure that I don't exceed the maximum forward current. It is my understanding that we use R = (Vs - Vf)/If.

I have a supply voltage of 6.0 V, and I use the maximum forward voltage rating of 2.4 V. My math says (6.0 - 2.4)/0.020 = 180. As I understand it, 180 ohms should keep me at or below the 20 mA rating. It does not, and I am frustrated and confused; 180 ohms is roughly half of the resistance I need. When I measure the current, I get a reading of about 32 mA, and I do make sure I use the meter correctly by inserting it in series with the circuit.

When I use R = V/I with the measured values, my numbers are dead on: the current is correct, and the forward voltage at the LED is around 1.8 V. It seems to me that by subtracting the maximum forward voltage from the 6.0 V supply, the formula is effectively saying my supply voltage is 3.6 V. What am I doing wrong? What is the importance of the forward voltage?
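A quick sketch of the arithmetic in the post above may help show where the discrepancy comes from. The values (6.0 V supply, 2.4 V maximum Vf, 1.8 V measured Vf, 20 mA target) are from the thread; the helper names are only illustrative. This doesn't account for supply tolerance or meter burden, so it won't reproduce the 32 mA reading exactly, but it shows why the current lands above 20 mA when the resistor is sized with the maximum forward voltage:

```python
def series_resistor(v_supply, v_forward, i_target):
    # R = (Vs - Vf) / If
    return (v_supply - v_forward) / i_target

def led_current(v_supply, v_forward, resistance):
    # I = (Vs - Vf) / R
    return (v_supply - v_forward) / resistance

# Sizing with the 2.4 V maximum Vf gives the 180 ohms from the post.
r = series_resistor(6.0, 2.4, 0.020)

# But the LED actually drops only ~1.8 V, so more voltage appears
# across the resistor and the current comes out above the 20 mA target.
i = led_current(6.0, 1.8, r)
print(round(r), round(i * 1000, 1))  # 180 ohms, ~23.3 mA
```

The smaller the real forward drop is compared with the rated maximum, the further above the target the current lands.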

2. ### bertus
Hello,

Is the power supply a regulated one?
If not, its actual voltage can be much higher than the setting.

Bertus

3. ### LDC3 Active Member

As with many components, each LED is a little different. The maximum forward voltage is the point at which an LED would be rejected for being out of specification; most LEDs actually have a forward voltage well below that maximum.

4. ### Markopoelo Thread Starter New Member

Hi Bertus:
I believe it is. I can adjust the voltage from 3 to 12 volts at 1 A.

5. ### crutschow Expert

To ensure that the current never goes over 20 mA, you need to use the minimum forward drop of the LED, not the maximum, when you calculate the series resistor value needed.
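The advice above can be sketched in a few lines. This assumes the ~1.8 V the original poster keeps measuring is close to the minimum forward drop; in practice you would take the minimum Vf from the datasheet if it is listed:

```python
def series_resistor(v_supply, v_forward, i_max):
    # R = (Vs - Vf) / If -- the lowest Vf is the worst case,
    # since it leaves the most voltage across the resistor.
    return (v_supply - v_forward) / i_max

r_max_vf = series_resistor(6.0, 2.4, 0.020)  # 180 ohms: lets current exceed 20 mA
r_min_vf = series_resistor(6.0, 1.8, 0.020)  # 210 ohms: caps current at 20 mA
print(round(r_max_vf), round(r_min_vf))
```

Picking the next standard value above the worst-case result (e.g. 220 ohms) keeps the current safely under the rating for any LED in the batch.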

6. ### bertus
Hello,

It is best to measure the voltage from the power supply directly.

I once had an unregulated selectable power supply that gave almost 15 volts on its 12 volt setting.

Bertus

7. ### Markopoelo Thread Starter New Member

Well, mine is regulated, bertus. The actual voltages measured with the multimeter match what's shown on the voltage selector of the power supply.

@crutschow, thanks for your response, sir. The minimum forward voltage is not shown on the wrapper; I'm guessing it is the 1.8 volts I keep seeing. I even increased the supply to 12 V while increasing the resistance accordingly and still arrive at a 1.8 V forward voltage. I think I will just use the tried and true I = V/R to ensure I keep the current where it's supposed to be. I hope the forward voltage isn't too significant, as LDC3 pointed out, because I can't adjust it. You gentlemen have been a big help. Thanks to you all.