Why We Should Connect a Resistor in Series with an LED

Thread Starter

PRAKASH KOPPAL

Joined Sep 18, 2015
1
UNDERSTANDING! WHY WE SHOULD CONNECT A RESISTOR IN SERIES WITH AN LED

Friends, we know that a resistor is a current-limiting component, and it is also a voltage divider.
I will show you how the voltage is divided by considering a simple LED circuit.
Friends, I have one red LED whose ratings are as below:

Standard red LED
Max. voltage: 1.7 V
Max. current: 10 mA

Now, to light my LED properly I would want a 1.7 V DC power supply, but a 1.7 V DC source (cell or battery) isn't available in the market. Instead, I have a 9 V DC source. So what should I do? If I connect the 9 V source directly to the LED, the LED will definitely burn out.
So, using the 9 V battery, I want to light the LED properly without burning it out.
How is that possible?
Yes, it is possible to light a 1.7 V LED from a 9 V DC source, by connecting a resistor in series with the LED.

Working of the resistor:
Friends, everyone says only that a resistor is an electronic component that limits the flow of current. But it also helps to divide the voltage.

Here is a simple formula for finding the resistor value that provides an output of 1.7 V from a 9 V battery:

Resistance = (Input Voltage − Output Voltage) / Required Current

R = (9 V − 1.7 V) / 10 mA
R = 730 Ω

This means that if I give 9 V as input to the circuit with the 730 Ω resistor in series, my LED gets 1.7 V at 10 mA.
1.7 V is dropped across the LED, and the remaining 7.3 V is dropped across the resistor. [attached schematic: led1.PNG]
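For anyone who wants to check the numbers, below is a minimal Python sketch of the same calculation (the variable names are mine, not from the post); it also computes the resistor's power dissipation as a sanity check:

```python
# Minimal sketch of the series-resistor calculation above,
# using the thread's example values (9 V supply, 1.7 V red LED, 10 mA).

V_SUPPLY = 9.0   # battery voltage, in volts
V_LED = 1.7      # LED forward voltage from the post, in volts
I_LED = 0.010    # desired LED current, in amps

# Ohm's law on the resistor: it must drop the difference between
# the supply and the LED forward voltage at the desired current.
R = (V_SUPPLY - V_LED) / I_LED
P_R = (V_SUPPLY - V_LED) * I_LED  # power dissipated in the resistor

print(f"R   = {R:.0f} ohm")          # -> 730 ohm
print(f"P_R = {P_R * 1000:.0f} mW")  # -> 73 mW, so a 1/4 W resistor is fine
```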
 

dl324

Joined Mar 30, 2015
16,918
You don't need a current-limiting resistor if you're driving the LED with a current source.

Your voltage-divider reference is a bit of a stretch, as the dynamic impedance of the LED when it's conducting 10 mA is relatively small and therefore negligible.
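For a rough sense of scale, here is a back-of-the-envelope sketch of that dynamic impedance, using the small-signal diode resistance \(r_d \approx nV_T/I\); the ideality factor n = 2 here is an assumed ballpark, not a measured value:

```python
# Rough estimate of an LED's dynamic (small-signal) resistance at 10 mA.
# r_d = n * V_T / I follows from differentiating the diode equation;
# the ideality factor n is an assumption (real LEDs vary).

V_T = 0.025  # thermal voltage kT/q at room temperature, in volts
n = 2.0      # assumed ideality factor (ballpark, not measured)
I = 0.010    # operating current, in amps

r_d = n * V_T / I
print(f"r_d = {r_d:.1f} ohm")  # ~5 ohm: tiny next to the 730 ohm resistor
```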

I get that you wanted to stress the importance of using a resistor, but capitalizing an entire sentence amounts to shouting these days and is less socially acceptable than it once was...
 

dannyf

Joined Sep 13, 2015
2,197
WHY WE SHOULD CONNECT A RESISTOR IN SERIES WITH AN LED
No reason to shout - we can all hear you perfectly fine without all the shouting.

There are plenty of cases where LEDs are driven without resistors -> the revolution in lighting over the last decade or so is largely about driving LEDs without resistors.

I would venture to say that there are far more LEDs being driven without resistors than with them. Just look at your smartphones, tablets, or TVs...
 

ErnieM

Joined Apr 24, 2011
8,377
I agree that a current source is the preferred method of driving an LED from an efficiency standpoint.

Resistance is futile.
 

Brainbox

Joined Nov 15, 2010
25
The word "divided" is used here to mean "shared", ie. the voltage of 9v is shared between the led and the resistor - which is true.
Why then use divided instead of shared?
When the resistor and LED are in parallel, they still "share" the same voltage, right?
So You could put them in parallel as well to do the job?
 

Roderick Young

Joined Feb 22, 2015
408
I see a lot of good answers here, and would add that an LED is still a diode. It may not be a silicon diode, but to the first order, it still obeys the diode equation, which is

\(I = I_0(e^{{qV}/kT} - 1)\)

where \(I\) is the current going through the diode, and \(V\) is the voltage applied across the diode. \(I_0\) is just a constant. In practical terms, we can ignore the -1 part when the LED is lit, and at room temperature, kT/q is about 0.025 volt, so the equation boils down to

\(I = I_0e^{{V}/0.025}\)

If you could precisely control the supply voltage to exactly the right value, and keep temperature absolutely steady, yes, you could drive the LED with a voltage source, and everything would be fine. However, in real life, the forward voltage varies between parts, and it's impractical to control voltage and temperature with such precision. If you do a little math on the equation above, you can see that if the supply voltage changes by as little as 0.057 volt either way, the current going through the LED could be 1/10 of nominal (making it very dim), or 10 times nominal, probably burning it out. And that doesn't even account for differences in manufacturing between individual LEDs, nor changes in temperature like a hot or cold day.

Putting a resistor in series with the LED is a cheap way of keeping the current through the LED about the same, even with variance between individual LEDs, temperature changes, and supply voltage changes.
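Here is a small Python sketch of the sensitivity argument above. The constant \(I_0\) is a placeholder chosen so the nominal operating point is 10 mA at 1.7 V; real parts will differ:

```python
import math

# Sensitivity of diode current to applied voltage under pure voltage drive.
# Shockley equation with the -1 term dropped (LED well above threshold).

V_T = 0.025                # kT/q at room temperature, in volts
I_NOM, V_NOM = 0.010, 1.7  # assumed nominal operating point (10 mA at 1.7 V)
I0 = I_NOM / math.exp(V_NOM / V_T)  # placeholder constant fit to that point

def diode_current(v):
    return I0 * math.exp(v / V_T)

for dv in (-0.057, 0.0, +0.057):
    i = diode_current(V_NOM + dv)
    print(f"V = {V_NOM + dv:.3f} V -> I = {i * 1000:6.1f} mA")
# Prints roughly 1 mA, 10 mA, and 100 mA: a tenfold swing
# in either direction from just a 57 mV change in drive voltage.
```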
 

dannyf

Joined Sep 13, 2015
2,197
In the Shockley model, as V approaches zero, the current approaches zero too.

Another way to say it is that if the current through a diode is sufficiently small, the voltage drop across it can be arbitrarily low,

regardless of what others may tell you.
 

dannyf

Joined Sep 13, 2015
2,197
in real life, the forward voltage varies between parts
Forward voltage variation only matters if you are paralleling multiple LEDs. It cannot explain why resistors are used on a single string of LEDs.

Two things about LEDs:

1) their voltage drop decreases with temperature; and
2) their (dynamic) resistance decreases with voltage/current.

So if you apply a constant voltage to a diode, once it starts to conduct it dissipates energy and heats up. As it heats up, its forward voltage drops, so at a constant applied voltage its current goes up. Higher current means higher dissipation, and higher dissipation means more heat -> you have thermal runaway. The same issue destroys BJTs.

Two solutions to this problem:
1) use a resistor -> very inefficient for high-power applications;
2) drive the LED with a current source. A current source can be modeled as a voltage source with a very large (ideally infinite) series output resistance, so the load has almost no effect on the current.
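For contrast with the exponential sensitivity shown earlier, here is a quick sketch (using the thread's example values) of how a series resistor tames the variation: the current now depends only linearly on the forward voltage, so spread and temperature drift move it by a percent or two rather than a decade:

```python
# With a series resistor, I = (V_supply - V_f) / R: the resistor absorbs
# most of the supply voltage, so shifts in forward voltage (part-to-part
# spread, heating) change the current only slightly.

V_SUPPLY = 9.0  # supply voltage from the thread's example, in volts
R = 730.0       # series resistor from the thread's example, in ohms

def led_current(v_f):
    return (V_SUPPLY - v_f) / R

for v_f in (1.6, 1.7, 1.8):  # +/- 0.1 V of forward-voltage spread
    print(f"Vf = {v_f:.1f} V -> I = {led_current(v_f) * 1000:.2f} mA")
# ~10.14, 10.00, 9.86 mA: a 0.1 V shift moves the current by only ~1.4%.
```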
 

Wendy

Joined Mar 24, 2008
23,421
In the Shockley model, as V approaches zero, the current approaches zero too.

Another way to say it is that if the current through a diode is sufficiently small, the voltage drop across it can be arbitrarily low,

regardless of what others may tell you.
Not if you want to use the LED for its intended purpose.
 

Roderick Young

Joined Feb 22, 2015
408
Forward voltage variation only matters if you are paralleling multiple LEDs. It cannot explain why resistors are used on a single string of LEDs.

Two things about LEDs:

1) their voltage drop decreases with temperature; and
2) their (dynamic) resistance decreases with voltage/current.

So if you apply a constant voltage to a diode, once it starts to conduct it dissipates energy and heats up. As it heats up, its forward voltage drops, so at a constant applied voltage its current goes up. Higher current means higher dissipation, and higher dissipation means more heat -> you have thermal runaway. The same issue destroys BJTs.

Two solutions to this problem:
1) use a resistor -> very inefficient for high-power applications;
2) drive the LED with a current source. A current source can be modeled as a voltage source with a very large (ideally infinite) series output resistance, so the load has almost no effect on the current.
Yes, I agree. I was a little cavalier with my words. I meant that \(I_0\) would vary due to manufacturing tolerances, with the practical effect that the forward voltage at nominal operating current would vary between parts.
 