Why do we need a current limiting resistor with LEDs?

Thread Starter

btebo

Joined Jul 7, 2017
100
OK - dumb question here... but trying to TOTALLY understand!

I know too much current will damage/destroy an LED and a current limiting resistor is required.

If supply voltage is 5V and Vf of the LED is 1.7V, there is 3.3V that must be dissipated. Using the formula R=(Vs-Vf)/ILED will give me the correct size of the resistor.

If supply voltage was 1.7V and Vf of the LED is 1.7V, the formula would result in R=0 ohms.

But then there would be nothing to limit the current. Would the LED only draw what it needed?
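 
(For concreteness, here is that calculation as a quick Python sketch; the 20 mA target is an assumed example value, not from any datasheet.)

```python
# Resistor sizing from the formula in the post: R = (Vs - Vf) / I_led.
# The 20 mA target current is an assumed example value.

def led_resistor(v_supply, v_forward, i_led):
    """Series resistance that drops the excess supply voltage at the target current."""
    return (v_supply - v_forward) / i_led

print(f"{led_resistor(5.0, 1.7, 0.020):.1f} ohms")  # 165.0 -> pick the next standard value
print(f"{led_resistor(1.7, 1.7, 0.020):.1f} ohms")  # 0.0 -> the degenerate case in the question
```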
 

wayneh

Joined Sep 9, 2010
17,496
OK - dumb question here... but trying to TOTALLY understand!

I know too much current will damage/destroy an LED and a current limiting resistor is required.

If supply voltage is 5V and Vf of the LED is 1.7V, there is 3.3V that must be dissipated. Using the formula R=(Vs-Vf)/ILED will give me the correct size of the resistor.

If supply voltage was 1.7V and Vf of the LED is 1.7V, the formula would result in R=0 ohms.

But then there would be nothing to limit the current. Would the LED only draw what it needed?
Yes. If you look closely at the specs of an LED, you’ll find a current versus voltage curve. A typical LED goes from zero current to max current (destruction) over a range of about 0.5V. That relationship changes with heat: current increases with temperature, but not enough to go into runaway as long as the voltage is not already near the upper limit.

So running a 20mA LED at 5mA by using a well-regulated voltage is completely doable. It’s just not all that practical. For one thing, LEDs vary, and dialing in a precise voltage for one LED won’t help you with the next one.
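 
To see how steep that curve is, here is a minimal sketch using the ideal-diode (Shockley) equation. The saturation current and ideality factor below are invented ballpark values for a red LED, not numbers from any datasheet:

```python
import math

# Ideal (Shockley) diode model: I = Is * (exp(V / (n * Vt)) - 1).
I_S = 1e-18    # saturation current in amps (assumed)
N = 2.0        # ideality factor (assumed)
V_T = 0.0259   # thermal voltage at ~300 K, volts

def led_current(v):
    return I_S * (math.exp(v / (N * V_T)) - 1)

for v in (1.5, 1.6, 1.7, 1.8, 1.9, 2.0):
    print(f"{v:.1f} V -> {led_current(v) * 1000:8.3f} mA")
```

With these made-up constants the current climbs from microamps to tens of milliamps over about half a volt, which is the cliff being described.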
 

Thread Starter

btebo

Joined Jul 7, 2017
100
Yes. If you look closely at the specs of an LED, you’ll find a current versus voltage curve. A typical LED goes from zero current to max current (destruction) over a range of about 0.5V. That relationship changes with heat: current increases with temperature, but not enough to go into runaway as long as the voltage is not already near the upper limit.

So running a 20mA LED at 5mA by using a well-regulated voltage is completely doable. It’s just not all that practical. For one thing, LEDs vary, and dialing in a precise voltage for one LED won’t help you with the next one.
Thank you! But say my power supply is set to deliver 1.7V, 1A. Theoretically, would the LED only draw what it needs (for example, 20mA)?
 

Papabravo

Joined Feb 24, 2006
21,159
Thank you! But say my power supply is set to deliver 1.7V, 1A. Theoretically, would the LED only draw what it needs (for example, 20mA)?
Any particular LED will work with a precisely set power supply, but the next LED may or may not. The resistor is a mechanism for reducing the voltage across the LED as the current increases: if the LED draws more current, the resistor drops more of the supply voltage, which lowers the voltage across the LED and pulls the current back down.
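 
That balancing act can be sketched numerically. In the Python below the diode constants are assumed, purely for illustration: the diode law says more voltage means much more current, while the resistor says more current leaves less voltage for the LED, and bisection finds the point where the two agree.

```python
import math

I_S, N_VT = 1e-18, 2.0 * 0.0259   # assumed Shockley-equation constants
V_S, R = 5.0, 165.0               # 5 V supply, 165-ohm series resistor

def diode_current(v):
    return I_S * (math.exp(v / N_VT) - 1)   # current the diode passes at v

def resistor_current(v):
    return (V_S - v) / R                    # current the resistor allows if the LED drops v

lo, hi = 0.0, V_S
for _ in range(60):                         # bisect until the two currents match
    mid = (lo + hi) / 2
    if diode_current(mid) > resistor_current(mid):
        hi = mid                            # diode wants more than the resistor allows
    else:
        lo = mid
print(f"Operating point: V_led = {lo:.3f} V, I = {resistor_current(lo) * 1000:.1f} mA")
```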
 

WBahn

Joined Mar 31, 2012
29,979
Thank you! But say my power supply is set to deliver 1.7V, 1A. Theoretically, would the LED only draw what it needs (for example, 20mA)?
THAT particular LED might draw 20 mA at 1.7 V. After it heats up for a bit it might draw 30 mA at that same voltage (just making up numbers, but the trend is the right direction).

But when you take the next LED from the same batch it might only draw 0.5 mA at 1.7 V and the LED after that might draw 50 mA at 1.7 V.
 

MrChips

Joined Oct 2, 2009
30,720
The problem is current regulation, not voltage regulation.
You don't know that the LED will draw 20mA.

Look at the LED I-V curve.


Below the LED turn-on voltage, the slope is not steep and the current is very low.
When the LED turns on, you are riding on the steep part of the I-V curve. A slight change in the voltage will result in a large change in the current. Hence you want to control the current, not the voltage.

With 0Ω resistance in series, you have no current control.
The larger the value of the series resistance the better the current control. Hence you want the supply voltage to be higher than the LED turn-on voltage so that you can insert a large series resistance.
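 
A minimal numeric sketch of that last point, treating the LED as a fixed 1.7V drop that varies by ±0.1V from part to part (invented numbers, chosen only to show the trend):

```python
# Current for a nominal 20 mA design at three supply/resistor combinations,
# as the LED's forward drop wanders between 1.6 V and 1.8 V.

def current(v_supply, r, v_led):
    return (v_supply - v_led) / r

for v_supply, r in ((2.0, 15.0), (5.0, 165.0), (12.0, 515.0)):
    i_lo = current(v_supply, r, 1.8)    # part with a high forward drop
    i_hi = current(v_supply, r, 1.6)    # part with a low forward drop
    print(f"Vs = {v_supply:4.1f} V, R = {r:5.0f} ohm: "
          f"{i_lo * 1000:4.1f} to {i_hi * 1000:4.1f} mA")
```

With only 0.3V of headroom the current swings about ±33%; with 10.3V of headroom and a big resistor it swings about ±1%.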
 

Thread Starter

btebo

Joined Jul 7, 2017
100
I guess what I'm trying to wrap my head around is this....

My wall outlet can provide a maximum of 15A of current (limited by circuit breaker). If I plug a hair dryer in that draws 10A, it only draws the 10A.

I guess what I'm really trying to understand is, in a perfect world, with a perfect LED with Vf of 1.7V and ILED of 20mA, would it only draw 20 mA?

My question is theoretical - not practical.

Maybe I'm confused with the phrase "draws xxxA"?
 
The forward voltage is temperature dependent, batch dependent, and color dependent. It also varies with drive current, so there is a brightness dependency as well.

It's BEST to regulate the current into the LED.

Transistors suffer "thermal runaway" for the same reason: when the device heats up it conducts more, and when it conducts more it heats up more.
 

MrChips

Joined Oct 2, 2009
30,720
A hair dryer is not an LED and vice versa.
You cannot compare the two because they have different I-V characteristics.
 

Thread Starter

btebo

Joined Jul 7, 2017
100
A hair dryer is not an LED and vice versa.
You cannot compare the two because they have different I-V characteristics.
I know;)
And I promise to everyone here I will ALWAYS use a current limiting resistor.:)

And thank you for taking time to explain the I-V curve and the temperature response of electronic components. I've seen this and "sort of" understood what it was trying to tell me, but it makes much more sense now.
 

wayneh

Joined Sep 9, 2010
17,496
I guess what I'm really trying to understand is, in a perfect world, with a perfect LED with Vf of 1.7V and ILED of 20mA, would it only draw 20 mA?
Yes. For many typical devices and loads, the capacity of the supply has no bearing on what the load demands. You plug a lamp into the wall and it doesn't care that 15A or more could be available - it draws what current it was designed to draw.

Unfortunately an LED is not like an appliance and is a poor choice for illustrating that concept. It's more like a fuse. If you overtax it, it pops. A lightbulb can tolerate a wide voltage swing and still function. Not so an LED.
 

Thread Starter

btebo

Joined Jul 7, 2017
100
Yes. For many typical devices and loads, the capacity of the supply has no bearing on what the load demands. You plug a lamp into the wall and it doesn't care that 15A or more could be available - it draws what current it was designed to draw.

Unfortunately an LED is not like an appliance and is a poor choice for illustrating that concept. It's more like a fuse. If you overtax it, it pops. A lightbulb can tolerate a wide voltage swing and still function. Not so an LED.
So, putting all of this together, even small variations of voltage or temperature will radically change the current through the LED. The LED has so little tolerance that it will pop quickly (like my ex-wife)!;)
 

Ylli

Joined Nov 13, 2015
1,086
I think the best way to think about this is that household appliances and many other electronic devices operate on the applied voltage. LEDs respond to current flow. An LED is a current-driven device, not a voltage-driven device.

The *best* way to drive an LED is with a constant current source. A fixed voltage well above the LED voltage drop with a series resistor simulates a constant current source with limited compliance.

When working with LEDs, don't think voltage, think current.
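 
Here is a small sketch of that "limited compliance" idea; the 12V supply and 515Ω resistor are just assumed example values:

```python
# A voltage source behind a big resistor delivers roughly Vs / R into any load
# whose drop is small compared with Vs - a pseudo current source.
V_S, R = 12.0, 515.0   # assumed values

for v_load in (0.0, 1.7, 2.1, 3.5, 6.0, 10.0):
    i = (V_S - v_load) / R
    print(f"load drop {v_load:4.1f} V -> {i * 1000:5.1f} mA")
```

The current stays near 20mA for any LED-sized drop and only collapses as the load voltage approaches the 12V supply: that is the compliance limit.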
 

Thread Starter

btebo

Joined Jul 7, 2017
100
I think the best way to think about this is that household appliances and many other electronic devices operate on the applied voltage. LEDs respond to current flow. An LED is a current-driven device, not a voltage-driven device.

The *best* way to drive an LED is with a constant current source. A fixed voltage well above the LED voltage drop with a series resistor simulates a constant current source with limited compliance.

When working with LEDs, don't think voltage, think current.
Thanks, Ylli. That helps a lot. I never thought of the voltage source/series resistor as a current source - I was always (incorrectly) interpreting it as a voltage divider circuit (boy was THAT stupid)! Now that you point it out, I understand where my thinking was very flawed....

And thanks again to everyone here who has patience with this NOOB!
 

MrChips

Joined Oct 2, 2009
30,720
In other words:

An ideal current source has infinite source resistance.
An ideal voltage source has zero source resistance.

btw, note the opposite:

An ideal ammeter has zero resistance.
An ideal voltmeter has infinite resistance.
 

WBahn

Joined Mar 31, 2012
29,979
I guess what I'm trying to wrap my head around is this....

My wall outlet can provide a maximum of 15A of current (limited by circuit breaker). If I plug a hair dryer in that draws 10A, it only draws the 10A.

I guess what I'm really trying to understand is, in a perfect world, with a perfect LED with Vf of 1.7V and ILED of 20mA, would it only draw 20 mA?

My question is theoretical - not practical.
Yes, in theory, in an ideal world, you could get a particular current in an LED by placing across it exactly the voltage that results in the desired current. And yes, as long as the voltage supply you are using can deliver at least that much current, the LED would pull just that amount of current and no more, because it is a perfect LED in an ideal world.

Maybe I'm confused with the phrase "draws xxxA"?
This is the more important point to get unconfused about. First off, strictly speaking, no one should say things like "the LED draws 20 mA at 1.7 V," which is not to say that people won't be sloppy and say it anyway.

An LED has a rated forward current, If, of say 20 mA. It also has a rated forward voltage, Vf, of say 1.7 V. These are not hard values; they describe the typical LED over a whole bunch of LEDs and don't apply to any one LED except as an approximation. What it really means is that we made a whole bunch of LEDs, put them in a test fixture, ran 20 mA through them, and measured the voltage across them at a specific temperature. We then took the voltage readings for hundreds, if not thousands, of LEDs, examined the data, decided on a lower limit and an upper limit for the voltage, found the mean (or perhaps the median) of all the ones between those limits, and call that the typical Vf. Now, when we manufacture them, we measure every single LED we make under those same test conditions, throw out (or send to a secondary market) all of the ones that are above or below the limits, and sell the rest. In the data sheet, we publish the min, max, and typical values.

We may also publish a curve showing the "typical" voltage versus current -- "typical" meaning the average over a whole bunch of LEDs that are within a small fraction of the typical Vf at If -- and also a plot showing the typical Vf at If for different temperatures. What we usually won't do is publish detailed information about the complete distribution, because if your circuit needs that information in order to work properly, then you probably need to redesign your circuit in the first place.
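 
A quick Monte-Carlo sketch of why that spread matters. Every constant here is invented (a 1.70 V typical Vf, a 50 mV part-to-part sigma, a simple exponential diode model); the point is the shape of the result, not the exact numbers:

```python
import math
import random

random.seed(1)
N_VT = 2.0 * 0.0259    # assumed ideality factor times thermal voltage
I_RATED = 0.020        # every part passes 20 mA at its own true Vf

for _ in range(8):
    vf_part = random.gauss(1.70, 0.050)     # this part's actual Vf at 20 mA
    # Current if exactly 1.70 V is forced across the part (no resistor):
    i_fixed = I_RATED * math.exp((1.70 - vf_part) / N_VT)
    # Current from a 5 V supply through 165 ohms (LED settles near its own Vf):
    i_res = (5.0 - vf_part) / 165.0
    print(f"Vf = {vf_part:.3f} V: {i_fixed * 1000:7.2f} mA at a flat 1.70 V, "
          f"{i_res * 1000:5.2f} mA with the resistor")
```

Forcing a flat 1.70 V scatters the current over roughly an order of magnitude, while the 5 V + 165 Ω drive holds every part within a few percent of 20 mA.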
 

BobTPH

Joined Jun 5, 2013
8,813
Your example of a hair dryer is a good one to explain the difference. A heating element and an LED react oppositely to increased temperature. A heating element's resistance increases as it gets hotter, so it is self-limiting: if it gets too hot, the current drops.

An LED does the opposite. When the temperature increases, the current at a given voltage increases, making it even hotter. This is called thermal runaway.

And don't be so sure appliances do not have current limiting. When a motor stalls, it draws more current and heats up. Often they include a thermal breaker to cut off the current, or even a fuse.
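 
A toy runaway loop makes the feedback visible. Every number here (diode constants, temperature coefficient, thermal resistance) is an invented ballpark value; the LED is driven from a stiff 1.75V with no resistor:

```python
import math

V_APPLIED = 1.75        # stiff voltage forced across the LED, no resistor
N_VT = 2.0 * 0.0259     # assumed exponential slope of the diode
TC = 0.002              # Vf falls roughly 2 mV per deg C (ballpark)
R_TH = 400.0            # junction-to-ambient thermal resistance, K/W (assumed)

t = 25.0                # junction temperature, deg C
for step in range(10):
    vf = 1.70 - TC * (t - 25.0)                      # hotter junction -> lower Vf
    i = 0.020 * math.exp((V_APPLIED - vf) / N_VT)    # current at the forced voltage
    if i > 1.0:
        print("runaway: current passed 1 A, and the LED is toast")
        break
    t = 25.0 + R_TH * V_APPLIED * i                  # heating from dissipated power
    print(f"step {step}: T = {t:6.1f} C, I = {i * 1000:6.1f} mA")
```

A series resistor breaks this loop: any extra current immediately costs extra resistor drop, so the current cannot spiral.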

Bob
 

Thread Starter

btebo

Joined Jul 7, 2017
100
Yes, in theory, in an ideal world, you could get a particular current in an LED by placing across it exactly the voltage that results in the desired current. And yes, as long as the voltage supply you are using can deliver at least that much current, the LED would pull just that amount of current and no more, because it is a perfect LED in an ideal world.



This is the more important point to get unconfused about. First off, strictly speaking, no one should say things like "the LED draws 20 mA at 1.7 V," which is not to say that people won't be sloppy and say it anyway.

An LED has a rated forward current, If, of say 20 mA. It also has a rated forward voltage, Vf, of say 1.7 V. These are not hard values; they describe the typical LED over a whole bunch of LEDs and don't apply to any one LED except as an approximation. What it really means is that we made a whole bunch of LEDs, put them in a test fixture, ran 20 mA through them, and measured the voltage across them at a specific temperature. We then took the voltage readings for hundreds, if not thousands, of LEDs, examined the data, decided on a lower limit and an upper limit for the voltage, found the mean (or perhaps the median) of all the ones between those limits, and call that the typical Vf. Now, when we manufacture them, we measure every single LED we make under those same test conditions, throw out (or send to a secondary market) all of the ones that are above or below the limits, and sell the rest. In the data sheet, we publish the min, max, and typical values.

We may also publish a curve showing the "typical" voltage versus current -- "typical" meaning the average over a whole bunch of LEDs that are within a small fraction of the typical Vf at If -- and also a plot showing the typical Vf at If for different temperatures. What we usually won't do is publish detailed information about the complete distribution, because if your circuit needs that information in order to work properly, then you probably need to redesign your circuit in the first place.
Thank you, WBahn! Another wonderful, in-depth answer. I'm beginning to see the light!
 

MrAl

Joined Jun 17, 2014
11,396
OK - dumb question here... but trying to TOTALLY understand!

I know too much current will damage/destroy an LED and a current limiting resistor is required.

If supply voltage is 5V and Vf of the LED is 1.7V, there is 3.3V that must be dissipated. Using the formula R=(Vs-Vf)/ILED will give me the correct size of the resistor.

If supply voltage was 1.7V and Vf of the LED is 1.7V, the formula would result in R=0 ohms.

But then there would be nothing to limit the current. Would the LED only draw what it needed?

Hi,

Here is another way to look at it. This builds on the idea that a hair dryer is a voltage operated device while an LED is a current operated device.

First, what is the difference between voltage operated and current operated?
Voltage operated means you apply a voltage and you are then pretty much good to go.
Current operated means you apply a current and you are then pretty much good to go.
The voltage operated device can deal with a variation in voltage of say 10 percent without too much difference in current.
The current operated device can deal with a variation in current of say 10 percent without too much difference in voltage.
There is also the question of sensitivity.
The current in the hair dryer is only mildly sensitive to a change in applied voltage.
The voltage in the LED is only mildly sensitive to a change in applied current. However the current in the LED is very sensitive to a change in voltage, if in fact a voltage was applied instead of a current.
A small light bulb is a better device to compare to an LED.
The light bulb is best driven with a voltage source because the life of the bulb is highly dependent on the voltage level.
The LED is best driven with a current source because the life of the LED is highly dependent on the current level.
So applying a voltage to a light bulb is better than applying a current to the light bulb, and applying a current to an LED is better than applying a voltage to the LED.

Now when we use a resistor with the LED we are taking a small step back from a true current source. It turns out that the resistor helps the voltage source act as a pseudo current source: if the resistor is selected properly, a large change in LED voltage produces only a small change in current.

Also note that the LED voltage is not the same kind of specification as the voltage of a bulb. The LED voltage is called the "characteristic" voltage because it helps to define the overall character of the device; it is not a strict operating-point or design spec.
In other words, if we have a 1.7v LED we know we need *around* 1.7v to get it to operate, and if we had a 3.5v LED we know we need more available voltage to get it to work with a resistor. So we can compare the voltage character of each LED to get a ballpark idea of what to expect, but we cannot use those voltage specs as a design point where we set it and go, like we can with the light bulb.

This question comes up quite a bit in forums. Probably about 20 times in the past 10 years. That is because we are so used to using voltage operated devices that the LED being current operated is a new experience for us. Also because we often learn how to use voltage sources like batteries and power supplies before we learn how to use current sources. So you are in no way alone with this.

To learn more, bias an LED with a resistor and voltage source and do some measurements. Start with maybe a 5v source and 1k resistor, then lower the resistor to maybe 330 ohms. Note the percentage change in voltage across the LED vs the change in current through the LED.
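 
If you want to preview that experiment before wiring it up, here is a simulated version. The diode constants (Is = 1e-18 A, n = 2) are assumed, so the exact numbers are illustrative only:

```python
import math

I_S, N_VT = 1e-18, 2.0 * 0.0259   # assumed Shockley-equation constants
V_S = 5.0

def operating_point(r):
    """Bisect for the LED voltage where the diode law meets the resistor line."""
    lo, hi = 0.0, V_S
    for _ in range(60):
        mid = (lo + hi) / 2
        if I_S * (math.exp(mid / N_VT) - 1) > (V_S - mid) / r:
            hi = mid
        else:
            lo = mid
    return lo, (V_S - lo) / r

(v1, i1), (v2, i2) = operating_point(1000.0), operating_point(330.0)
print(f"1k ohm : V_led = {v1:.3f} V, I = {i1 * 1000:.2f} mA")
print(f"330 ohm: V_led = {v2:.3f} V, I = {i2 * 1000:.2f} mA")
print(f"LED voltage changed {100 * (v2 - v1) / v1:.1f} %, "
      f"current changed {100 * (i2 - i1) / i1:.0f} %")
```

With these assumptions the LED voltage moves only about 3% while the current moves nearly 200% - exactly the lopsided sensitivity described above.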
 

Thread Starter

btebo

Joined Jul 7, 2017
100
Hi,

Here is another way to look at it. This builds on the idea that a hair dryer is a voltage operated device while an LED is a current operated device.

First, what is the difference between voltage operated and current operated?
Voltage operated means you apply a voltage and you are then pretty much good to go.
Current operated means you apply a current and you are then pretty much good to go.
The voltage operated device can deal with a variation in voltage of say 10 percent without too much difference in current.
The current operated device can deal with a variation in current of say 10 percent without too much difference in voltage.
There is also the question of sensitivity.
The current in the hair dryer is only mildly sensitive to a change in applied voltage.
The voltage in the LED is only mildly sensitive to a change in applied current. However the current in the LED is very sensitive to a change in voltage, if in fact a voltage was applied instead of a current.
A small light bulb is a better device to compare to an LED.
The light bulb is best driven with a voltage source because the life of the bulb is highly dependent on the voltage level.
The LED is best driven with a current source because the life of the LED is highly dependent on the current level.
So applying a voltage to a light bulb is better than applying a current to the light bulb, and applying a current to an LED is better than applying a voltage to the LED.

Now when we use a resistor with the LED we are taking a small step back from a true current source. It turns out that the resistor helps the voltage source act as a pseudo current source: if the resistor is selected properly, a large change in LED voltage produces only a small change in current.

Also note that the LED voltage is not the same kind of specification as the voltage of a bulb. The LED voltage is called the "characteristic" voltage because it helps to define the overall character of the device; it is not a strict operating-point or design spec.
In other words, if we have a 1.7v LED we know we need *around* 1.7v to get it to operate, and if we had a 3.5v LED we know we need more available voltage to get it to work with a resistor. So we can compare the voltage character of each LED to get a ballpark idea of what to expect, but we cannot use those voltage specs as a design point where we set it and go, like we can with the light bulb.

This question comes up quite a bit in forums. Probably about 20 times in the past 10 years. That is because we are so used to using voltage operated devices that the LED being current operated is a new experience for us. Also because we often learn how to use voltage sources like batteries and power supplies before we learn how to use current sources. So you are in no way alone with this.
You are SO correct! I now recognize a lot of my confusion is in understanding the difference between voltage sources vs. current sources. Just a slow learner here!

To learn more, bias an LED with a resistor and voltage source and do some measurements. Start with maybe a 5v source and 1k resistor, then lower the resistor to maybe 330 ohms. Note the percentage change in voltage across the LED vs the change in current through the LED.
Excellent idea! I've taken many electronics courses at our community college here, and remember one experiment of biasing a diode and drawing its curve. Now that I've learned so much here, I'll re-run the experiment - it will make MUCH more sense this time!
 