# If the source voltage is the same as LED voltage, is a resistor necessary?

Discussion in 'General Electronics Chat' started by geratheg, Jul 27, 2014.

1. ### geratheg Thread Starter Member

Jul 11, 2014
107
3
If the source voltage is the same as the voltage drop across a single LED, is a resistor in series necessary? If so, why?

2. ### crutschow Expert

Mar 14, 2008
16,209
4,333
Yes, a resistor is necessary. The LED is basically a current-operated device, not a voltage-operated one. It has a very low dynamic impedance (similar to a standard diode), so its voltage changes little for a correspondingly large change in current (or, conversely, a small voltage change generates a large current change).

And the voltage drop varies from unit to unit and with other factors such as temperature, thus there's no way to keep the source voltage equal to the LED voltage since the LED voltage is a variable. That's why LEDs are driven by a current source or a voltage source in series with a resistance to regulate the current.

Sometimes a button cell will be used to power an LED without an external resistor, but in that case it's the high internal resistance of such a cell that limits the current.
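To make that concrete, here's a minimal sketch of sizing the series resistor. The formula is the standard ohm's-law calculation; the 5 V supply, 2.0 V Vf, and 10 mA target are made-up example values, not from any datasheet:

```python
# Size a series resistor so the resistor, not the LED curve, sets the current.
# The resistor drops the difference between the supply and the LED Vf.
def series_resistor(v_supply, v_led, i_led):
    """Resistance (ohms) for a target LED current (amps)."""
    if v_supply <= v_led:
        raise ValueError("supply must exceed the LED forward voltage")
    return (v_supply - v_led) / i_led

# Example: 5 V supply, Vf ~= 2.0 V, target 10 mA -> 300 ohms.
r = series_resistor(5.0, 2.0, 0.010)

# With the resistor in place, a 0.1 V unit-to-unit shift in Vf only
# moves the current by ~3%, instead of a large exponential swing:
i_high_vf = (5.0 - 2.1) / r   # ~9.7 mA instead of 10 mA
print(r, round(i_high_vf * 1000, 2))
```

The larger the voltage headroom across the resistor, the less the Vf spread matters.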

3. ### Shagas Active Member

May 13, 2013
802
74
I've seen LED V-I curves where the difference between "Ion" and Imax is anywhere from 100 mV to 500 mV.
You can probably get away with not using a resistor with higher-voltage LEDs (green, blue, white, etc.). Try it out: measure the current across a voltage range and see how much it varies across a few LEDs of the same type.

Jul 27, 2014
189
10
We argue over a 1-cent part? Probably 0.1¢ in high quantity.

5. ### ErnieM AAC Fanatic!

Apr 24, 2011
7,906
1,789
Back in the '80s I used to take a small watch battery and surround it with heat-shrink tubing, leaving the ends open. Then I'd take a single LED with flat leads, clip the leads at an angle, then insert the leads through my tie and back them with the battery.

When I used just the right LED and just the right battery combo it would run for about a day, as the LED drew enough current for the battery's internal resistance to work as the limiter.

It's been 30 years since then and I have never seen a reason to repeat the experiment.

Get a resistor.

6. ### Externet AAC Fanatic!

Nov 29, 2005
955
93
If an LED is rated at, say, Vf = 1.88 V for a 10 mA current, the LED can be operated perfectly well from a fixed 1.88 V supply with no need for any resistor, and it will pass 10 mA.

Check the data sheet for the particular LED you want to operate with no resistor, apply that exact voltage Vf, and use an accurate voltmeter.
If there is no data sheet for it, measure Vf at the desired current. Then the supply can be used without a resistor.

7. ### Alec_t AAC Fanatic!

Sep 17, 2013
7,023
1,453
You might get away with it. Or, LED dissipation = increased temperature = lower Vf = greater current = greater heating = .......POP!
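Rough numbers behind that feedback chain, as a sketch: the -2 mV/°C coefficient and the ~50 mV exponential slope below are typical orders of magnitude assumed for illustration, not taken from any datasheet.

```python
import math

# With a fixed-voltage drive, any drop in Vf shows up as extra voltage
# across the exponential diode curve.
TC = 0.002      # Vf falls ~2 mV per deg C (assumed typical value)
N_VT = 0.05     # effective exponential slope n*Vt in volts (assumed)

delta_t = 25.0                     # a 25 deg C temperature rise...
delta_vf = TC * delta_t            # ...lowers Vf by 50 mV...
ratio = math.exp(delta_vf / N_VT)  # ...which multiplies the current
print(round(ratio, 2))             # ~2.72x: more current, more heat, repeat
```

With a series resistor, that extra 50 mV lands mostly across the resistor instead, and the loop never gets going.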

8. ### MrAl Distinguished Member

Jun 17, 2014
3,617
760
Hi,

Sorry, you cannot just look at the data sheet to find the voltage. Data sheets always show a range of voltages, because the voltage of each unit may be a little different, and depending on the LED, one may draw much more current than another at the same set voltage.

Yes, it should be possible to run one from a voltage source if it is matched to the particular LED, but only if the temperature does not change too much, because that can cause higher current draw too.

9. ### djsfantasi AAC Fanatic!

Apr 11, 2010
3,486
1,245
What's going to limit the current at 10 mA?

10. ### Externet AAC Fanatic!

Nov 29, 2005
955
93
The characteristics of the particular led.

For a certain forward current, there is a corresponding forward voltage.
And in reverse: for a certain voltage applied to the LED, a corresponding current will pass.
----> http://www.electronics-tutorials.ws/diode/diode12.gif
----> http://www.coilgun.info/levitation/images/led_conduction_curve.gif

Don't believe it? Try it.

LEDs are usually used with limiting resistors instead of being driven from a fixed voltage because the supply in the circuits they work in is almost never the LED's steady Vf.

In the following plot, if the particular LED is conducting 5 mA, its Vf is 1.8 V.
If 10 mA is desired, the Vf becomes 1.85 V; if 20 mA is desired, the Vf becomes 2.0 V.
----> http://www.nerdkits.com/media/forum-stuff/LTL-307EE-currentversusvoltage.png
And the reverse is also true: applying 1.8 V will conduct 5 mA; applying 2.0 V will conduct 20 mA.
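The quoted pairs can be treated as a small lookup table. Linear interpolation between them, as sketched below, is a rough simplification (a real curve is exponential-ish plus some series resistance), using only the numbers quoted above:

```python
# (Vf in volts, I in amps) pairs quoted above for this particular LED.
points = [(1.80, 0.005), (1.85, 0.010), (2.00, 0.020)]

def current_at(v):
    """Piecewise-linear estimate of the current at a given applied voltage."""
    for (v0, i0), (v1, i1) in zip(points, points[1:]):
        if v0 <= v <= v1:
            return i0 + (i1 - i0) * (v - v0) / (v1 - v0)
    raise ValueError("voltage outside the quoted range")

print(current_at(1.80) * 1000)  # 5.0 mA
print(current_at(2.00) * 1000)  # 20.0 mA
```

Note how steep the table is: 200 mV of voltage spans a 4x change in current.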

11. ### geratheg Thread Starter Member

Jul 11, 2014
107
3
Thanks for all the replies.

This is how I understand it:
An LED rated at 3.6V and 20mA will draw 20mA if supplied with 3.6V. Correct me if I am wrong.
However, as it heats up, the current draw may vary. And in the real world, you never get that exact voltage.

So a resistor basically acts as a "safety" device in that it limits the current draw of an LED to safe levels, so it does not burn out. In addition, it gives the LED flexibility in case of a varying voltage, and thus protects it.

Another thing I realized is that an LED isn't a resistor like a regular light bulb (which I assume is basically a resistor; correct me if I am wrong about light bulbs being resistors).
Instead an LED is a diode: diodes show an exponential-looking increase in current for a slight increase in voltage once they pass a certain point.

If I said anything wrong or slightly off in this post, please correct me. If all is right I think my question has been answered.

12. ### MikeML AAC Fanatic!

Oct 2, 2009
5,451
1,070
Not quite. If you buy 100 LEDs (same manufacturer and type), at exactly 3.6 V some in the lot may draw 1 mA and others may draw 100 mA and go Poof!

The batch of 100 is likely to be distributed along a bell curve, where most might draw closer to 20mA, but there will be outliers...
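A quick Monte Carlo sketch of that spread. The ±50 mV sigma on Vf and the 50 mV exponential slope are invented for illustration, not from any manufacturer's data:

```python
import math
import random

random.seed(1)
N_VT = 0.05                  # exponential slope n*Vt in volts (assumed)
I_NOM, VF_NOM = 0.020, 3.6   # nominal 20 mA at 3.6 V

# Shift each unit's curve by its Vf tolerance, then drive every unit
# from exactly 3.6 V and see what currents result.
currents = []
for _ in range(100):
    vf = random.gauss(VF_NOM, 0.05)                    # unit-to-unit spread
    currents.append(I_NOM * math.exp((VF_NOM - vf) / N_VT))

print(round(min(currents) * 1000, 1), round(max(currents) * 1000, 1))
```

Even this mild spread produces currents well below and well above the nominal 20 mA across the batch.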

13. ### to3metalcan Member

Jul 20, 2014
234
25
Your understanding of LEDs as diodes is right on, but lightbulb filaments are just a fudge more complicated: they ARE resistive, but they also CHANGE resistance depending on how much current is coming through them. As current goes up, resistance goes up too. If you measure the resistance of a 100 W lightbulb out of circuit, it's tiny. But if you connect it to 120 VAC in series with a current meter, you'd only see about an amp coming through it, not dozens or hundreds of amps as you might have expected.

This is why I can use a "lightbulb limiter" as a safety device in my repair work: if I connect an amplifier with a short circuit straight to wall voltage, it'll blow fuses, possibly trip the breaker, melt down whatever the shorted component was, etc. But if I connect it through a lightbulb fixture, the bulb turns on and STOPS the dangerously high current, as well as alerting me that there's a problem!
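A quick back-of-envelope check of those bulb numbers, using nothing beyond the standard P = V²/R relation and the 100 W / 120 V rating mentioned above:

```python
# Hot-filament resistance and steady-state current of a 100 W, 120 V bulb.
P, V = 100.0, 120.0
r_hot = V ** 2 / P   # 144 ohms when lit (cold resistance is far lower)
i_hot = V / r_hot    # ~0.83 A -- "about an amp", as noted above
print(r_hot, round(i_hot, 2))
```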

14. ### geratheg Thread Starter Member

Jul 11, 2014
107
3
Thanks! I'm assuming you mean 20mA of current produces a different forward voltage for each LED.

16. ### MikeML AAC Fanatic!

Oct 2, 2009
5,451
1,070
That is correct, but that is not what you postulated. You assumed that if you apply a fixed voltage of 3.600V, you can expect the LED to draw 20mA. I am telling you that you may get a wide range of currents, even in the same batch...

If you apply exactly 20.00mA (using a constant current supply) to a batch of LEDs from some manufacturer, you will likely see a range of forward voltages from 3.3 to 4V...

Which is the cause and which is the effect??? Chicken or the Egg.

17. ### geratheg Thread Starter Member

Jul 11, 2014
107
3
Yup, my statement was in reverse. Makes a lot more sense as to why it's a current driven device.

18. ### wayneh Expert

Sep 9, 2010
13,435
4,272
It's like feeding a fixed number of calories to different people. Some will be just right, some will lose weight and some will gain weight and maybe even go pop. You need to adjust calories, or voltage, to the individual person, or LED.

It's easy to adjust a constant voltage supply to a safe voltage (i.e. not toward the peak end of Vf) for an individual LED, and once you've set it, it'll be fine. Read the data sheets; the Vf vs. I plot is much steeper than the Vf vs. temperature chart. That means temperature change may cause the current and brightness to change a bit, but it is not going to carry you from the low end of Vf to the high end and cause a runaway condition. I agree that if you're near the high end already, you are at risk. But not if you're driving a 20mA LED at, say, ≤10mA.

The point is usually moot, though, because as you've noted, we don't have perfect regulation outside the lab. To design for robustness in the real world with real parts, control current.

19. ### PaulEE Member

Dec 23, 2011
423
32
I'm a little late in this conversation, but would like to add that the little LED keychains that companies give away as advertising swag are usually a button cell battery and a white LED...and no resistor. An LED will light up to some extent if you're *close* to its turn-on voltage; perhaps this is how they safely eliminate the need for the resistor. Or the resistor may even be integrated into the LED itself (aside from the LED's inherent impedance)...who knows.

So the answer to the OP's question is: it depends, at least in part, on...

supply voltage
LED forward voltage and current
LED impedance
Internal LED resistor?
maximum current out of supply

etc etc.

20. ### geratheg Thread Starter Member

Jul 11, 2014
107
3
I think it was discussed here that button cell batteries have a fairly high internal resistance.
