Ohm's Law - why an LED always requires a resistor

Thread Starter

Kelvin Lee

Joined Oct 22, 2018
111
Dear Sir/Madam,

I want to have a very simple circuit, one 5050 3V SMD LED and one CR2032 battery cell, according to the datasheet

5050 LED http://ncled.koreasme.com/eng/download/solleds/5050-3Chip-specification_WHITE(6500K).pdf
Vf: 3.0 ~ 3.4V (0.1V Sorting)
If: 60mA

CR2032 https://html.alldatasheet.com/html-pdf/95177/SONY/CR2032/80/1/CR2032.html
Nominal Voltage 3V

I want to connect the LED and battery in series, but an experienced engineer tells me to add a 1 Ohm resistor between the LED and the battery to protect the LED by limiting the current passing through it. According to Ohm's Law, R = (Vs - Vled) / I = (3 - 3) / 0.06 = 0 Ω, so what is the theory behind adding the 1 Ohm resistor?

Best regards,

Kelvin.
 

SamR

Joined Mar 19, 2019
5,040
The idea is to limit the LED to its rated current or less. A 1 Ohm resistor won't do that, depending on the circuit. I have seen it proposed that the resistor is like buying insurance: you don't have to buy it. I do.
 

rsjsouza

Joined Apr 21, 2014
383
As others have said, the 1 Ω resistor will not do much other than dissipate heat. Its current-limiting action would be far too small to make any difference.

From another angle, using a CR2032 to power such an LED gives it very little useful life. The LED datasheet specifies a nominal current of 60 mA @ 3 V, while a typical Energizer CR2032 datasheet states a capacity of 235 mAh measured at a current roughly 300 times smaller. The lifespan of the CR2032 will be quite short, and at this current level its internal resistance may pull the output voltage well below 3 V.
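
To put rough numbers on that (a back-of-the-envelope sketch in Python; the 235 mAh figure is the Energizer rating mentioned above, while the ~15 Ω internal resistance is an assumed typical value for a fresh CR2032, not something from either datasheet):

Code:
# Rough CR2032-vs-60 mA-LED arithmetic (illustrative only)
capacity_mah = 235        # rated at ~0.2 mA drain; usable capacity at 60 mA is far lower
led_current_ma = 60       # nominal forward current from the 5050 LED datasheet

# Optimistic upper bound on runtime, ignoring capacity derating at high drain
print(f"best-case runtime: {capacity_mah / led_current_ma:.1f} h")    # ~3.9 h

# Assumed internal resistance of a fresh CR2032 (typically tens of ohms)
r_internal_ohm = 15
sag_v = (led_current_ma / 1000) * r_internal_ohm
print(f"voltage lost inside the cell at 60 mA: {sag_v:.2f} V")        # ~0.9 V

So even before the chemistry gives up, the cell's own resistance is already acting as a (rather large) series resistor.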

If this is just an experiment, then all these considerations can be thrown out. If it is for a product, a more powerful source would be needed (or a different LED).
 

SLK001

Joined Nov 29, 2011
1,549
An LED doesn't need a resistor in series IF you put the correct Vf across it. It is done like this all the time. An LED is NOT a bottomless current sink that needs "protection".
 

BobTPH

Joined Jun 5, 2013
8,962
Okay, so what is the “correct” Vf when the datasheet gives a range of, say, 3.0 to 3.5V?

And what happens when you supply the Vf to get the max rated current, and then the LED heats up?

Bob
 

WBahn

Joined Mar 31, 2012
30,058
Kelvin Lee said:
"...According to Ohm's Law, R = (Vs - Vled) / I = (3 - 3) / 0.06 = 0, so what is the theory behind adding the 1 Ohm resistor?"
IF you have an ideal 3 V voltage source (and a CR2032 cell is FAR, FAR from an ideal voltage source), then when you connect it to an LED the current that will flow will be whatever current is consistent with having 3 V across it. At low currents, an LED looks similar to an ideal diode and has an exponential voltage-current characteristic, so small increases in voltage result in large increases in current. For higher currents there are other effects that start to dominate and the characteristic for most LEDs flattens out, at least to some degree, and starts looking more linear (like a resistor).

So the more ideal the supply looks and the more ideal the diode, the more sensitive the current will be to small differences in the voltage. In an ideal diode near room temperature, the current increases (or decreases) by a factor of ten for every roughly 60 mV increase (or decrease) in voltage. So if you increase the voltage by just 240 mV (again, on an ideal diode), the current will increase by a factor of 10,000. If the voltage was low enough originally that the current was only 1 µA, then now it will be 10 mA and things will probably be about right. But increase it by another quarter of a volt and you will be up in the 100 A range. Unless the diode is intended to handle those kinds of currents, it will burn out. For most LEDs (because they are generally pretty poor diodes, since that's not what we use them for) the increase is a lot tamer, but you can still easily burn them out if you don't limit the current.

Which is what the resistor does.
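
A rough numeric sketch of the above (assuming an idealized exponential diode with a 60 mV/decade slope; the saturation current and the 2.70 V / 5 V / 150 Ω figures are made-up illustration values, not taken from the 5050 datasheet):

Code:
import math

# Idealized exponential diode: I = Is * exp(Vd / VSLOPE), with VSLOPE chosen so the
# current rises one decade per 60 mV, as described above.
VSLOPE = 0.060 / math.log(10)            # ~26 mV
IS = 1e-6 * math.exp(-2.70 / VSLOPE)     # made-up Is so that Vd = 2.70 V gives 1 uA

def diode_current(vd):
    return IS * math.exp(vd / VSLOPE)

print(diode_current(2.70))   # ~1e-6 A (1 uA)
print(diode_current(2.94))   # +240 mV -> ~1e-2 A (a factor of 10,000)
print(diode_current(3.18))   # another +240 mV -> ~100 A (would destroy a real LED)

# With a series resistor the supply also has to push the current through R,
# so the same voltage swing produces a far smaller current swing.
def current_with_resistor(vs, r, lo=0.0, hi=4.0):
    # Bisection on Vd: solve vs = Vd + diode_current(Vd) * r
    for _ in range(100):
        mid = (lo + hi) / 2
        if mid + diode_current(mid) * r > vs:
            hi = mid
        else:
            lo = mid
    return diode_current((lo + hi) / 2)

print(current_with_resistor(5.0, 150))   # ~14 mA
print(current_with_resistor(5.5, 150))   # ~17 mA: +0.5 V changes the current by ~25%,
                                         # not by orders of magnitude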
 

crutschow

Joined Mar 14, 2008
34,442
SLK001 said: "An LED doesn't need a resistor in series IF you put the correct Vf across it."
True in theory but not in practice.
The "correct" voltage depends upon the particular LED you have and the ambient temperature.
The current through an LED is very sensitive to the voltage (it is a diode, after all), so normally a series resistor is used to set the current, or a constant-current circuit to make the current independent of the supply voltage.
SLK001 said: "It is done like this all the time."
Only if the supply already has a significant internal impedance, such as a coin cell.
SLK001 said: "An LED is NOT a bottomless current sink that needs 'protection'."
It may not be "bottomless", but it will certainly draw a damaging amount of current if the applied voltage is just a little high and there is no other series impedance.
In most applications it definitely needs protection.
 
Last edited:

dendad

Joined Feb 20, 2016
4,476
It is quite unwise to run an LED without a current limiting resistor.
Yes, under carefully controlled conditions you may be able to do it, but why would you be so silly just to save a resistor that costs so little?
This question keeps coming up.

An LED is a current-operated device, not a voltage-operated one. The quoted voltage is the typical forward drop across it when it is operating, not the voltage you should run it from.
Pay attention to the CURRENT and control that with an appropriate resistor (a rough sizing sketch follows below).
Those cheap toys that just have a coin cell and no resistor rely on the internal resistance of the battery, so in effect they still have a series resistor.
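
A minimal sizing sketch of that rule (the 5 V supply and 20 mA target are made-up example numbers; the 3.0 V forward drop is just the low end of the range quoted in the first post):

Code:
def series_resistor(v_supply, v_forward, i_led):
    """Ohm's law applied to the voltage left over after the LED drop."""
    return (v_supply - v_forward) / i_led

# Example: 5 V supply, Vf = 3.0 V, target 20 mA
r = series_resistor(5.0, 3.0, 0.020)
print(f"R = {r:.0f} ohm")                   # 100 ohm (use the nearest standard value)
print(f"P = {0.020**2 * r * 1000:.0f} mW")  # ~40 mW dissipated in the resistor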

If you want to see how sensitive the LED is, connect it directly without a resistor to a variable power supply and look at the current as you slowly increase the voltage.
Wear safety specs!
 

ci139

Joined Jul 11, 2016
1,898
One option (attachment: Dummy-LED-s_22.png) still requires a current limiter, or a resistor that takes up a voltage drop matched to the specific LED. It needs no extensive output filtering, but without it the LED would draw more power at output peaks. There is also an overkill op-amp-regulated version (attachment: Dummy-LED-s_21.png); it can be made quieter, but it was put together quickly just to illustrate a constant 8 mA output.

dendad said: "Wear safety specs!"
...or use an appropriate (adjustable) fold-back supply, or pre-limit the current to the LED's maximum continuous forward current,
...or use a special trim-pot (Fig. 1) or make one yourself (Fig. 2).
 
Last edited:

WBahn

Joined Mar 31, 2012
30,058
dendad said: "It is quite unwise to run an LED without a current limiting resistor. Yes, under carefully controlled conditions you may be able to do it, but why would you be so silly just to save a resistor that costs so little?"
A few reasons are possible.
1) You don't have the physical space available.
2) You don't have enough voltage overhead.
3) You are making millions and millions of units.

Hobbyists and such seldom have a viable reason for not using a proper current-limiting resistor (or other current-limiting technique). Those who do have a viable reason should only do it as the result of a proper design analysis.
 

MrChips

Joined Oct 2, 2009
30,807
You don't need a resistor.

If the LED draws 20 mA @ 3 V, for example, and your battery puts out exactly 3 V, then the LED will draw 20 mA.

However, if I were at my computer I would show you graphically why this is a very bad idea. Moreover, as others have pointed out, a 1Ω resistor is a terrible idea.

To put it in words, you need a realistic approximation of a current source.
An ideal current source behaves like a very large voltage behind a very large resistance. Now extrapolate that back to a realistic solution: you need a supply voltage well above the LED voltage and a series resistance chosen to give the proper LED current.
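
A rough comparison of what that buys you (the 3.3 V / 9 V supplies, the resistor values, and the 3.0-3.2 V spread in Vf are made-up illustration numbers, not a specific design):

Code:
def led_current(v_supply, r_series, v_forward):
    # Simple estimate treating the LED drop as fixed at v_forward
    return (v_supply - v_forward) / r_series

# Supply barely above the LED, tiny resistor: a 0.2 V shift in Vf triples the current
for vf in (3.0, 3.2):
    print("3.3 V supply, 5 ohm:", 1000 * led_current(3.3, 5, vf), "mA")    # 60 vs 20 mA

# Supply much higher than the LED, resistor sized for ~20 mA: the same shift barely matters
for vf in (3.0, 3.2):
    print("9 V supply, 290 ohm:", 1000 * led_current(9.0, 290, vf), "mA")  # ~20.7 vs ~20.0 mA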
 