Using an LED without a series resistor, powered by a 555

Thread Starter

Goxeman

Joined Feb 28, 2017
171
Hello,

My question is simple. I am trying to simplify a circuit, so I am trying to reduce the components used.

If I power a 555 IC from 5 VDC, the output sits about 1.4 V below VCC; I took actual measurements and the output is 3.58 VDC at most. If I use a blue LED that normally needs 3.5-3.6 V, would you leave out the LED series resistor? Even if I used one, it would only be 1 to 5 ohms.

Do you know if the 555 IC has any internal resistance?

I want to use 4-5 LEDs, so that would mean saving 4-5 resistors.
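A quick numeric sketch of what those figures imply, assuming the measured 3.58 V output and a 3.5 V forward drop from the post above (illustrative values only, not measurements on a real LED):

Code:
# Back-of-envelope check: how much current does a 1-5 ohm resistor
# actually allow with only ~0.08 V of headroom?
v_out = 3.58      # measured 555 output high level, volts
v_f = 3.5         # assumed blue LED forward voltage, volts
for r in (1, 5):  # candidate series resistors, ohms
    i_ma = (v_out - v_f) / r * 1000
    print(f"R = {r} ohm -> I = {i_ma:.0f} mA")   # 80 mA and 16 mA

With so little headroom the current swings wildly with the resistor value, and with any shift in Vf or output level, which is what the replies below get into.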
 

dendad

Joined Feb 20, 2016
4,476
Although you may get away with no resistor (some cheap devices do), it is never a good idea to omit the resistor.
LEDs are current-operated devices, not voltage-operated ones.
They just happen to have a typical voltage drop when operating.
Resistors are cheap. Well worth including.
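For reference, the usual sizing rule is to drop the excess supply voltage across the resistor at the target LED current. A minimal sketch with assumed example values (not the OP's blue LED):

Code:
# Standard series-resistor sizing: R = (Vsupply - Vf) / Iled
v_supply = 5.0    # supply or driver output voltage, volts (assumed)
v_f = 2.0         # LED forward voltage, volts (e.g. a typical red LED)
i_led = 0.020     # target LED current, amps (20 mA)
r = (v_supply - v_f) / i_led
print(f"Series resistor ~ {r:.0f} ohms")   # ~150 ohms here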
 

Bernard

Joined Aug 7, 2008
5,784
Not recommended on AAC, but it will work. Early LED flashlights used three 1.5 V AA batteries (4.5 V fresh) with a white LED and no series resistor. This assumes it is not a life-critical application. Resistors are cheap; a 555 is about US $1.00. A well-regulated supply could pose a problem, since the flashlight relied on the battery's internal resistance to limit the current. Let's see other comments.
 

Thread Starter

Goxeman

Joined Feb 28, 2017
171
I understand; it is not about the price. It is about the space.

My question is more focused on the 555 output. Would it protect the LED somehow? A resistor here would have almost no resistance anyway :rolleyes:
 

dl324

Joined Mar 30, 2015
16,914
It is about the space
How much space is available? You could stand the resistors up, use a SIP resistor network, or surface mount resistors.

You could put LEDs of the same color in series as long as the total forward voltage is low enough. If you don't care about relative brightness, you can mix colors in the same string.
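A minimal fit-check along those lines, with assumed forward voltages (the supply figure is just an example, not the OP's 555 output):

Code:
# A series LED string fits if its total Vf plus some resistor headroom
# stays under the available supply. Vf values below are assumptions.
v_available = 5.0          # example supply, volts
headroom = 1.0             # volts reserved for the series resistor
string_vf = [2.0, 2.0]     # e.g. two red LEDs in series, assumed Vf each
total_vf = sum(string_vf)
if total_vf + headroom <= v_available:
    print(f"OK: {total_vf:.1f} V string, {v_available - total_vf:.1f} V across the resistor")
else:
    print("String Vf too high for this supply")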
 

BobTPH

Joined Jun 5, 2013
8,952
My question is more focused on the 555 output. Would it protect the LED somehow?
No. A 555 can source or sink around 200 mA, roughly ten times the current of a typical 20 mA LED, so its output won't limit the current for you.

Bob
 

BobTPH

Joined Jun 5, 2013
8,952
Being close to Vf of the LED doesn't avoid the need for a resistor. The resistor is for current limiting.
And it can only limit the current if there is some headroom between the supply voltage and the Vf.

The problems with driving an LED with a voltage are:

1: The forward voltage is not well controlled. Multiple LEDs of the same type may have Vf varying by hundreds of millivolts at the rated current.

2: The Vf of an LED decreases with an increase in temperature. This can lead to thermal runaway. The LED heats up because it is dissipating power. The Vf drops, leading to more heating. The end result can be destruction of the LED.

A resistor counters both of these problems by providing negative feedback. An increase in current causes a decrease in voltage to the LED because more voltage is dropped across the resistor.

So, back to the headroom. This negative feedback depends on the fact that a significant fraction of the voltage is dropped across the resistor. You need at least 1V across the resistor to achieve the needed feedback.

In other words, take an LED with a Vf of 3 V at 20 mA on a 3.3 V supply: the resistor works out to 15 ohms, and that isn't going to cut it. Suppose your LED is really 2.8 V, which is within the range you might find for an LED with a typical Vf of 3 V. Now the 15 ohm resistor has to drop 0.5 V, which it will do at 33 mA.
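A quick script reproducing those figures, using the same assumed values:

Code:
# Nominal 3 V / 20 mA LED on a 3.3 V supply.
v_supply = 3.3
r = (v_supply - 3.0) / 0.020       # resistor sized for the nominal LED
print(f"R = {r:.0f} ohms")         # 15 ohms
# The same resistor with a unit whose Vf is really 2.8 V:
i_ma = (v_supply - 2.8) / r * 1000
print(f"I = {i_ma:.0f} mA")        # ~33 mA instead of the intended 20 mA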

Bob
 

Thread Starter

Goxeman

Joined Feb 28, 2017
171
And it can only limit the current if there is some headroom between the supply voltage and the Vf.
Thank you for your accurate answer. It is very interesting; I didn't know about the headroom that is needed. It makes sense now that I have read your explanation.

Thank you for sharing :)
 

eetech00

Joined Jun 8, 2013
3,946
I want to use 4-5 LEDs, so that would mean saving 4-5 resistors.
If the LEDs are all the same type and color, you could use an external current limiter and wire the 5 LEDs in parallel. There would still be some variation in Vf/If, but it would work. The current limiter would add two resistors, two 2N3904s, and one 2N7000. I haven't tried this on the bench, but it works with an NE555 in simulation.
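One common way such a limiter is sized, assuming the usual sense-resistor-plus-BJT clamp arrangement (which may not match eetech00's exact circuit), is to pick the sense resistor so it drops about one Vbe at the desired limit:

Code:
# Generic BJT-clamped current limiter: the clamp transistor turns on when
# the sense resistor drops roughly one Vbe (~0.6 V). Assumed arrangement,
# not necessarily eetech00's exact circuit.
v_be = 0.6         # approximate base-emitter turn-on voltage, volts
i_limit = 0.100    # example limit: five LEDs in parallel at ~20 mA each
r_sense = v_be / i_limit
print(f"Sense resistor ~ {r_sense:.0f} ohms")   # ~6 ohms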
 