resistor on led

Thread Starter

h2opolo

Joined Aug 2, 2013
38
From what I have read, I understand that it is important to use a resistor in series with an LED. The reason is that the LED does not have a linear relationship between voltage and current; without a resistor, the current can be too high and burn out the LED. LEDs have forward voltage ratings. I was wondering if you could tune your voltage supply to put out exactly that forward voltage and not need a resistor at all. The reason I ask is that I see people connecting high-powered LEDs to variable power supplies with no resistors, and it seems to work fine.

Is this a bad idea? If you eliminated the resistor on a high-powered RGB LED, would you have calibration problems when mixing the colors due to fluctuations in the current?

Thanks
 

blocco a spirale

Joined Jun 18, 2008
1,546
You must drive an LED from a constant-current source or some form of current-limited supply. There is no exact forward voltage for an LED, as it is somewhat temperature-dependent.

LEDs are all about current: the light output from an LED is proportional to its current, so trying to drive an LED from a fixed voltage is a pointless exercise; you would have no idea how hard the LED was being driven, if at all.
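As a rough illustration of how steep that curve is, here is a minimal Python sketch using the ideal (Shockley) diode equation with made-up parameters. The ideality factor, thermal voltage, and the 2 V / 20 mA operating point are all assumptions, and real LEDs also have some series resistance that softens this, but the point stands: a small shift in forward voltage produces a huge change in current.

```python
# Illustrative only: ideal diode model with assumed parameters, not a real part.
import math

N_VT = 2.0 * 0.025          # assumed ideality factor * thermal voltage (V)
I_RATED = 0.020             # assumed rated current, 20 mA
V_RATED = 2.0               # assumed forward voltage at the rated current

# Back out a saturation current that matches the assumed operating point.
I_S = I_RATED / math.exp(V_RATED / N_VT)

def led_current(v_f):
    """Approximate LED current (A) at forward voltage v_f (ideal diode model)."""
    return I_S * (math.exp(v_f / N_VT) - 1.0)

for v in (1.95, 2.00, 2.05, 2.10):
    print(f"Vf = {v:.2f} V -> {led_current(v) * 1e3:7.1f} mA")
# Output: roughly 7 mA, 20 mA, 54 mA, 148 mA -- a 0.1 V rise gives ~7x the current,
# which is why "dialing in the exact forward voltage" is so fragile.
```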
 

eetech00

Joined Jun 8, 2013
3,961
From what I have read, I understand that it is important to use a resistor in series with an LED. The reason is that the LED does not have a linear relationship between voltage and current; without a resistor, the current can be too high and burn out the LED.
Yes. The current through the LED must be limited in some way or the LED will burn out.

LEDs have forward voltage ratings. I was wondering if you could tune your voltage supply to put out exactly that forward voltage and not need a resistor at all. The reason I ask is that I see people connecting high-powered LEDs to variable power supplies with no resistors, and it seems to work fine.
The power supply probably has current limiting that keeps the LED current from exceeding its maximum forward current, or the LED module might have an internal current-limiting resistor.

Is this a bad idea? If you eliminated the resistor on a high-powered RGB LED, would you have calibration problems when mixing the colors due to fluctuations in the current?
Thanks
Not a good idea. RGB LEDs need different drive currents for the different colors, and red usually has a lower Vf at its rated If than green or blue for the same intensity.

eT
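
For reference, here is a hypothetical per-channel sizing sketch for a common-anode RGB LED driven through one series resistor per color. The 5 V supply, 20 mA target, and per-color forward voltages below are assumed typical values, not figures from this thread, so a real part's datasheet should be used instead.

```python
# Assumed values for illustration only -- check the datasheet for a real RGB LED.
SUPPLY_V = 5.0
TARGET_I = 0.020                                # 20 mA per channel (assumed)
VF = {"red": 2.0, "green": 3.2, "blue": 3.2}    # assumed forward voltages, V

for color, vf in VF.items():
    r = (SUPPLY_V - vf) / TARGET_I   # resistor value in ohms
    p = (SUPPLY_V - vf) * TARGET_I   # power dissipated in the resistor, watts
    print(f"{color:5s}: R = {r:5.0f} ohm, resistor dissipates {p * 1000:.0f} mW")
# red: 150 ohm (~60 mW); green/blue: 90 ohm (~36 mW) -- different Vf, different R.
```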
 

Thread Starter

h2opolo

Joined Aug 2, 2013
38
Thanks for all the replies. I looked up the voltage source and the high-powered LEDs that I saw people using. The LEDs had no internal circuitry that added resistance, and the supply had no adjustable current limiter. It does have a maximum current limit, which is much higher than what the LED is rated for, but at least it keeps the current from running away completely like you suggested it might. I think they just put together a circuit that happens to work but isn't optimal; the LED will probably burn out after a while because of the excessive current.

After reading what everyone has to say, it sounds like I need a resistor. I was planning on using three logic-level N-channel MOSFETs to regulate the brightness of an RGB LED with PWM. I want to power the RGB LED through a common anode and then select the proper resistors for the R, G, and B channels. It sounds like, if I want to power these LEDs, I need a voltage source that provides more than the forward voltage of the LED so that there is enough voltage drop left over for a resistor.

Is that correct? Also, if I am using a variable voltage source, I would prefer to keep the voltage as low as possible, but if I get too close to the forward voltage of the LED, the resistor value becomes very small and I am guessing the circuit wouldn't be as stable. Is there a way to optimize how much voltage I need and the size of the resistor?
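
As a back-of-envelope illustration of that trade-off (not from the thread), the sketch below assumes a single channel with a nominal Vf of 3.2 V, a 20 mA target, and a ±0.1 V spread in Vf, then sweeps the supply voltage: more headroom above Vf means more power wasted in the resistor, but a current that is far less sensitive to Vf variation.

```python
# Assumed numbers for illustration: one LED channel, Vf nominally 3.2 V,
# target 20 mA, and a +/-0.1 V spread in Vf from part-to-part or temperature.
VF_NOM = 3.2
VF_SPREAD = 0.1
TARGET_I = 0.020

print(" Vsupply | R (ohm) | resistor power | current swing for +/-0.1 V in Vf")
for v_supply in (3.4, 3.7, 4.0, 5.0, 9.0):
    r = (v_supply - VF_NOM) / TARGET_I            # resistor sized at nominal Vf
    p_r = (v_supply - VF_NOM) * TARGET_I          # power burned in the resistor
    i_lo = (v_supply - (VF_NOM + VF_SPREAD)) / r  # current if Vf comes out high
    i_hi = (v_supply - (VF_NOM - VF_SPREAD)) / r  # current if Vf comes out low
    print(f"  {v_supply:4.1f} V | {r:7.0f} | {p_r * 1000:8.0f} mW   | "
          f"{i_lo * 1000:5.1f} .. {i_hi * 1000:5.1f} mA")
# With only 0.2 V of headroom the current can swing from 10 mA to 30 mA;
# with a 9 V supply it stays near 19.7-20.3 mA, at the cost of ~116 mW in the resistor.
```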
 

eetech00

Joined Jun 8, 2013
3,961
After reading what everyone has to say, it sounds like I need a resistor. I was planning on using three logic-level N-channel MOSFETs to regulate the brightness of an RGB LED with PWM. I want to power the RGB LED through a common anode and then select the proper resistors for the R, G, and B channels. It sounds like, if I want to power these LEDs, I need a voltage source that provides more than the forward voltage of the LED so that there is enough voltage drop left over for a resistor.
You don't necessarily need a resistor, only a way to precisely control, or limit, the current through each LED. PWM drivers provide a constant current to the LED and control the brightness by varying the duty cycle.

Here's a link to some info:
http://www.st.com/st-web-ui/static/...ical/document/application_note/CD00157323.pdf

eT
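
As a minimal illustration of that duty-cycle idea (the 350 mA figure below is an assumed constant-current setting for a power LED, not a value taken from the ST application note):

```python
# With a constant-current PWM driver the LED sees a fixed peak current while
# it is on; perceived brightness scales with the average current via duty cycle.
PEAK_I = 0.350   # assumed 350 mA constant-current setting

for duty in (1.0, 0.75, 0.50, 0.25, 0.10):
    avg_i = PEAK_I * duty   # average current over one PWM period
    print(f"duty = {duty * 100:5.1f}% -> average current {avg_i * 1000:5.1f} mA")
# The color point stays stable because the peak current (the LED's operating
# point) never changes -- only the on/off ratio does.
```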
 

Thread Starter

h2opolo

Joined Aug 2, 2013
38
No resistors would make me happy. That is a very detailed link. Thank you. I may have to read it in its entirety before I buy any more parts.
 

k7elp60

Joined Nov 4, 2008
562
Here is a simple circuit that limits the current. With a 9 V battery you can put two, possibly three, LEDs in series and not need a resistor for the LEDs. IC1 is a three-terminal device that limits the current; an LM317LZ is a much smaller version and will handle about 100 mA. Most LED drivers have some way of limiting the current, so saying that an LED doesn't require a resistor can be incorrect.
Different-color LEDs require different currents to appear the same intensity to the eye. Getting all the colors to look equally bright can be complicated because of differences in eye sensitivity and in LED brightness versus current.

[Attached schematic: LED CURRENT.jpg]
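
For anyone reading along, here is a hedged sketch of how that LM317-style constant-current connection is usually sized (program resistor from OUT to ADJ, LED string from ADJ to ground, nominal 1.25 V reference); the currents below are illustrative.

```python
# LM317/LM317LZ constant-current connection: the regulator holds about 1.25 V
# across the resistor between OUT and ADJ, so that resistor sets the LED current.
V_REF = 1.25   # nominal LM317 reference voltage, V

def set_resistor(target_i):
    """Resistor (ohms) from OUT to ADJ for a target LED current (A)."""
    return V_REF / target_i

for i_ma in (10, 20, 50, 100):
    print(f"{i_ma:3d} mA -> R = {set_resistor(i_ma / 1000):5.1f} ohm")
# 20 mA needs about 62 ohm -- so the "no resistor on the LED" circuit still uses
# one resistor, just to program the current rather than to drop excess voltage.
```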
 

wobla

Joined May 24, 2015
1
Basic question from a newbie: I now understand the need for a series resistor with an LED, but does the series resistor itself waste energy in the form of heat, or is it an efficient part of the circuit?
 

Austin Clark

Joined Dec 28, 2011
412
Basic question from a newbie: I now understand the need for a series resistor with an LED, but does the series resistor itself waste energy in the form of heat, or is it an efficient part of the circuit?
The resistor will indeed dissipate energy as heat. The closer the input voltage is to the forward voltage of the LED, the less voltage the resistor has to drop, and thus the less power is wasted.

If half the input voltage is dropped by the resistor, half of the power used in the circuit is wasted as heat in the resistor. If a quarter of the input voltage is dropped by the resistor, a quarter of the power is wasted in the resistor, and so on.

For example, you could run a single LED from 100 V DC, but it would be very inefficient and the resistor would dissipate a lot of heat.
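
A quick arithmetic sketch of that point, with an assumed 2 V / 20 mA LED (illustrative numbers only):

```python
# The resistor's share of the power equals its share of the total voltage drop.
LED_VF = 2.0     # assumed forward voltage, V
LED_I = 0.020    # assumed operating current, A

for v_supply in (3.0, 5.0, 12.0, 100.0):
    p_led = LED_VF * LED_I                      # power delivered to the LED
    p_resistor = (v_supply - LED_VF) * LED_I    # power burned in the resistor
    efficiency = p_led / (p_led + p_resistor)
    print(f"{v_supply:6.1f} V supply: resistor burns {p_resistor:6.3f} W, "
          f"LED gets {p_led:.3f} W, efficiency {efficiency * 100:4.1f}%")
# At 100 V the resistor dissipates about 1.96 W to deliver 0.04 W to the LED
# (~2% efficient) -- the "very inefficient" case mentioned above.
```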
 