Understanding an LED driven by an LM317

Thread Starter

yehezkel2

Joined Oct 2, 2009
31
Hello all,

I have a 5W power LED.
I want to use the very common LM317 in current-limiting mode, with the well-known schematic:



(Let's suppose I always feed it from a stable 12v supply that can source 10A.)
As I understand it, I have to set R according to the LED's current (I know the formula),
and the input voltage of the LM317 can be 12v, 15v, or anything up to 37v.

Now, let's say I have two types of 5W LEDs:
1400ma @ 3.5v
700ma @ 7v

All I have to change in the schematic is the resistor, calculated for either 700ma or 1400ma.
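(For reference, the formula I mean is the standard LM317 current-regulator relation from the datasheet, R = 1.25v / I: roughly 1.8 ohm (about 0.9W) for 700ma, or 0.9 ohm (about 1.75W) for 1400ma.)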

My question is: how does the LM317 know the LED's voltage?

If I take a 5W LED (1400ma @ 3.5v) and a 10W LED (1400ma @ 7v),
the resistor setting is the SAME (both LEDs are 1400ma),
but suddenly the LED needs twice as much voltage!
The same applies if I add LEDs in series: the current stays the same but the voltage doubles...

Does the LM317 know that it has to double the voltage?

In other words, can the VOLTAGE rating of a power LED be ignored (as long as I supply a higher input voltage)?

Also, how can I be sure that the LM317 will deliver the highest possible voltage to the LED (within the safe margin)?
OK, it will deliver the 1400ma that the LED needs, but how can I be sure it will give the LED's full rated voltage (3.5v) and not 3.4v or a bit lower (making the LED not as bright as possible)?

Thanks a lot!
 

Wendy

Joined Mar 24, 2008
23,421
From my article, LEDs, 555s, Flashers, and Light Chasers

Chapter 2,



LEDs are current devices, not voltage devices. The LM317 in this configuration is a constant-current device: it will deliver the exact current the LED is rated for (with appropriate selection of the resistor). The LED's forward voltage drop is irrelevant as long as the power supply voltage exceeds the LED Vf by 3V or more.
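A rough check with the 12v supply mentioned in the first post: the 3.5v/1400ma LED leaves 8.5v of headroom and the 7v/700ma LED leaves 5v, so both are fine. The extra voltage ends up as heat in the LM317 (and the set resistor), roughly (12v - 3.5v - 1.25v) x 1.4A ≈ 10W for the 3.5v LED, so plan on a good heatsink.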
 

Thread Starter

yehezkel2

Joined Oct 2, 2009
31
Thank you, I understand.

Does this mean I can connect these two LEDs in parallel,
a 5W LED (1400ma @ 3.5v) and a 10W LED (1400ma @ 7v),
and set the LM317 to 2800ma?

That sounds a bit strange to me, like a missing parameter in the equation/formula...
 

Audioguru

Joined Dec 20, 2007
11,248
An LED sets its own voltage. You need to limit its current.

You should never connect LEDs in parallel because each one has a slightly different voltage.
If you connect a 3.5V LED in parallel with a 7V LED, the pair will sit at 3.5V: the 3.5V LED will light and the 7V LED will not. And since you set the current to twice the 3.5V LED's rating, it will instantly burn out; then the 7V LED will light for a moment and also instantly burn out.
 

Thread Starter

yehezkel2

Joined Oct 2, 2009
31
I understand that too.

Now a simpler question:

Two similar 1400ma @ 3.5v LEDs in SERIES
(which together are 1400ma @ 7v)
need the LM317 set to 1400ma, according to what you say...

And a series string of 3 LEDs, giving 1400ma @ 10.5v,
also needs it set to 1400ma?

I mean, after all, the LM317 has to put out some voltage,
so if I feed it with 30v and limit the current to the LED's rating, what VOLTAGE will it output to the LEDs? 30v? 30v - 3v?
Will the LEDs get more than 15v and still 'set their own voltage'?
 

Audioguru

Joined Dec 20, 2007
11,248
When you set the LM317 as a voltage regulator then its voltage is set but its current changes with load changes.

When you set the LM317 as a current regulator then its current is set but its voltage changes with load changes.

If your LEDs have a total voltage of 10.5V and are fed a regulated 1400mA from an LM317 current regulator with a 30V input then the LEDs limit their voltage to 10.5V and the LM317 limits the current to 1400mA.
Then the LM317 dissipates (30V - 10.5V) x 1400mA = 27.3W, which will cause it to overheat and shut down.
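For comparison, using the same rough arithmetic: if the same 10.5V string were fed from a 15V supply instead, the LM317 would only dissipate about (15V - 10.5V) x 1400mA ≈ 6.3W, which a modest heatsink can handle. Keeping the input voltage close to the total LED voltage (plus the roughly 3V the regulator needs) keeps the heat down.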
 

Thread Starter

yehezkel2

Joined Oct 2, 2009
31
OK, I got it.

My thinking was fixated on the voltage.
I thought I had to give the LEDs their rated voltage (and not more) and that they would draw the current they need...
(like a 100W light bulb will draw only 100W even if the power source is unlimited)

Thanks again!
 

Audioguru

Joined Dec 20, 2007
11,248
like a 100W light bulb will draw only 100W even if the power source is unlimited
Absolutely not!
Your understanding of voltage, current and power is wrong.

A 100W light bulb uses 100W only at its rated voltage.
If the voltage fed to a light bulb is increased by 10%, its power output and heating go up by about 20%.
If the voltage fed to a light bulb is doubled, its power output and heating increase about 4 times and it will blow up.

A light bulb is a resistor.
An LED is a diode, which behaves very differently from a resistor. If the voltage is increased by even a tiny amount, the current through a diode or LED rises extremely steeply. That is why we feed a set amount of current into an LED, not a set voltage.
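(For reference, those light-bulb figures follow roughly from P = V²/R with a fixed resistance: 1.1² ≈ 1.2, so about 20% more power, and 2² = 4, so four times the power. A diode's current, by contrast, rises roughly exponentially with voltage, which is why a few tenths of a volt too much can multiply the LED current many times over.)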
 

bountyhunter

Joined Sep 7, 2009
2,512
The LM317 needs a minimum of about 3V across it to maintain regulation. If you stack LEDs in series, make sure the supply voltage exceeds the total LED voltage by enough to let the LM317 work.
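For example, three 3.5v LEDs in series drop about 10.5v; add roughly 1.25v across the current-set resistor and about 3v across the LM317 itself, and the supply should be at least about 15v.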
 

Thread Starter

yehezkel2

Joined Oct 2, 2009
31
So,

until now I used a 3.3v fixed voltage regulator to power 3mm and 5mm LEDs (20ma each),
building large arrays of up to 50 LEDs (1000ma total), and the 3.3v regulator did a perfect job.

Was I wrong to use it? Should I have used current limiting instead?

Also, when I first tried a 3.3v regulator with a 5W LED, it didn't light at all,
even though it supplied 3.3v (the LED's rated voltage) at around 2A - why is that?
 

pandian

Joined Sep 27, 2009
33
Hi Audioguru, based on what you describe about LED characteristics, is it wrong to install LEDs in a car (with the high-current battery as the source) by just regulating the voltage with a 7805 and a zener? My question is based on the ideas I got from my previous post: http://forum.allaboutcircuits.com/showthread.php?t=28074 Thanks.
 

Audioguru

Joined Dec 20, 2007
11,248
If you use a 5V regulator for the LEDs then you will need many current-limiting resistors, one for each LED. But if you run LEDs in series directly from the car's supply - three 3.5V LEDs with a single current-limiting resistor, or five 2V LEDs with a single current-limiting resistor - you save a lot of resistors.
You will barely notice the LEDs dimming a little when the engine is turned off and the battery is not charging.

If you connect a 3.5V or 2V LED directly to the output of a 5V regulator (without a current-limiting resistor), it will instantly burn out.
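As a rough example (assuming ordinary 20ma indicator LEDs and about 14v with the engine running): three 3.5v LEDs in series drop about 10.5v, so the single resistor works out to roughly (14v - 10.5v) / 0.020A ≈ 175 ohm, dissipating only about 0.07W.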
 

Thread Starter

yehezkel2

Joined Oct 2, 2009
31
What about my idea of using a 3.3v voltage regulator without any current-limiting resistors at all? (See my questions above about the 50-LED arrays and the 5W LED that wouldn't light.)

 