LED driver

Thread Starter

manuel09

Joined Aug 29, 2009
15
Sorry if this seems a stupid question, but why do most people use an LED driver?
I just tried it like this and there seems to be no problem. Each channel tested draws about the same current, roughly 750 mA, with the voltage stable at 22.5 V.

2 parallel strings x 6 AVAGO white LEDs per string, fed from a universal SMPS power supply at 23 VDC (adjustable 21-28 VDC) plus a 2.7 Ω 2 W resistor.
An SMPS PSU is much cheaper than a common LED driver. Does this have any bad effect?

In another case, with the same LEDs, I used a DC-DC LED driver per string after the SMPS, but saw no difference. Please advise me.
 

PentodePuppy

Joined Aug 19, 2009
12
...why do most people use an LED driver?
LEDs are current-operated devices: it's the current that needs to be regulated. This can be done with a simple resistor, a linear current regulator, or an elaborate switch-mode current regulator (e.g. PWM-based).

The resistor solution is cheap but has the drawback of either wasting power or not doing a good job of regulating the current.

An ideal current regulator has infinite output impedance and supplies whatever voltage is required to maintain the target current in the circuit. This can be approximated by using a large resistance value in series with the LED: the larger the value, the better the regulation. BUT the larger the value, the higher the supply voltage has to be to push the target current through the LED. And the larger the supply voltage, the larger the voltage across the resistor:
Vs = Vr + Vled
where Vs is the Supply Voltage, Vr is the voltage across the resistor, and Vled is the LED voltage
The larger the voltage across the resistor, the more power is wasted:
Pr = Vr * Iled
where Pr is the power dissipated by the resistor, Vr is the voltage across the resistor, and Iled is the LED current (the target current)
For LEDs spec'ed to run at 20 mA or less, this is usually of minimal consequence. BUT there is another consideration: the voltage across an LED can change. I've seen it change over time for LEDs that are constantly powered, I have seen it change with temperature, and LEDs with the same part number can have different drive voltages--especially the higher-frequency (shorter-wavelength) super-bright LEDs (e.g. green and above). This is why it is important to have adequate current regulation in an LED driver circuit.
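The two formulas above can be wrapped in a short script for sizing the resistor. This is just a sketch; the example values (5 V supply, 2.0 V red LED at 20 mA) are hypothetical, not from the thread:

```python
# Series-resistor sizing from Vs = Vr + Vled and Pr = Vr * Iled.

def series_resistor(vs, vled, iled):
    """Return (resistance in ohms, resistor dissipation in watts)."""
    vr = vs - vled      # voltage the resistor must drop
    r = vr / iled       # Ohm's law: R = Vr / I
    pr = vr * iled      # power wasted in the resistor
    return r, pr

# Hypothetical example: 5 V supply, 2.0 V red LED, 20 mA target current.
r, pr = series_resistor(5.0, 2.0, 0.020)
print(r, pr)  # 150 ohms, 0.06 W
```

At 20 mA the 60 mW lost in the resistor is negligible, which is PentodePuppy's point about small LEDs.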

It all comes down to design goals. If this is a circuit that is going to be mass produced, then LED drive-voltage variation becomes an issue and you will want a solution that isn't vulnerable to it. But if you are a hobbyist, this is a one-off kind of thing, it will run indoors without moderate-to-extreme temperature swings, parameter tolerance is not a concern, and power efficiency is not crucial (as it would be in a battery-powered design, or where extra heat is a problem), then you can probably get away with measuring the voltage across the LED and calculating the proper resistor for that particular LED.

Then there is the high-power LED (e.g. 1 W, 2 W, etc.). Driving these LEDs with a resistor, though possible, is extremely inefficient: to do an adequate job of regulating the current, the resistor must be fairly large in value and will therefore dissipate a great deal of power -- as much as, if not more than, the LED itself (but, again, this is influenced by design goals).
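The inefficiency claim is easy to quantify: with a resistor drop, the same current flows through LED and resistor, so the fraction of supply power that reaches the LED is roughly Vled/Vs. A quick sketch, using an assumed 3.6 V high-power LED on a 12 V rail:

```python
# Rough efficiency of a resistor-dropped LED: same current through both
# parts, so power splits in proportion to voltage drop.

def resistor_drive_efficiency(vs, vled):
    """Fraction of supply power delivered to the LED (rest heats the resistor)."""
    return vled / vs

# Assumed example: 3.6 V LED from a 12 V supply.
eta = resistor_drive_efficiency(12.0, 3.6)
print(round(eta, 2))  # 0.3 -> 70% of the power is burned in the resistor
```

That 70% loss is why resistor drive is rarely used for watt-class LEDs.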

It's possible to design a linear current regulator that does an excellent job of regulating the current with moderate power efficiency. BUT the best way to do this is with a switch-mode regulator (PWM with current sense and feedback for regulation). With a proper switch (i.e. a transistor), most of the power is delivered to the LED and very little is lost in the switch.

Another advantage of current regulation is that you can string LEDs in series (even ones of different colors) and drive them all with the same current regulator--as long as the regulator can supply a voltage large enough to drive them all [and remember to consider the maximum LED voltage specification, if that is an issue].

Good current regulation, however it is achieved, is the best way to deal with LED drive voltage variance. Switch mode solutions are the best way to achieve good current regulation PLUS excellent power efficiency.
 

Wendy

Joined Mar 24, 2008
23,415
Did you read the article? This was pretty much stated there.

A stable voltage supply is important to keep the current constant. Contrary to popular belief, you don't need a constant-current source, just a stable supply voltage and a properly sized resistor.

Buck pucks and their ilk are the best solution for high-power LEDs, mostly because they are converters: 9 V in at 350 mA can feed the LEDs with 700 mA. However, there are lots of cases where high-power LEDs are treated much like their low-power cousins and resistors are used.
 

electrotech

Joined Feb 26, 2010
19
You do need a constant-current power supply; otherwise, as the LEDs heat up, they will start to draw more and more current until they smoke. PentodePuppy is correct. If you don't believe me, hook up an ammeter in series with the LED from the driver and you will see the current draw climb for about an hour.
 

Wendy

Joined Mar 24, 2008
23,415
Uhh, no. Do the math. The only way an LED will draw more current is if its Vf drops, and a couple of tenths of a volt of change will not create a huge change in current. The resistors, not the LEDs, are what control the current, so thermal runaway is generally not possible.

Math is your friend here. The more voltage the resistor drops, the less effect the LED has.

Say you have 0.7 A from a 12 V source feeding a 3.6 V LED. This means the resistor is 12 Ω and the LED dissipates 2.52 W.

If the LED shorted, the current would be 1 A. But that is not what happens. If the LED's drop falls to 3.3 V, the current rises to 0.725 A and the LED runs at 2.39 W -- a reduction in dissipation and operating temperature, which is a negative feedback mechanism.

Where it can break down is if you run a string of 3, then the resistor has less effect.

In this case, 3 LEDs dropping 3.6 V each at 700 mA need a limiting resistor of 1.714 Ω (at such small values, the exact value becomes more important; if the resistor were 1.8 Ω, the current would be 666 mA). If the LEDs' Vf drops to 3.3 V, the current with the 1.714 Ω resistor climbs to about 1.23 A and the per-LED dissipation to about 4 W, which will indeed fry your expensive LEDs.
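The numbers above can be re-checked with a few lines of arithmetic (all figures are the ones Wendy uses: a 12 V source, 700 mA target, and a 3.6 V LED whose Vf sags to 3.3 V):

```python
# Sensitivity of a resistor-limited string to Vf change, per the example above.

def led_current(vs, vf_total, r):
    """Current through a series string: I = (Vs - total Vf) / R."""
    return (vs - vf_total) / r

vs = 12.0

# One 3.6 V LED at 0.7 A: R = (12 - 3.6) / 0.7
r1 = (vs - 3.6) / 0.7
print(round(r1, 1))                           # ~12.0 ohms
print(round(led_current(vs, 3.3, r1), 3))     # ~0.725 A if Vf sags to 3.3 V

# Three LEDs in series: R = (12 - 10.8) / 0.7
r3 = (vs - 3 * 3.6) / 0.7
print(round(r3, 3))                           # ~1.714 ohms
print(round(led_current(vs, 3 * 3.3, r3), 3)) # ~1.225 A -- runaway territory
```

With one LED the resistor absorbs most of the headroom, so a 0.3 V Vf shift moves the current only ~3.6%; with three LEDs the same shift moves it ~75%, which is the whole point of the example.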

It is how the resistors are designed in -- how they are used -- that makes the difference. If you are using resistors, you need to be aware of how to use them, and try not to lean on the LED's specs too much.

I've done a lot of constant-current designs over time; in most cases you can use a simple transistor or an LM317. I like buck pucks because they are much more efficient, so less wattage is wasted overall. A linear current-regulation component generally gets extremely hot, requiring a substantial heatsink. IMO it is almost a paradox that heatsinks are designed for transistor and IC cases, but not so much for resistors.

High-power LEDs also require good heatsinks; there's no getting around it. If you can pull the thermal energy away and keep them at a constant temperature, they will be much happier devices.

The reason I emphasize a stable power supply is that one of the main sources of power is not stable: a car's voltage can run from 11 V to 13.7 V, which I would not call stable, and that is one of the things you have to look at.

In many cases we are saying the same things, just coming at them from different angles. I dislike ruling resistors out altogether, though, since there are many cases where they will work just fine.

One of the constant-current sources I like is this one; look at how much wattage the resistors have to absorb vs. the transistors.



I should have put a 0.1 µF capacitor after the LM317, since it is acting as a voltage regulator in this configuration.
 

Bosparra

Joined Feb 17, 2010
79
My question is related to LED drivers, so I hope this is not a hijack.

How much current regulation can one practically expect from the circuit above, i.e. how much does the current drift between 13.7 V and 10 V?

The reason I ask is that I've been playing around with a circuit similar to the one above, except I used an N-channel 2N7000 MOSFET sinking two strings of 3 white LEDs to ground. Each string has a 100 Ω resistor in series with its 3 LEDs. The 2N7000 gives roughly 40 mA at Vgs = 4.2 V. To hold Vgs at 4.2 V, I used an ordinary diode and a 3.6 V Zener in series from the gate to ground.

Taking the supply down to 10 V drops the current to about 20 mA, and at 14 V it goes up to about 50 mA, with a visible difference in the LEDs' brightness. This does not strike me as a very good current regulator. Am I right?

To state my question differently: how much variation from 40 mA is acceptable, as the supply swings between 10 V and 14 V, for a decent current regulator?
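The drift just described can be put in numbers. Using the figures from the post (40 mA intended, 20 mA at 10 V, 50 mA at 14 V), the current spread relative to nominal is:

```python
# Quantifying the regulation of the 2N7000 sink described above.
# All three current figures come from the post itself.

i_nom = 0.040   # intended current, 40 mA
i_low = 0.020   # measured at 10 V in
i_high = 0.050  # measured at 14 V in

spread = (i_high - i_low) / i_nom
print(round(spread, 2))  # 0.75 -> a 75% swing around nominal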
 

retched

Joined Dec 5, 2009
5,207
Well, it is a hijack.
You will do better to start your own thread and link back to this one.

Do you have the datasheet for the regulator you are using?
 

Wendy

Joined Mar 24, 2008
23,415
LEDs are not very linear in their brightness, so you might see a light-level change, but you might not.

Figure worst case, 13.7V, and set the current to the max you want. Then do the math for the other end, 10V, and see where the current is.

If you have linked a bunch of LEDs together you have a problem, but if there is only one LED in the circuit the change will be very small. Since the current goes down, it will not put the LED at risk.

Math is your friend, so are schematics. A simple schematic will clear up a lot of misunderstandings, such as multiple LEDs in a chain.

The link I gave has a lot of answers to your questions.

LEDs, 555s, Flashers, and Light Chasers

LEDs are very simple devices. I'm seeing a lot of myths starting to develop, such as the constant-current source being absolutely necessary.
 

Thread Starter

manuel09

Joined Aug 29, 2009
15
I just tried an LM317 with a 24 VDC input and a 21.5-22 V output, and got a good result. Thank you all for the advice.
(240 Ω + 5 kΩ trimmer) feeding 6 LEDs directly; for now, without an additional resistor, it is stable at 490 mA. I would like to drive them at a maximum of 750 mA by changing the resistor value.
To prevent overheating I need to add temperature control; please advise a simple, precise scheme.
Assume the critical temperature is 50 °C and the ideal maximum is 35-38 °C.
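As an aside on hitting the 750 mA target: the LM317 can also be wired as a constant-current source (resistor between OUT and ADJ, load on the ADJ side), where the current is set by I = 1.25 V / R. This is a sketch of that math only, not the poster's circuit; check the datasheet for dissipation and dropout limits:

```python
# LM317 constant-current configuration: I = Vref / R, with Vref ~= 1.25 V.

V_REF = 1.25  # LM317 OUT-to-ADJ reference voltage, volts

def lm317_cc_resistor(i_target):
    """Program resistor value and its dissipation for a target current."""
    r = V_REF / i_target
    p = V_REF * i_target
    return r, p

r, p = lm317_cc_resistor(0.750)  # the 750 mA target mentioned above
print(round(r, 2), round(p, 2))  # ~1.67 ohms, ~0.94 W
```

Note that the regulator itself drops the rest of the headroom, so it still needs the heatsink discussed in the next post.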
 

Wendy

Joined Mar 24, 2008
23,415
Not a temperature controller -- a heatsink. The two are very different. If the heatsink by itself isn't cutting it, add a fan to cool the heatsink, which cools the device, whichever one you were talking about.
 