Overdriving power LEDs

Thread Starter

takao21203

Joined Apr 28, 2012
3,702
I have done an experiment with a 10W red power LED (660nm) which I have partially cut open. Only one of the LED strings still works!

I use a 12V electronic transformer. Current is 0.5A, so a 3W string gets 6W!
It is mounted on a large heatsink which gets fairly hot, about 65°C. No fan.

After 24 hours, no brightness deterioration.

Connecting a regular 10W LED causes the voltage to drop slightly, so the current is only 0.7A.

I have taken apart the 20W red LED lamps because they develop too much heat. Each LED now has its own heat sink. I use VGA coolers now, which are better for cooling! Soon I will build voltage regulators for them.

And I want to overdrive them a little, since these 660nm LEDs are not very efficient. It will be a kind of experiment to see if these LEDs can tolerate overdriving.

As far as I understand, the key seems to be good cooling!

All the other LEDs remain in stable condition.
 

Attachments

GopherT

Joined Nov 23, 2012
8,009
Heat is an issue. The higher the temperature, the harder it is to get the heat away, and so the LED die, the little chip in the component, will generate heat faster than it can dissipate it; you get thermal runaway and your LED is fried.

In summary, no, a few extra milliamperes will not immediately kill your LED, but it is hard to know exactly when too much is too much. And when you finally find that point, duplicating it will be difficult because your heat sink will not be exactly the same.
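As a rough illustration of why the exact limit is hard to pin down, here is a back-of-the-envelope junction temperature estimate; every number in it is an assumed example value, not something measured in this thread:

```python
# Back-of-the-envelope junction temperature estimate for an overdriven LED string.
# All values are example assumptions, not measurements from this thread.
P_LED = 6.0       # W of electrical power into the string (most of it becomes heat)
T_AMBIENT = 25.0  # deg C
R_JC = 2.0        # C/W, junction-to-case thermal resistance (assumed)
R_CS = 1.0        # C/W, case-to-heatsink, through the thermal glue (assumed)
R_SA = 6.0        # C/W, heatsink-to-ambient for a passive cooler (assumed)

t_junction = T_AMBIENT + P_LED * (R_JC + R_CS + R_SA)
print(f"Estimated junction temperature: {t_junction:.0f} C")  # 25 + 6 * 9 = 79 C
```

Change any one of those thermal resistances and the die temperature for the same drive current moves, which is exactly why a limit found on one heat sink does not transfer to another.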
 

Thread Starter

takao21203

Joined Apr 28, 2012
3,702
No, of course duplicating is not easy. Each LED is thermally different. Each heatsink is different. I measured VGA coolers a while ago (not with LEDs); they can carry away up to 50 watts (the small cheap ones).

The black cooler I use here is just passive without fan.

And it is really a full 200% of rated current. I will try the same on full LEDs soon, test for some months, and watch for any deterioration.

3W LEDs are also interesting as voltage droppers. I use one for a 12V fan, and it gives exactly the right amount of voltage drop! And since today, I use 2x 3W red LEDs to drop the voltage for a blue LED lamp. Before that, I wasted the power with 8 diodes.
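Roughly, the drop you get this way looks like the sketch below; the per-LED forward voltage is an assumed typical figure for 660nm parts, not something I measured:

```python
# Series voltage drop when using red LEDs as droppers.
# The per-LED forward voltage is an assumed typical value, not a measurement.
VF_RED_660NM = 2.2   # V per LED at around the operating current (assumed)
N_LEDS = 2           # two 3W red LEDs in series

total_drop = N_LEDS * VF_RED_660NM
print(f"Approximate series drop: {total_drop:.1f} V")  # about 4.4 V
```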
 

thatoneguy

Joined Feb 19, 2009
6,359
Careful when doing this with white LEDs. They are essentially blue or near-UV LEDs with a phosphor coating to emit "warm white" or "cool white" depending on the mix.

If the phosphors are severely over-excited, they stop fluorescing, and the bright white turns into a dim bluish LED.
 

Thread Starter

takao21203

Joined Apr 28, 2012
3,702
Ah, this is interesting. This explains why the 30W LED only increased a little in brightness when I ran it at 2.28 amps for a second. Nowhere near double the brightness.

But they are very bright already. Only the 660nm ones are not very efficient.

I have also seen a chart showing that beyond 600nm the eye's sensitivity drops remarkably.
So maybe they would appear 3x or 4x brighter if they were regular red.

I did not know that before; it's true I knew white LEDs use phosphors, but I did not know the phosphor has a limit.

Actually I mounted 2x red LEDs on VGA coolers, as well as one LM2576 IC. I want to wire them up later today and see how much I can increase the current. I will maybe run the LEDs at 1.5 amps; 2 amps is a little too risky. I will try anyway and see if it makes a reasonable gain in brightness.

And the LEDs I have here are pure white, which has more red than cool white but looks better than warm white. I have 3 of these running now at 32 volts, which is just a bit below the voltage for 1 amp of current. The white LEDs are not so much a problem, but the blue lamp is pretty much at the limit.

It is still less than 100°C, but I guess 80°C at least; I don't want to run them at a higher temperature.

The red lamps got very hot, since I had them directly on DC, and maybe the grid voltage was a few percent higher. The silicone discolored!

Anyway, the silicone is still fine, with full adhesion, which I observed when I removed one from each of the sinks.

But now they also get ICs to regulate the voltage.

At one time the current was only 0.7A, and then it must have exceeded that, because the silicone discolored.

Thermal runaway is only a problem for the blue LEDs. Since I now have 2x 3W red LEDs in series (which have a good cooler), it is no longer a problem. These LEDs are actually very, very bright, a pain to look at directly.
 

Thread Starter

takao21203

Joined Apr 28, 2012
3,702
Here are my research subjects :)
Note that the silicone on the other lamps did not discolor at all.

And it's not regular so-called silicone. I call it silicone because it is very likely silicone-based. So far it is doing its task very well, still with full adhesion even on the overheated lamps.

It can also be removed with some force, but the bond is strong enough that it won't come off too easily. Just right.

Imagine how easily semiconductors can be cooled: just glue them to a VGA cooler. No screws or holes needed at all! And VGA coolers only cost a few dollars and have an effective fan integrated.
 

Attachments

vrainom

Joined Sep 8, 2011
126
This explains why the 30W LED only increased a little in brightness when I ran it at 2.28 amps for a second. Nowhere near double the brightness.
Also, our eyes don't perceive light increments linearly but logarithmically, so in order to see your light source as twice as bright, it would need to have four times its luminous output. A better measurement would be with a lux meter.
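A minimal sketch of that rule of thumb, assuming the square-root relationship implied by the "four times" figure (the exact exponent depends on which vision model you use):

```python
# Rule-of-thumb relation between measured and perceived brightness.
# Assumes the square-root relationship implied by the "four times" figure;
# real vision models use exponents in a similar range.
def perceived(relative_output):
    return relative_output ** 0.5

for factor in (1, 2, 4, 8):
    print(f"{factor}x luminous output -> {perceived(factor):.2f}x perceived")
# 4x output -> 2.00x perceived, i.e. it only looks twice as bright
```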
 

Thread Starter

takao21203

Joined Apr 28, 2012
3,702
Yes, I was considering getting a lux meter eventually.

Here is an LM2576 regulator I built to drive 2x red LEDs.

Input voltage is 32V and output is about 24V.
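For the adjustable LM2576 the output is set by its feedback divider, Vout = 1.23V * (1 + R2/R1). A quick check with assumed resistor values (not the ones actually on this board):

```python
# Feedback divider check for an LM2576-ADJ set to roughly 24V output.
# Resistor values are assumed for illustration, not read off the actual board.
V_REF = 1.23     # V, LM2576-ADJ feedback reference
R1 = 1_000.0     # ohms, feedback pin to ground (assumed)
R2 = 18_000.0    # ohms, output to feedback pin (assumed)

v_out = V_REF * (1 + R2 / R1)
print(f"Output voltage: {v_out:.1f} V")  # about 23.4 V
```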
 

Attachments

Thread Starter

takao21203

Joined Apr 28, 2012
3,702
A voltage regulator.

I was thinking about overcurrent shutdown.
All that's needed is an op amp, a shunt resistor, and the shutdown (ON/OFF) pin of the LM2576.
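A rough sizing sketch for that idea; the shunt value and trip current below are assumptions for the example, not final choices:

```python
# Sizing sketch for the overcurrent shutdown: a comparator watches the shunt
# voltage and drives the LM2576 ON/OFF pin when the trip current is exceeded.
# The shunt value and trip current are example assumptions.
I_TRIP = 1.2    # A, current at which to shut down (assumed)
R_SHUNT = 0.1   # ohms, shunt resistor in series with the LEDs (assumed)

v_trip = I_TRIP * R_SHUNT        # comparator threshold voltage
p_shunt = I_TRIP**2 * R_SHUNT    # shunt dissipation at the trip point
print(f"Comparator threshold: {v_trip * 1000:.0f} mV")  # 120 mV
print(f"Shunt dissipation at trip: {p_shunt:.2f} W")     # about 0.14 W
```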

But if I wanted to sell it or use it professionally, I'd rather monitor the temperature with a controller; I could use one controller for many temperature sensors.

It's also a bit difficult to adjust; the useful range is only about 1 volt or so, a little more if 3 LEDs are in series. It's not really guaranteed that the cheap adjustable resistor will stay that way over the years :)

What I want to research is the silicone stuff, whether it will stay durable over time.

And as in the thread title, for the first time I will overdrive the LEDs on purpose.

I had success with 3W/6W (one string from a 10W LED), but it is unknown if it will work for a full LED.

My idea is that the temperature is the only limiting factor (and the phosphor).
 

Audioguru

Joined Dec 20, 2007
11,248
LEDs are supposed to be powered by a current source, not a voltage source.

LEDs set their own voltage, which decreases as they heat up, which causes their current to increase when powered from a voltage source. Higher temperature causes higher current, which causes higher temperature, which causes higher current, which causes ...
It is called "thermal runaway".
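A toy model of that feedback loop, with assumed values for the tempco, series resistance, and thermal resistance (real parts will differ):

```python
# Toy model of thermal runaway with a stiff voltage source.
# Vf falls as the die heats, so current and heating keep increasing.
# All parameter values are assumed for illustration only.
V_SUPPLY = 12.4   # V, fixed supply across the string (assumed)
R_SERIES = 0.5    # ohms, wiring resistance in the loop (assumed, small)
VF_25C = 12.0     # V, string forward voltage at 25 C (assumed)
TEMPCO = -0.012   # V/C, forward voltage tempco of the string (assumed)
R_TH = 8.0        # C/W, junction-to-ambient thermal resistance (assumed)
T_AMB = 25.0      # C

t_junction = T_AMB
for step in range(4):
    vf = VF_25C + TEMPCO * (t_junction - 25.0)
    current = max(0.0, (V_SUPPLY - vf) / R_SERIES)
    t_junction = T_AMB + current * vf * R_TH
    print(f"step {step}: Vf={vf:.2f} V  I={current:.2f} A  Tj={t_junction:.0f} C")
# With voltage drive the current climbs each step and the numbers quickly blow
# past anything a real part survives, which is the point; a current source
# would hold I fixed and break the loop.
```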
 

Markd77

Joined Sep 7, 2009
2,806
Most adjustable voltage regulators can be used as constant current sources by calculating a sense resistor to go in series with the load so that the voltage across it is equal to Vref (1.23V for the LM2576).
If you add an op amp (see link) you can use a smaller sense resistor for less wasted power, which also makes it easier to adjust the current.
www.ti.com/lit/ds/symlink/lm2576.pdf
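A quick worked example of that sense-resistor calculation, with the target LED current as an assumed value:

```python
# Sense resistor for running an adjustable regulator as a constant-current source:
# pick R_SENSE so the feedback reference appears across it at the target current.
# The target LED current is an example assumption.
V_REF = 1.23             # V, LM2576-ADJ feedback reference
I_LED = 1.0              # A, desired LED current (assumed)

r_sense = V_REF / I_LED  # ohms
p_sense = V_REF * I_LED  # W wasted in the sense resistor
print(f"R_sense = {r_sense:.2f} ohm, dissipating {p_sense:.2f} W")
# The op-amp variant in the linked datasheet amplifies a smaller shunt voltage
# up to V_REF, which cuts this wasted power.
```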
 

Thread Starter

takao21203

Joined Apr 28, 2012
3,702
LEDs are supposed to be powered by a current source, not a voltage source.

LEDs set their own voltage, which decreases as they heat up, which causes their current to increase when powered from a voltage source. Higher temperature causes higher current, which causes higher temperature, which causes higher current, which causes ...
It is called "thermal runaway".
The topic here is overdriving LEDs, power LEDs in this case.

This has to happen in a way where the actual current is well known.
 

THE_RB

Joined Feb 11, 2008
5,438
If you use an MC34063 as the SMPS IC, it has one resistor that sets the current limit, and it will happily drive a power LED as a constant-current buck driver.
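For reference, the usual sizing based on the MC34063 datasheet relation Rsc = 0.3V / Ipk(switch); the peak switch current below is an assumed example:

```python
# Current-limit sense resistor for an MC34063-based buck LED driver,
# using the datasheet relation Rsc = 0.3 V / Ipk(switch).
# The chosen peak switch current is an example assumption.
V_SENSE_LIMIT = 0.3   # V, typical Ipk sense threshold of the MC34063
I_PK_SWITCH = 2.0     # A, chosen peak switch current (assumed)

r_sc = V_SENSE_LIMIT / I_PK_SWITCH
print(f"Rsc = {r_sc:.2f} ohm")  # 0.15 ohm
```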
 

thatoneguy

Joined Feb 19, 2009
6,359
LEDs are supposed to be powered by a current source, not a voltage source.

LEDs set their own voltage, which decreases as they heat up, which causes their current to increase when powered from a voltage source. Higher temperature causes higher current, which causes higher temperature, which causes higher current, which causes ...
It is called "thermal runaway".
I've essentially given up on this topic with takao. You can probably see why by reading this thread. :D
 

vrainom

Joined Sep 8, 2011
126
I've seen 60°C in some datasheets as the maximum recommended temperature, but the actual temperature of the die is going to be hard to know without knowing the thermal resistance of the device. What temperature are you aiming to sustain in the LEDs?

Also, why not try high-current pulses? Continuous current is going to shorten your LEDs' life a lot. In a datasheet I saw for a 10W white LED, the recommendation was 3A at 10% duty cycle @ 10kHz. This is where a lux meter would come in handy, to compare luminous output and degradation over time.
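For comparison, here is what that pulsed scheme averages out to; the forward voltage used below is an assumption:

```python
# Averages for the pulsed drive quoted above: 3 A peak, 10% duty, 10 kHz.
# The string forward voltage is an assumed value.
I_PEAK = 3.0     # A
DUTY = 0.10      # 10% duty cycle
FREQ = 10_000    # Hz
VF = 10.0        # V, assumed forward voltage at the pulse current

i_avg = I_PEAK * DUTY         # average current
p_avg = VF * I_PEAK * DUTY    # average electrical power
t_on = DUTY / FREQ            # on-time per pulse
print(f"Average current: {i_avg:.2f} A")          # 0.30 A
print(f"Average power:   {p_avg:.1f} W")          # 3.0 W
print(f"On-time per pulse: {t_on * 1e6:.0f} us")  # 10 us
```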

By the way, what brand is that thermal glue you're using?
 

Audioguru

Joined Dec 20, 2007
11,248
A 10% duty cycle dims the LED. A lux meter is linear and shows the light output. But our vision's response to brightness is not linear; it is logarithmic.
 

thatoneguy

Joined Feb 19, 2009
6,359
He should make a video of the LEDs burning up.
With white LEDs, it's a slow process. A fun thing to do is buy 2 of the $5 high-output 3W LED lights from Amazon. Don't turn one on at all. Run the other one until the batteries are dead, a few times, letting it get warm but not too hot to touch.

After about 2 weeks of this, put new batteries in both the unused light and the one you used. Turn them on, point at a white wall 8 feet away and take a picture. Results are sadly humorous.

Brightness between the two drops by at least 50%, but people who only use the one light don't notice it until it drops to about 30% output. And since most people don't use a light that frequently, the warranty is gone by then, and it "seems bright" the entire time when they have nothing else to compare it to.

They don't burn out; they just put out "warmer" light. For some reason the hue goes yellow, then red, then dim orange, with less visible light overall (UV output increases).

For semiconductor colors, the intensity changes but the wavelength doesn't, because the doping produces the same frequency of light, just at a lower amplitude over time. If you don't compare to new/unused lights after a lot of use, the drop in output isn't noticeable with very bright LEDs until they either fail completely or drop about two-thirds in output.
 

vrainom

Joined Sep 8, 2011
126
A 10% duty cycle dims the LED. A lux meter is linear and shows the light output. But our vision's response to brightness is not linear; it is logarithmic.
I know it does, but that's the maximum pulsed IF allowed in the datasheet. And that's exactly why you need a lux meter: to measure the effective luminous output of the light source and not just have a subjective opinion.
 