# If LED intensity decreases over time, which parameter(s) will change?

Thread Starter

#### Nogrend

Joined Mar 31, 2017
1
Hello,

I’m working on a project about intelligent lighting. A short introduction: a lamp that monitors itself, asks for help if something goes wrong, and pays the service invoice using blockchain technology.

I have a question about the parameters an LED driver can provide.

Information on the internet tells me that LED intensity decreases over time. So, L70 at 50,000 hours means that around 70 percent of the initial light output is still emitted after 50,000 operating hours. 30 percent has disappeared, but that 30 percent of light energy can’t just be gone. Is that 30 percent converted to heat? Does the forward voltage change?

So, can the voltage across the LED, the current through the LED, or the junction temperature of the LED tell me something about the decreasing light intensity?
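For context, L70 figures are often projected with a simple exponential lumen-depreciation model (the form used in IES TM-21 projections); a quick sketch, assuming that model applies:

```python
import math

# Exponential lumen-depreciation model (the form used in IES TM-21 projections):
#   L(t) = exp(-alpha * t), L as a fraction of initial output, t in hours.
# "L70 at 50,000 h" means L(50_000) = 0.70; solve for alpha.
alpha = -math.log(0.70) / 50_000

def relative_output(hours):
    """Fraction of initial luminous output after `hours` of operation."""
    return math.exp(-alpha * hours)

print(round(relative_output(50_000), 2))   # 0.7 by construction
print(round(relative_output(100_000), 2))  # 0.49 if the same trend continues
```

This says nothing about *where* the missing 30% goes; it only describes the rate of decline.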

Thank you in advance.
Bram

#### ericgibbs

Joined Jan 29, 2010
12,883
hi,
Look thru this PDF, there is a lot of information on the web about this problem.
E


#### MrChips

Joined Oct 2, 2009
23,500
This could be a complex issue with no clear answer. One way to find out is to conduct an actual test. However, 6 years is a long time to wait for an answer.

If you want to know what becomes of the 30% then you need to measure the efficiency of the LED, that is, how much electrical power is converted into luminous power. I suspect that the problem is not just about efficiency.

Every LED has a characteristic I-V curve that, mathematically, is an exponential function.

The operating point is a fixed current and voltage on the curve, always somewhere on the rising part. Since the slope of the curve is large at all points beyond the "knee", a tiny change in the operating voltage results in a significant change in the operating current. For this reason, LED drivers are designed as constant-current sources. This keeps the operating current and voltage at a fixed point.
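The steepness beyond the knee can be illustrated with the ideal-diode (Shockley) equation; a rough sketch where the saturation current and ideality factor are purely illustrative values (not from any datasheet), chosen so the LED draws about 20 mA at 3.0 V:

```python
import math

# Shockley diode equation: I = Is * (exp(V / (n * Vt)) - 1)
# Parameter values are illustrative only, not from a real LED datasheet.
Is = 1.26e-27  # saturation current (A), hypothetical, chosen for ~20 mA at 3.0 V
n = 2.0        # ideality factor, hypothetical
Vt = 0.02585   # thermal voltage at ~300 K (V)

def led_current(v):
    """Diode current (A) at forward voltage v (V), ideal Shockley model."""
    return Is * (math.exp(v / (n * Vt)) - 1.0)

i1 = led_current(3.00)
i2 = led_current(3.05)  # just +50 mV on the forward voltage
print(i2 / i1)  # current rises by roughly 2.6x for a 50 mV step
```

That 2.6x jump in current for a 50 mV step is exactly why drivers regulate current rather than voltage.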

If the operating point is pegged to the I-V curve, why would the luminous intensity output decrease over the years?

I will guess that at least two things are happening.

Firstly, the I-V curve changes. Since the LED driver is a constant current source, the operating current will remain constant. However, the operating voltage will shift to match the I-V curve.

Secondly, the efficiency will also change.

How the two interact, I do not know.

To accelerate the experiment, one can increase the operating current in an attempt to speed up the aging effect on the LED. Mind you, the effect of increased current vs aging is likely to be a non-linear relationship.

#### dl324

Joined Mar 30, 2015
12,871
Information on the internet tells me that LED intensity decreases over time. So, L70 at 50,000 hours means that around 70 percent of the initial light output is still emitted after 50,000 operating hours. 30 percent has disappeared, but that 30 percent of light energy can’t just be gone. Is that 30 percent converted to heat? Does the forward voltage change?

So, can the voltage across the LED, the current through the LED, or the junction temperature of the LED tell me something about the decreasing light intensity?
Decreasing luminous intensity is caused by device wearout and LED lifetimes are defined to be when intensity has dropped to 50% of nominal. I've never read anything that indicated that the forward voltage changed.

It'll be interesting if someone finds a scientific reference.

#### ebeowulf17

Joined Aug 12, 2014
3,276
I haven't had time to read the whole thing, but here's a paper which analyzes Vf changes over time, among other things.

https://www.researchgate.net/public...light_emitting_diodes_with_vertical-structure

There are definitely dramatic Vf curve changes in at least one of their scenarios, but I think it may be an extreme case, not within normal working parameters. I'll try to read more later. Here's an interesting graph:

#### Alec_t

Joined Sep 17, 2013
12,060
Decreasing luminous intensity is caused by device wearout
I'd be interested to know why there is wearout. Atomic migration within the semiconductor material? Absorption of environmental impurities? .... ?

#### wayneh

Joined Sep 9, 2010
17,152
So, can the voltage across the LED, the current through the LED, or the junction temperature of the LED tell me something about the decreasing light intensity?
My hunch: No. I don't believe there's anything you can measure, besides the intensity of light produced, that would give you a reliable estimate of LED degradation from aging. I think an old LED just starts to look more like a diode and less like a light-emitting diode. A resistance making heat but no light.

Just a hunch. Would be glad to see the data.

#### ebeowulf17

Joined Aug 12, 2014
3,276
@Alec_t , if I've understood it correctly (that's a big if), the paper I linked in post #2 says that microscopic imperfections in the original material, plus time and temperature, result in more and more imperfections. These imperfections all have a larger band gap than the one needed to create visible light, so instead of emitting photons when electrons jump the gap, they emit phonons, which seem to loosely translate into heat, I think...

This is all over my head. I honestly thought "phonons" was a typo when I first read it! Interesting stuff though. Anyway, the paper in post #2 appears to describe the actual mechanisms of LED degradation in detail.

@wayneh, check out the paper I linked in post 7 and let me know what you think of that. I haven't wrapped my head around it yet, but it seems to indicate that I-V curves do change with degradation... at least in some cases... maybe only extreme ones? Like I said, I need to read and re-read it in depth before I expect to have any idea what's going on, but it looks promising.

#### Alec_t

Joined Sep 17, 2013
12,060
The explanation in the post #2 link seems reasonable. Thanks for drawing my attention to it.

#### wayneh

Joined Sep 9, 2010
17,152
@wayneh, check out the paper I linked in post 7 and let me know what you think of that. I haven't wrapped my head around it yet, but it seems to indicate that I-V curves do change with degradation... at least in some cases... maybe only extreme ones? Like I said, I need to read and re-read it in depth before I expect to have any idea what's going on, but it looks promising.
Hmmm... That's a lot to slog through and I think there are indeed some tidbits suggesting you might be able to detect LED aging by having the supply monitor the LED closely enough over time. But I didn't get the feeling it would be easy. You might send a note to the authors and ask them directly what they think of the practicality of the idea.

#### MisterBill2

Joined Jan 23, 2018
8,709
I can sort of see wanting to monitor the output of an LED, but the concept of automatic payment and automatically calling a service company is just plain silly, that is to say, totally ridiculous. If this is a school project I can see some reason to continue; if not, no reason.

And what changes on an LED that is not damaged is simply that less light is delivered. So really, a light-intensity-measuring circuit would be required. But the monitor circuit will probably change more than the LED does over the long product lifetime. So the proposal sounds much more like a solution in search of a problem.

#### MrChips

Joined Oct 2, 2009
23,500
The research paper in post #7 has the data I was interested in.

As the LED ages, the I-V curve shifts to the right. That is, if the LED driver has to maintain a constant current, the forward voltage applied to the LED has to increase. The applied power (I × V) is then higher, counteracting the decrease in luminous efficiency that is ongoing as the LED ages.

Thus, to answer the question asked by the TS:

1) If the LED driver is a constant current source, a rising voltage would indicate that the LED is aging. The increasing I x V power may not be sufficient to compensate for the decreasing luminous efficiency.

2) If the LED driver is a constant voltage source, measuring the current ought to give some indication of the aging effects. Raising the voltage to maintain the same current is in effect what a constant current source would achieve.
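A minimal sketch of scheme 1, assuming the fixture's controller can sample the string's forward voltage; the baseline and the 5% drift threshold are arbitrary placeholders, not vetted values:

```python
# Sketch of scheme 1: with a constant-current driver, compare the measured
# forward voltage against a baseline recorded when the fixture was new.
# BASELINE_VF and DRIFT_THRESHOLD are arbitrary placeholders, not vetted values.

BASELINE_VF = 3.00      # forward voltage when the fixture was new (V), assumed
DRIFT_THRESHOLD = 0.05  # flag service at +5% relative drift, arbitrary choice

def needs_service(measured_vf):
    """Return True if Vf has drifted more than DRIFT_THRESHOLD above baseline."""
    drift = (measured_vf - BASELINE_VF) / BASELINE_VF
    return drift > DRIFT_THRESHOLD

print(needs_service(3.02))  # False: within normal spread
print(needs_service(3.20))  # True: ~6.7% above baseline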

Increasing the applied power in order to compensate for the falling luminous efficiency will of course accelerate the aging effect. Is this a beneficial method of compensating for falling luminous output? That I do not know.

I believe that the take-away from this is that it would be better to operate the LED at lower power levels if the goal is to prolong the LED's usable lifespan.

#### MisterBill2

Joined Jan 23, 2018
8,709
The summary from MrChips matches the conclusion that comes from many different sources, which is that operation below maximum output will result in longer LED life. That makes perfect sense and is similar to the way a great many things work. Output versus lifetime is almost always a trade-off; LEDs are no exception.

#### oz93666

Joined Sep 7, 2010
737
Hello,
I’m working on a project about intelligent lighting. A short introduction: a lamp that monitors itself, asks for help if something goes wrong, and pays the service invoice using blockchain technology.
The best way to approach this is to design a lighting system that doesn't go wrong!! ...It is possible ....

This chart shows the effects of overdriving a 20 mA LED:

But you can see the trend: by under-driving, the graph line approaches horizontal ... driving an LED at 15% of its rated current will give well over 50 years of life ... it is also about 30% more efficient (lumens per watt of input) at these lower currents ... the savings in electricity justify this ...

So for a 100 W lamp you will need to spend $14, not $2, on the LED chips, plus a smaller heat sink, and you get a life of 50 years instead of 5 years, along with a saving in power consumption ... it could be designed so that if one LED chip fails the others stay working ...

The alternative doesn't make sense ... how many hundred dollars does it cost to call someone out to fix an expired LED light????
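The cost argument above can be put into rough numbers; every figure below is an illustrative back-of-envelope assumption (duty cycle and electricity price are invented, the rest are taken from the post), not measured data:

```python
# Back-of-envelope comparison of hard-driven vs. under-driven LED chips.
# Chip costs, lifetimes, and the 30% efficacy gain come from the post above;
# hours of use and electricity price are invented assumptions.

chip_cost_hard = 2.0    # $ of LED chips, driven near rated current
chip_cost_soft = 14.0   # $ of chips, under-driven (more chips for same lumens)
life_hard = 5           # years of useful life, hard-driven (assumed)
life_soft = 50          # years of useful life, under-driven (assumed)

power_hard = 100.0            # W for the hard-driven lamp
power_soft = power_hard / 1.3 # W for the same lumens, ~30% better efficacy

hours_per_year = 4000         # assumed duty
price_per_kwh = 0.15          # $ per kWh, assumed

def yearly_energy_cost(watts):
    """Yearly electricity cost in $ for a constant load of `watts`."""
    return watts / 1000 * hours_per_year * price_per_kwh

saving = yearly_energy_cost(power_hard) - yearly_energy_cost(power_soft)
payback = (chip_cost_soft - chip_cost_hard) / saving
print(f"energy saving: ${saving:.2f}/year, extra chips repaid in {payback:.1f} years")
```

Under these assumptions the extra chip cost pays for itself in under a year, well before either lifetime figure matters.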

#### MisterBill2

Joined Jan 23, 2018
8,709
Present illumination LEDs do run at a lot more than 20 mA, but the principle is the same for those LEDs that run at 100 mA or 250 mA. I am working on a couple of high-power arrays that have failed, which contain 3 strings of 32 LEDs in series across some unknown, fairly high voltage. In each string one or more LEDs has opened up, which shows that they are being pushed quite hard. Since the installation has about 52 of these 48-inch-long lights, I am attempting to find a way of repairing them. Unfortunately, since they work with 120 volts AC between the ends, they do not conform to current safety rules and so are only available from one source that I am aware of. But the failed lights are an interesting source of very bright surface-mount LEDs.
What this shows is that pushing the limit certainly shortens the lifetime of LEDs and leads to a different failure mode: total failure instead of lower output. And I am not trying to hijack this thread.

#### wayneh

Joined Sep 9, 2010
17,152
I seriously wonder if a slight (how large?) increase in voltage from a CC supply, observed years later, could really be used to judge the age status of an LED. Maybe if you logged a datapoint once a month or such. The presence of an increasing voltage trend over time would increase the confidence in the final datapoint. But what if we're talking about only 0.5V or so? Would you really want to make a decision based on a 0.5V change observed over a >5-yr period? Maybe if you incorporate a reliable internal voltage reference.

#### ebeowulf17

Joined Aug 12, 2014
3,276
I seriously wonder if a slight (how large?) increase in voltage from a CC supply, observed years later, could really be used to judge the age status of an LED. Maybe if you logged a datapoint once a month or such. The presence of an increasing voltage trend over time would increase the confidence in the final datapoint. But what if we're talking about only 0.5V or so? Would you really want to make a decision based on a 0.5V change observed over a >5-yr period? Maybe if you incorporate a reliable internal voltage reference.
In one of the graphs, it looks like the current at voltages below the typical Vf range (tunneling current?) changes much more noticeably throughout the life of the LED than the changes in the normal operating range.

I wonder if it would be possible to estimate the life of the LED more accurately by periodically testing it at a specific CC or CV that lands in that region. It looks like that voltage might be around 1.5 V, and the current would be... Oh. That's a really low current to try to measure accurately! Maybe my idea would only work in a lab, not in a light fixture.