> Also, the lower the drive current, the more efficiently the LED operates.

Actually, (luminous) efficiency increases at higher currents, until current density becomes a factor.
> Dang. Another great theory shot down by reality.

Isn't that what people are supposed to do: come up with theories about everyday things, and whoever's theory isn't knocked down wins?
More thought required here...
> Also, the lower the drive current, the more efficiently the LED operates.

Curious, I have heard the opposite: feeding the LED with 50:50 pulsed 10 mA gives more light than running it at a steady 5 mA, as the LED is more efficient at the higher current.
> Feeding the LED with 50:50 pulsed 10 mA gives more light than running it at a steady 5 mA, as the LED is more efficient at the higher current.

Most datasheets should have a graph showing relative luminous intensity vs forward current.
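The pulsed-vs-steady comparison can be checked numerically from such a datasheet graph. A minimal sketch: read a few (current, relative intensity) points off the curve, interpolate, and compare 50% duty at 10 mA against steady 5 mA. The curve values and the `rel_intensity` helper below are made up for illustration; substitute the points from your own LED's datasheet.

```python
# Illustrative (forward current in mA, relative luminous intensity) pairs.
# These numbers are hypothetical; read real values off the datasheet graph.
curve = [(0, 0.0), (5, 0.22), (10, 0.48), (20, 1.00), (30, 1.45)]

def rel_intensity(i_ma):
    """Linearly interpolate relative luminous intensity at a forward current."""
    for (i0, v0), (i1, v1) in zip(curve, curve[1:]):
        if i0 <= i_ma <= i1:
            return v0 + (v1 - v0) * (i_ma - i0) / (i1 - i0)
    raise ValueError("current outside curve range")

steady = rel_intensity(5)         # steady 5 mA drive
pulsed = 0.5 * rel_intensity(10)  # 10 mA at 50% duty: same 5 mA average current

print(f"steady 5 mA : {steady:.3f}")
print(f"pulsed 10 mA: {pulsed:.3f}")
```

With this made-up super-linear curve the pulsed drive averages more light for the same average current; a curve that flattens out at higher current would flip the result, which is exactly what the graphs in the thread are being read for.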
> Here's one for a green LED, chosen because it shows that efficiency isn't linear and trails off at higher current:

That graph seems to show efficiency rising all the way, not trailing off?
> That graph seems to show efficiency rising all the way, not trailing

My bad. It actually shows efficiency continuing to increase to the highest current shown.