Minimum current to power 1W LED

Thread Starter

Bhope691

Joined Oct 24, 2016
38
Hi,

I am looking to find the minimum current that will light a 1W star LED (https://www.kitronik.co.uk/blog/how-to-use-1w-star-led/)

The data says that it draws a minimum of 310mA, but when I connect it to a variable bench-top DC power supply set at 3V and 0A, the LED lights up (not super bright, but noticeable). The voltage reading on the power supply says 2.5V and the current reads 0A (the current display resolves to 1mA).
When I measure the current with an ammeter it is around 500uA.

The data sheet (http://www.farnell.com/datasheets/1636581.pdf) doesn't show the current at 2.5V.

Why is the LED lighting up with a lot less current than the stated minimum draw and how would I find the absolute minimum current required to light the LED?
 

bertus

Joined Apr 5, 2008
22,885
Hello,

Did you see these graphs in the posted datasheet?

[Attached image: 1W LED current vs luminance graph from the datasheet]

At lower current the voltage drop will be lower and the luminance will be lower.
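Bertus's point can be sketched with the ideal-diode (Shockley) equation. The saturation current and ideality factor below are made-up illustrative values, not datasheet figures, and a real power LED also has series resistance that this sketch ignores:

```python
import math

def led_vf(current_a, i_sat=1e-20, n=2.5, v_t=0.02585):
    """Forward voltage from the Shockley equation: I = Is*(exp(V/(n*Vt)) - 1).

    i_sat (saturation current) and n (ideality factor) are assumed,
    illustrative values only, chosen to land near typical 1W-LED numbers.
    """
    return n * v_t * math.log(current_a / i_sat + 1)

for i in (350e-3, 10e-3, 500e-6, 1e-6):
    print(f"{i * 1e3:8.3f} mA -> Vf ≈ {led_vf(i):.2f} V")
```

With these made-up constants the model lands near the observed behaviour: roughly 2.9V at the rated 350mA and about 2.5V at 500µA, with the forward voltage falling steadily as the current falls.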

Bertus
 

Sensacell

Joined Jun 19, 2012
3,768
LEDs are current-driven devices; almost any current flow will yield SOME light output.
There is no "minimum" current spec on the datasheet because it would have no meaning.

I am often surprised by how brightly some of these modern LEDs can glow from even minute leakage currents.

Also note that the Vf is specified at "normal" operating current levels; it can be lower outside the normal range, and they don't bother drawing the curves that far down.
 

DickCappels

Joined Aug 21, 2008
10,661
I find that the 1 W LEDs I have around here glow noticeably with just a few hundred microamps running through them, which tends to support the luminance-vs-current curve shown in post #2 passing right through the origin.
 

OBW0549

Joined Mar 2, 2015
3,566
Why is the LED lighting up with a lot less current than the stated minimum draw and how would I find the absolute minimum current required to light the LED?
There is no absolute minimum current required to make an LED emit light; using a sensitive photodetector, I've detected light output from LEDs (green "ultrabright" indicator lamps) driven by currents as low as one nanoampere. They probably emit at even lower currents, but that's where my photodetector bottomed out.

As to how much current is needed for the LED to emit light that's visible, that's another story: your eyes have a threshold below which they cannot sense anything, and that threshold varies with dark adaptation and also with ambient illumination. You'll just have to experiment to find the LED voltage and current that gives you the result you want.

BTW, driving LEDs straight from a regulated voltage source is not a good idea; LEDs emit light proportional to the current flowing through them, and the relationship between LED voltage and current is highly dependent on temperature and varies considerably from unit to unit. For best results (and to avoid possibly damaging the LED), use a supply that regulates current rather than voltage.
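The sensitivity OBW0549 describes can be illustrated numerically. The ~65 mV effective slope below is an assumed round number for a power LED, not a datasheet value:

```python
import math

N_VT = 0.065  # assumed effective slope (n * thermal voltage), volts

def current_ratio(delta_v):
    """Factor by which diode current changes when Vf shifts by delta_v volts."""
    return math.exp(delta_v / N_VT)

for dv in (0.010, 0.050, 0.100):
    print(f"+{dv * 1000:.0f} mV on Vf -> current multiplied by {current_ratio(dv):.1f}")
```

A shift of only 100mV, easily caused by temperature drift or unit-to-unit spread, multiplies the current by nearly five, which is why a stiff voltage source makes a poor LED driver.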
 

MisterBill2

Joined Jan 23, 2018
27,186
The currents published in the specifications are what the manufacturer will guarantee for operation at the claimed performance levels. Outside of those spec sheet levels there is no promise of any particular performance. So yes, the LED will deliver some light at a much lower voltage and current, but it will not be delivering the promised amount of luminance. OBW is quite right.
 

Thread Starter

Bhope691

Joined Oct 24, 2016
38
Thanks for the above, and with all that in mind I did some further tests and found that one LED wouldn't light with less than 0.2A and 2.45V, through a bench top supply.

However, when I attach it to a rectified AC source (no smoothing capacitor) it lights up brighter than when it is attached to the bench-top supply, but the voltage drop across the LED is only 7mV and the measured current is around 1µA. (I tried two different multimeters and they both gave very similar values.)

Why does it light up on the rectified AC source at such a lower voltage and current than with the bench-top supply? Are both my multimeters wrong, and can I assume more than 0.2A and 2.45V is actually being delivered?
 

mcgyvr

Joined Oct 15, 2009
5,394
The forward voltage should be in the 2.9-3.9V range per the datasheet; anything outside of that really suggests a faulty diode or a measurement error.
You are setting your bench top supply to constant current mode right?

What is the nature of this rectified AC source?
What are you doing to limit current when using it?

I'm not sure if you understand that an LED behaves like a diode, as it should, and not like an incandescent bulb or other resistive load.
 

Thread Starter

Bhope691

Joined Oct 24, 2016
38
Yeah the bench top supply is constant current. It's not particularly bright at that voltage but visible. Below that voltage or current the light is not visible.

The rectified source is a simple homemade AC generator. No current limiter.

I'm confused as to why the multimeter is showing such a small voltage drop across the LED and such a small current even though it is lighting; if the readings were correct, I would expect the LED not to light at all.
 

MisterBill2

Joined Jan 23, 2018
27,186
The reason that the LED appears to light up with a lower rectified AC voltage is that you are not reading the peak volts, but probably the average voltage, which is much less. And if you have a half-wave rectified voltage, FAR less than the peak voltage. Because of the way vision works, you see the peak intensity. It may even be possible to damage the LED if the peak current is excessive. On the other hand, that is a way to get more brightness while staying within the maximum power ratings. What you need now is a means to read peak voltage or peak current.
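The average-versus-peak gap can be made concrete. The 10 V peak below is just an assumed example amplitude:

```python
import math

v_peak = 10.0  # assumed peak of the AC waveform, volts

# Averages of a rectified sine over one full input cycle:
v_avg_full = 2 * v_peak / math.pi  # full-wave rectified, ~0.637 * peak
v_avg_half = v_peak / math.pi      # half-wave rectified, ~0.318 * peak

print(f"peak:              {v_peak:.2f} V")
print(f"full-wave average: {v_avg_full:.2f} V")
print(f"half-wave average: {v_avg_half:.2f} V")
```

For a clean sine the average is still a sizable fraction of the peak, but for the narrow spikes a crude generator can produce, the average can sit far below the peak, which is one way a meter can read millivolts while the LED briefly sees volts.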
 

Thread Starter

Bhope691

Joined Oct 24, 2016
38
I'm using a full-bridge rectifier and a true-RMS multimeter. So the voltage I am reading is 7mV, which is the RMS value, and as Les mentions the peak is 7 × 1.414, still only 9.9mV.

So it is possible that there are larger voltage and current peaks than the 9.9mV and this is what I need to find a means of measuring? They are occurring fast enough to make the LED flicker slightly.
 

kubeek

Joined Sep 20, 2005
5,796
... or something else, because no LED will produce any light at 7mV.
Set your power supply to 3V and 1mA, then measure the actual voltage at the LED. Then increase the current and keep measuring the voltage, noting when the LED actually starts to light. If you don't trust the built-in ammeter, use another DMM to measure the current.
 

MisterBill2

Joined Jan 23, 2018
27,186
It is certainly true that measuring the peak voltage and current is challenging, so the question becomes: why do it? Select a reasonable and convenient supply voltage and then a series resistor that makes the LED deliver the amount of light you want while not exceeding the specified DC forward current. Then wait 50,000 hours to see if the brightness drops too much. While exact numbers may be comforting, the proof is in the performance.
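The resistor-selection arithmetic can be sketched like this; the 5 V supply is an assumed example, while Vf = 3.2 V and 310 mA are the nominal figures from the Kitronik page:

```python
# Series-resistor sizing for the 1W star LED (a sketch, not a full design).
v_supply = 5.0   # assumed supply voltage, volts
v_f = 3.2        # nominal forward voltage, volts
i_led = 0.310    # nominal operating current, amps

r_series = (v_supply - v_f) / i_led    # ohms
p_resistor = (v_supply - v_f) * i_led  # watts burned in the resistor

print(f"R ≈ {r_series:.1f} ohm, dissipating ≈ {p_resistor:.2f} W")
```

That works out to about 5.8 Ω at roughly 0.56 W, so a 5.6 Ω or 6.8 Ω power resistor rated for a couple of watts would be the practical pick.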
OR, you can look at the voltage across the LED with an accurately calibrated oscilloscope and gain quite an education. Or use a not so accurate scope and still learn a lot, but without the exact numbers.
 

Colin55

Joined Aug 27, 2015
519
The LED is much more sensitive than your measuring equipment.
An LED will start to glow as soon as the voltage reaches the absolute minimum for the particular chip.
It can be as low as 2.5V for a 1 watt LED, and at that point a current will start to flow.
None of us knows the exact minimum for the LED you are using, and no one on the planet knows either.
It is just a phenomenon we have to accept.
 

DickCappels

Joined Aug 21, 2008
10,661
The LED is much more sensitive than your measuring equipment.
An LED will start to glow as soon as the voltage reaches the absolute minimum for the particular chip.
(Some text removed for clarity)
This brings to mind something that we have not talked about: damaged LEDs, especially those damaged by ESD. ESD can cause an LED to draw current through the damaged section, and that current does not contribute to the generation of light.

Imagine a 1k resistor (as an example) across a red LED that requires 1.8V before any detectable light will be emitted. The LED will "waste" the first 1.8 milliamps just heating up the damaged part of the LED. At a lower current, the voltage will not be sufficient for the generation of light.

This model is supported by a statement in a Nichia application note (attached).

When evaluating performance characteristics of LEDs in the product’s application, please check whether the
LEDs have been damaged by ESD. This can be checked by testing forward voltage or lighting it up at low current.
(emphasis added)
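The shunt-resistor model above can be put into numbers; both values (1 kΩ and 1.8 V) are the post's own hypothetical example, not measurements:

```python
# Hypothetical ESD-damage model: a resistive path in parallel with the junction.
r_shunt = 1000.0  # ohms, hypothetical damage path
v_knee = 1.8      # volts needed across the junction before detectable light

i_wasted = v_knee / r_shunt  # current the shunt steals below the knee
print(f"the first {i_wasted * 1e3:.1f} mA only heats the damaged path")
```

Below that 1.8 mA essentially all the drive current flows through the damaged path, which is why checking Vf or light output at low current reveals ESD damage.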
 

Attachments

Tonyr1084

Joined Sep 24, 2015
9,744
I'm suspecting you misunderstood what the data sheet was saying:

The LED is a 1W LED and the forward voltage is rated at 3.0V to 3.4V so for the purpose of the calculations a forward voltage of 3.2V is used. The TYPICAL CURRENT of the LED will therefore be 310mA [{ not minimum }] (from Power = Current x Voltage). The power source used with the LED must therefore be able to deliver at least 310mA. … The resistor also needs to handle 310mA flowing through it and as a result will need to be a power resistor. Please note that during use both the star LED and the resistor will get hot and shouldn't be touched.
12 volts at 310 mA translates to 3.72 watts total. Even if your LED is "1 Watt", on a 12 volt supply the series resistor still has to dissipate the other ~2.7 watts, so you'd want at least a 5 watt power resistor. And like the article says at the end of the part I quoted, they WILL be hot.
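The 12 V arithmetic breaks down like this (Vf = 3.2 V and 310 mA are the quoted article's nominal figures):

```python
v_supply, v_f, i_led = 12.0, 3.2, 0.310  # volts, volts, amps

p_total = v_supply * i_led             # total power drawn from the supply
p_led = v_f * i_led                    # dissipated in the LED itself
p_resistor = (v_supply - v_f) * i_led  # the rest burns in the series resistor

print(f"total {p_total:.2f} W = LED {p_led:.2f} W + resistor {p_resistor:.2f} W")
```

Only about 1 W reaches the LED; the remaining ~2.7 W heats the resistor, hence the 5 W rating.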
 

-live wire-

Joined Dec 22, 2017
959
Just use an adjustable constant-current (CC) power supply, 4.5V max and 5-450 mA, built from a BJT and a potentiometer. This will let you control the brightness and find the ideal amount of current. In addition, CC drive gives the best lifetime and lower power losses.
 