Hi,
I am looking to find the minimum current that will light a 1W star LED (https://www.kitronik.co.uk/blog/how-to-use-1w-star-led/)
The data says that it draws a minimum of 310 mA, but when I connect it to a variable benchtop DC power supply set to 3 V with the current limit at 0 A, the LED lights up (not super bright, but noticeable). The voltage readout on the power supply shows 2.5 V and the current reads 0 A. (The current is displayed to 3 decimal places, so 1 mA is the smallest it can show.)
When I measure the current with an ammeter in series, it reads around 500 µA.
The data sheet (http://www.farnell.com/datasheets/1636581.pdf) doesn't show the current at 2.5V.
Why does the LED light up at far less current than the stated minimum draw, and how would I find the absolute minimum current required to light it?
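For context, my rough intuition is that an LED's I-V curve is exponential, so a small drop in forward voltage should cut the current by orders of magnitude. Here is a quick Python sketch using the Shockley diode equation; the saturation current and ideality factor are made-up illustrative values (not from the datasheet), picked only to show the shape of the curve:

```python
import math

def diode_current(v, i_s=1.3e-14, n=4.0, t=300.0):
    """Shockley diode equation: current (A) at forward voltage v (V).

    i_s (saturation current) and n (ideality factor) are assumed,
    illustrative values, NOT taken from the LED's datasheet.
    """
    k = 1.380649e-23      # Boltzmann constant, J/K
    q = 1.602176634e-19   # elementary charge, C
    vt = k * t / q        # thermal voltage, ~25.9 mV at 300 K
    return i_s * (math.exp(v / (n * vt)) - 1.0)

# Near the rated forward voltage the current is hundreds of mA,
# but at the 2.5 V I measured it collapses to the microamp range.
i_rated = diode_current(3.2)
i_dim = diode_current(2.5)
print(f"I at 3.2 V: {i_rated * 1e3:.0f} mA")
print(f"I at 2.5 V: {i_dim * 1e6:.0f} uA")
```

With these assumed parameters, a 0.7 V drop in forward voltage reduces the current by roughly three orders of magnitude, which would be consistent with seeing a dim glow at ~500 µA.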
