IR LED driver

Thread Starter

aag

Joined Aug 11, 2018
18
I have purchased a few of these infrared 850 nm LEDs. The 20 W chip is specified for a forward voltage of 16-18 V and a forward current of 550-700 mA. However, if I drive it at 16 V the current is almost exactly 1 A. If I clamp the current to 500 mA (with a lab power supply) while keeping the voltage at 16 V, I do not see any appreciable reduction in IR power (but I could be wrong; I just looked at it through a smartphone camera in the dark).

Does this mean that any current above 500 mA is simply dissipated as heat? If so, should I add current-limiting circuitry to the LED driver? And how would I do that? Sorry for the N00b question; my field is molecular biology and I don't know much about circuits :)

[Attached image: 1603624417297.png]
 

AlbertHall

Joined Jun 4, 2014
12,346
If I clamp the current to 500 mA (with a lab power supply) while keeping the voltage at 16V,
When the current was limited what did the voltage drop to?
You seem to be saying that when the current was reduced the voltage stayed the same. This is most unlikely. The supply setting may be the same, but the actual output voltage would have dropped.
 

Thread Starter

aag

Joined Aug 11, 2018
18
Ahhh, you are of course right. Let me repeat the measurement right away, but with a voltmeter in between...
 

jpanhalt

Joined Jan 18, 2008
11,087
The LED requires a current-limiting resistor. That is what your PS is doing. Over-driving the LED without a current limit will not produce much more light, but it will reduce the life of the LED.
 

bertus

Joined Apr 5, 2008
22,277
Hello,

It can also be that the camera is saturated and cannot see the difference anymore.
As said, LEDs are current-driven devices, and driving them from a constant voltage can lead to thermal runaway and destroy the LED.

Bertus
 

Thread Starter

aag

Joined Aug 11, 2018
18
thank you all for your immediate and helpful advice! I have repeated the measurement and, as you predicted, the voltage drops to 14V when clamping the current to 0.5A. I suppose that I should add a resistor then. Can the specs of the resistor be calculated from the Forward Voltage (16-18V) and Forward Current (550-700mA) indicated by the supplier?
 

jpanhalt

Joined Jan 18, 2008
11,087
thank you all for your immediate and helpful advice! I have repeated the measurement and, as you predicted, the voltage drops to 14V when clamping the current to 0.5A. I suppose that I should add a resistor then. Can the specs of the resistor be calculated from the Forward Voltage (16-18V) and Forward Current (550-700mA) indicated by the supplier?
Yes, the resistor can be calculated if you also know what the driving/applied voltage (Vd) will be. Let Vf = forward voltage and A = desired current.

R = (Vd - Vf)/A

I usually use a larger resistor than calculated, unless you want maximum brightness.

EDIT: Your power supply has already provided that answer. Its output went to 14 V in order not to exceed 500 mA. So a 2 V drop at 500 mA was needed. That is 2 V / 0.5 A = 4 Ω. Remember that the heat created by the resistor is I^2 x R = 0.5^2 x 4 = 1 W. The resistance of your source is in series with that resistor, but you don't know what that is, so start with a 4 Ω resistor rated for at least 1 W. Resistors operated at their full wattage rating will get hot.
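
If it helps, here is a minimal Python sketch of that same calculation, using only numbers already mentioned in this thread (16 V supply, roughly 14 V across the LED at the 0.5 A target). Treat it as a sanity check rather than a design tool:

```python
# Series-resistor sizing for the LED in this thread.
# Assumed values (taken from the posts above, not from a datasheet):
v_supply = 16.0    # lab supply voltage (V)
v_forward = 14.0   # measured LED forward voltage at the target current (V)
i_target = 0.5     # desired LED current (A)

# The resistor must drop the difference between supply and LED voltage:
r_series = (v_supply - v_forward) / i_target   # R = (Vd - Vf) / A
p_resistor = i_target ** 2 * r_series          # P = I^2 * R

print(f"Series resistor: {r_series:.1f} ohm")       # -> 4.0 ohm
print(f"Resistor dissipation: {p_resistor:.2f} W")  # -> 1.00 W
```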
 
Last edited:

Thread Starter

aag

Joined Aug 11, 2018
18
Thank you for your kind answer! But can't I just deduce the resistance of the load by measuring the current at a range of given voltages?

Also, since I know that 14 V will give me more or less the correct current, does anything speak against just driving the load with a 14 V source and no resistor? After all, if the resistance of the load goes down dramatically for some reason, a 4 ohm resistor will not make much of a difference, right?
 

ElectricSpidey

Joined Dec 2, 2017
2,779
The main output from those LEDs is invisible. The light you can see might not be changing much, but I guarantee that if you double the current, the light output will increase.
 

Thread Starter

aag

Joined Aug 11, 2018
18
Regrettably I don't have a datasheet; I cannot even find out the manufacturer and model # of the LED! What I did, however, was measure the current (and the inferred resistance) as a function of voltage. Here is the result:

[Attached image: 1603648556057.png — measured current and inferred resistance vs. voltage]
 

jpanhalt

Joined Jan 18, 2008
11,087
Look at the blue line (current). That is non-ohmic behavior. That is, it does not fit Ohm's law, E = IR. A small increase in voltage (12 V to 16 V, about a 33% increase) causes a huge increase in current.

A resistor is ohmic. That is why you need a resistor as current control rather than just reducing the source voltage.
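
To make that concrete, here is a small Python sketch comparing an ohmic element with a diode-like one. The two exponential constants are assumptions, chosen only so the curve passes near the two points measured earlier in this thread (about 0.5 A at 14 V and 1 A at 16 V); they are not from any datasheet:

```python
import math

# Illustrative comparison of ohmic vs. diode-like (non-ohmic) behavior.
# The constants below are assumed, picked so the exponential roughly matches
# the measurements reported in this thread (~0.5 A @ 14 V, ~1 A @ 16 V).
I_SCALE = 3.9e-3   # assumed scale current (A)
V_SLOPE = 2.885    # assumed exponential slope (V per factor of e)

def led_like_current(v):
    """Diode-like element: current grows exponentially with voltage."""
    return I_SCALE * math.exp(v / V_SLOPE)

def resistor_current(v, r=16.0):
    """Ohmic element: a 16 ohm resistor that also passes 1 A at 16 V."""
    return v / r

for v in (12.0, 14.0, 16.0):
    print(f"{v:4.1f} V   diode-like: {led_like_current(v):4.2f} A"
          f"   ohmic: {resistor_current(v):4.2f} A")

# Going from 12 V to 16 V (a 33% increase) roughly quadruples the
# diode-like current, while the ohmic current rises by only that same 33%.
```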
 