Voltage range in Constant Current LED application

Thread Starter

natdf123

Joined Aug 27, 2023
3
I am an amateur needing some advice. I am powering a constant current LED that is rated 3.82 W – 350 mA – 10.95 Vf.
My question:
I am looking for a CC driver that is 0-10V dimmable, and I understand that 350mA is the required current, which I must maintain.
I am only able to find 350mA CC driver that has a higher wattage (higher voltage range) than my LED requires.
Is using a driver rated at 350mA with a higher voltage range (for example 50-90V or 45W) going to harm the LED?
If so, why is there seemingly no device on the general market that drives a CC LED at 350mA / 3.8 watts that is also dimmable (0-10V, analog, etc.)?
 

dl324

Joined Mar 30, 2015
16,156
Welcome to AAC!
Is using a driver rated at 350mA with a higher voltage range (for example 50-90V or 45W) going to harm the LED?
Constant current drivers will only provide the voltage required for your LEDs at 350mA. That's the way current sources work. The important thing is to make sure the open circuit voltage of the current source is high enough for your application. In this case 10.95V.
I am looking for a CC driver that is 0-10V dimmable, and I understand that 350mA is the required current, which I must maintain.
Current sources dim by varying current, or possibly PWMing the output. Whatever is on the output determines what the output voltage will be.
 

DickCappels

Joined Aug 21, 2008
10,104
The voltage rating of the LEDs must be lower than the maximum voltage of the constant current driver, and the current rating of the LEDs must be compatible with the output of the constant current driver. Just make sure you have the current right and the voltage will take care of itself.

By multiplying the current by the voltage you have the wattage. Make sure your LED with its heatsink can handle the power and you will be all set.
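As a quick check with the numbers from the first post, the rated operating point works out to roughly the rated wattage:

\( P = I \times V_f = 0.350\,\text{A} \times 10.95\,\text{V} \approx 3.83\,\text{W} \)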
 

Thread Starter

natdf123

Joined Aug 27, 2023
3
The voltage rating of the LEDs must be lower than the maximum voltage of the constant current driver, and the current rating of the LEDs must be compatible with the output of the constant current driver. Just make sure you have the current right and the voltage will take care of itself.

By multiplying the current by the voltage you have the wattage. Make sure your LED with its heatsink can handle the power and you will be all set.
So the current is set at 350mA by the driver. The voltage, however, is variable. The driver has a voltage range, in this case 50-90V. But the LED light is designed to be a 3.8 watt unit, and so the Vf value is quite low, at 10.95V. So just to re-confirm, you are saying that the voltage range which the driver is supplying (at a fixed current) is irrelevant, so long as the total wattage of the LED is low enough to be covered by that range? (i.e., this driver I am looking at has the capability to supply up to 30 watts, even though I will only require 3.8 watts)
 

dl324

Joined Mar 30, 2015
16,156
So just to re-confirm, you are saying that the voltage range which the driver is supplying (at a fixed current) is irrelevant, so long as the total wattage of the LED is low enough to be covered by that range? (i.e., this driver I am looking at has the capability to supply up to 30 watts, even though I will only require 3.8 watts)
The voltage range of the LED driver is relevant. It has to be at least as high as the voltage required by your device.

The maximum wattage of the driver is less relevant. If the output voltage and current are sufficient for your device, it can be used.
Is using a driver rated at 350mA with a higher voltage range (for example 50-90V or 45W) going to harm the LED?
If a driver is rated at 45W and has a maximum output voltage of 90V, then it can provide up to
\( P = IV \Rightarrow I = \frac{P}{V} = \frac{45\,\text{W}}{90\,\text{V}} = 500\,\text{mA} \)
 

DickCappels

Joined Aug 21, 2008
10,104
[Attached image: LED forward current vs. forward voltage curve]
More precisely, I am thinking about this curve. The higher the current, the higher the voltage, but for a given LED you will get a certain voltage as a function of current. The actual voltage will wander about a little, mainly as a function of the temperature of the LED, but you can think of the LED as a sort of "soft zener diode". Pick a current and that's your voltage. Take care not to let the LED get too hot (make sure it does not dissipate more power than its heatsinking can handle) and it should be fine.
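As a very rough sketch of why the curve looks the way it does (treating the LED as a single ideal junction and ignoring series resistance and self-heating), the current follows the usual exponential diode law:

\( I \approx I_S \left( e^{\,V_f/(n V_T)} - 1 \right) \)

where \( I_S \) is the saturation current, \( n \) the ideality factor, and \( V_T \approx 26\,\text{mV} \) at room temperature. A small change in voltage gives a large change in current, which is why the LED behaves like a "soft zener" and why you drive it from a current source rather than a voltage source.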
 

Jon Chandler

Joined Jun 12, 2008
852
You need a 3 watt constant current power supply that operates over the specified forward voltage of the LED.

You probably won't find a 3.82 watt supply, but 3 watt supplies are readily available. The difference in wattage won't significantly affect brightness and will provide a longer LED life.

Here's one of many examples on Amazon.

[Attached screenshot: example 3 W constant-current LED driver listing on Amazon]
 

MisterBill2

Joined Jan 23, 2018
16,582
The brightness of an LED is determined by the current. With a 350mA LED, that is the maximum brightness for the average rated lifetime. The current of an LED is determined by the voltage applied, as the graph in post #10 shows.
So an actual constant current regulator will give a constant intensity.
So for a variable intensity you need an adjustable current regulator controlled by a zero to 10 volt programming voltage.
Opamp current sources are shown in a number of application notes available for free, and another opamp to scale the 0 to 10 volt signal to the values needed to control the current source will not be that hard.
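As a rough sketch of the scaling involved (the values here are only illustrative, assuming a low-side sense resistor with the opamp forcing the sense voltage to track the scaled control voltage):

\( I_{LED} = \frac{V_{sense}}{R_{sense}} \), so with \( R_{sense} = 1\,\Omega \) and the 0–10 V signal divided down to 0–0.35 V, full scale gives \( I_{LED} = \frac{0.35\,\text{V}}{1\,\Omega} = 350\,\text{mA} \).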
 

LowQCab

Joined Nov 6, 2012
3,584
Keep in mind that controlling the intensity of an LED by varying the current may result in changes to the CRI (color rendering index) and/or the color profile produced.

It's usually better to control the LED's intensity by using a fixed current and a PWM dimming scheme.
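As a rough rule of thumb for that scheme, with the peak current held fixed the average current (and hence the apparent brightness) scales with the duty cycle, while the color point stays close to that of the full-current operating point:

\( I_{avg} = D \times I_{peak} \), e.g. \( 0.25 \times 350\,\text{mA} \approx 88\,\text{mA} \) at 25 % duty cycle.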

Do you want to build your own circuit, or are you looking for a commercially available box?
 

MisterBill2

Joined Jan 23, 2018
16,582
Certainly PWM is a much more efficient way to control the apparent brightness of an LED; the caution is that it may confuse some analog current regulators that have the wrong response times. So if PWM control is used, I suggest a regulated voltage source with some resistance to limit the current.
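As an illustrative worked example of that approach (assuming a 12 V regulated supply and the 10.95 V / 350 mA LED from the first post; the values are only a sketch):

\( R = \frac{V_{supply} - V_f}{I} = \frac{12\,\text{V} - 10.95\,\text{V}}{0.350\,\text{A}} = 3\,\Omega \), dissipating \( P_R = I^2 R \approx 0.37\,\text{W} \)

With that little headroom the current will be quite sensitive to the LED's Vf tolerance and temperature, so a higher supply voltage with a larger resistor gives a more stable current at the cost of efficiency.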
 

BobTPH

Joined Jun 5, 2013
8,108
So just to re-confirm, you are saying that the voltage range which the driver is supplying (at a fixed current) is irrelevant,
No!

A constant current driver works by adjusting the voltage until it gets the specified current. Forget the wattage. That spec is only the max wattage that it can provide, and that is the current times the max voltage. In the example you give, 350mA from 50 to 90V, the max wattage is 90 x 0.350 = 31.5W. The minimum is 50 x 0.350 = 17.5W.

The voltage range is the range of voltages it can output while keeping the current at 350mA.

Your example device cannot put out 350mA at the 10.9V that your LED needs. What it does at that point is unknown. Perhaps it just puts out the minimum voltage, 50V, and destroys your LED. A better designed one would shut itself off if it can’t regulate the current.

What you need is a constant current driver that puts out 350mA with a voltage range that includes 10.9V.
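Put as a single condition, a CC driver can only regulate when the LED's forward voltage falls inside its output window:

\( V_{out,min} \le V_f \le V_{out,max} \)

Here \( 50\,\text{V} \le 10.95\,\text{V} \) is false, so the 50–90 V unit cannot do it; you want a 350mA driver whose spec sheet lists an output voltage range that brackets roughly 11 V.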
 

Thread Starter

natdf123

Joined Aug 27, 2023
3
[Attached image: LED forward current vs. forward voltage curve]
More precisely, I am thinking about this curve. The higher the current, the higher the voltage, but for a given LED you will get a certain voltage as a function of current. The actual voltage will wander about a little, mainly as a function of the temperature of the LED, but you can think of the LED as a sort of "soft zener diode". Pick a current and that's your voltage. Take care not to let the LED get too hot (make sure it does not dissipate more power than its heatsinking can handle) and it should be fine.
I see. I appreciate that description. In my case I have ample voltage and therefore wattage, and the proper current. So because this is a fixed current design, I need not be worried about "over voltage" as long as my current is set properly and my LED requires fewer volts than the driver provides (less than even the bottom of its voltage range?). I understand most of this but I am unable to understand why there is a voltage "range" in the driver detail, when it appears that the max voltage it can deliver is more relevant.
 

BobTPH

Joined Jun 5, 2013
8,108
I need not be worried about "over voltage" as long as my current is set properly
You cannot set both the current and voltage. The current is set by setting the voltage. There is one and only one voltage that will give you 350mA in your LED. I thought that might be your misunderstanding.

The driver we are discussing can ONLY control the current at 350mA if the voltage at that current is between 50V and 90V, and that is not the case for your LED. Why is that so difficult to understand?

If you refuse to believe me (and the manufacturer), go ahead and buy that driver and see what happens. I hope you have a spare LED.
 

dl324

Joined Mar 30, 2015
16,156
I understand most of this but I am unable to understand why there is a voltage "range" in the driver detail, when it appears that the max voltage it can deliver is more relevant.
The person who wrote the specs for the driver is also confused. What matters is the maximum current and the maximum output voltage. Minimum current would also be useful if it can't be set to zero. But minimum voltage is an extraneous spec.
 

MisterBill2

Joined Jan 23, 2018
16,582
My guess is that the 50 to 90 volts is a supply voltage range, not an output voltage range. If there is any information available about the driver, it should be quite clear what it is. But if it is a typical Amazon offering, the only information is shipping weight and the price. That is why some sources are off the "OK to use" list.
 