LED Drivers - Constant Current vs Constant Voltage

Thread Starter

john2k

Joined Nov 14, 2019
219
I've done quite a bit of reading and I'm still a little confused about the difference between constant-current and constant-voltage drivers.

I'm working with LEDs in a +12 V DC car-battery environment. Let's say I have three high-power LEDs (I don't know their wattage, but I know they're high power because they're mounted on heatsink plates). All three LEDs are connected in parallel. I've managed to run this setup successfully from a 350 mA (1-2 W) constant-current driver with a +12 V DC input for a long time without problems. But a discussion with someone has got me wondering: why a constant-current driver and not constant voltage? It was just by chance that I chose the constant-current driver.

It would be interesting to know. I've also tried running the three LEDs in series with this same driver, and that works fine too. The difference is that in series the driver's output voltage becomes 10-point-something volts, whereas wired in parallel it measures about 3.4 V. I'm under the impression that in parallel the 350 mA is divided equally across the three LEDs, whereas in series it delivers the full 350 mA through each LED and raises the voltage.
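(A quick numeric sketch of those two readings, assuming each LED drops about 3.4 V at its operating current, the value measured above; real parts will vary:)

```python
# Series vs. parallel arithmetic for a 350 mA constant-current driver.
# V_F is an assumed per-LED forward voltage, taken from the ~3.4 V
# reading quoted above.

V_F = 3.4          # assumed forward voltage per LED, volts
I_DRIVER = 0.350   # constant-current driver output, amps
N_LEDS = 3

# Series: the same 350 mA flows through every LED, and the driver must
# supply the sum of the forward voltages.
v_out_series = N_LEDS * V_F           # ~10.2 V, the "10-point-something" reading
i_led_series = I_DRIVER               # 350 mA through each LED

# Parallel: the driver needs only one forward-voltage drop, and the
# 350 mA splits between the branches (equally only if the LEDs match).
v_out_parallel = V_F                  # ~3.4 V
i_led_parallel = I_DRIVER / N_LEDS    # ~117 mA per LED, ideally

print(f"series:   {v_out_series:.1f} V, {i_led_series * 1000:.0f} mA per LED")
print(f"parallel: {v_out_parallel:.1f} V, {i_led_parallel * 1000:.0f} mA per LED")
```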

Just wondering: is constant current the correct driver, or should I really have gone for constant voltage?

Thanks
 

crutschow

Joined Mar 14, 2008
34,201
LEDs are current-operated devices: they have a very low dynamic (incremental) impedance once their operating voltage is reached, so they need either a constant-current source or a series resistor to limit the current when run from a voltage source.
LEDs are never operated directly from a pure voltage source.
Of course, LEDs do drop a voltage (that's how the light power is created) of 2-4 V, depending on the type and color.

If you put them in series, then obviously they all carry the same current, but the voltage drop will be the sum of the individual LEDs' drops.
In parallel operation, manufacturing differences can cause a slight variation in voltage drop, which in turn causes a difference in the current each LED carries. This can produce a noticeable difference in their brightness.
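To make that concrete, here is a minimal sketch using a crude linearized LED model: above a knee voltage, current rises as (V - Vknee) / r_dyn, where r_dyn is the small dynamic resistance mentioned above. The knee voltages and resistance below are illustrative assumptions, not measurements:

```python
# Why parallel LEDs share current unevenly: three "identical" LEDs with
# slightly different knee voltages, fed from one constant-current source.

I_TOTAL = 0.350                # constant-current driver output, amps
V_KNEE = [3.30, 3.35, 3.40]    # assumed knee voltages, volts (50 mV spread)
R_DYN = 1.0                    # assumed dynamic resistance per LED, ohms

# All branches sit at the same node voltage V. Solving
#   sum((V - vk) / R_DYN for vk in V_KNEE) == I_TOTAL   for V gives:
v_node = (I_TOTAL * R_DYN + sum(V_KNEE)) / len(V_KNEE)

for n, vk in enumerate(V_KNEE, start=1):
    i_branch = (v_node - vk) / R_DYN
    print(f"LED {n}: knee {vk:.2f} V -> {i_branch * 1000:.0f} mA")

# Output: roughly 167, 117, and 67 mA. A 50 mV difference in knee voltage
# shifts about 50 mA between branches here, hence the visible brightness
# differences.
```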
 

Hugh Riddle

Joined Jun 12, 2020
78
Just possibly relevant: I've been using a 20-LED string of white LEDs (intended for weddings and the like) which are wired in parallel, but each appears to have about 400 Ω in series. They deliver high and impressively equal brightness on a 2.7 V supply and consume 50 mA in total.
 

dendad

Joined Feb 20, 2016
4,451
If the LEDs have series resistors, that is the usual setup for running from a constant-voltage supply: the resistors set the current.
But bare LEDs by themselves need to run from a constant-current supply.
The series resistors produce a reasonably constant current from the constant-voltage supply.
Never run an LED without current control of some sort.
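A minimal sketch of that resistor sizing, assuming a single white LED with a forward voltage of about 3.2 V; the function and values are just for illustration:

```python
# Pick a series resistor so a constant-voltage supply delivers a chosen
# LED current: the resistor drops the headroom (supply minus Vf).

def series_resistor(v_supply: float, v_forward: float, i_target: float) -> float:
    """Series resistance (ohms) for the target LED current."""
    headroom = v_supply - v_forward
    if headroom <= 0:
        raise ValueError("supply must exceed the LED forward voltage")
    return headroom / i_target

# Example: one white LED (assumed Vf ~ 3.2 V) at 20 mA from a 12 V supply.
r = series_resistor(12.0, 3.2, 0.020)
print(f"{r:.0f} ohms")   # 440 ohms; round to the nearest standard value
```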
 

Audioguru again

Joined Oct 21, 2019
6,647
Three LEDs in parallel fed from a constant-current source look almost as bright as the same three LEDs in series, where each LED carries the much higher current, because our vision has a wide range of sensitivity and the iris in each eye adjusts to the brightness.

A blinking LED with a pulse duration of less than about 30 ms appears dimmed. That is why pulse-width modulation can be used as a dimmer.
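A small sketch of the PWM-dimming arithmetic, with an assumed 1 kHz switching frequency and the 350 mA drive current mentioned earlier in the thread:

```python
# PWM dimming: drive the LED at full current, but switch it on for only a
# fraction (the duty cycle) of each period. The eye averages the flicker.

I_ON = 0.350    # full drive current while the pulse is high, amps
F_PWM = 1000    # assumed switching frequency, Hz

period_ms = 1000 / F_PWM   # 1 ms, far below the ~30 ms the eye can resolve

for duty in (1.0, 0.5, 0.25, 0.1):
    i_avg = duty * I_ON
    print(f"duty {duty:4.0%}: average current {i_avg * 1000:5.1f} mA")

# Because each 1 ms on-pulse is much shorter than ~30 ms, the LED looks
# steadily dimmer instead of blinking.
```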
 

Hugh Riddle

Joined Jun 12, 2020
78
The perception of the three paralleled LEDs is interesting. The iris presumably adapts to the brightness of a particular LED when you look closely at it, but not when you gaze at all three of them.

I've had interesting surprises while trying to make the brightness of a chain of LEDs appear to pulse smoothly over a second or two, as when a lighthouse beam sweeps past. For instance, the brightness produced by a triangular LED drive-current pulse can appear as a sudden switch-on to quite high brightness, followed by a modest rise to a smooth-seeming peak. The drive pulse is time-symmetrical, but I perceive the falling section as longer than the rising one. Presumably this is mostly down to the eye trying to track the fast-changing brightness and sometimes falling short, with the brain coming to its own conclusions.
 

Tonyr1084

Joined Sep 24, 2015
7,829
To directly address your question about the difference between constant current and constant voltage, the answer is simply this: in constant-current mode the voltage is varied in order to maintain a constant current; in constant-voltage mode the voltage is held fixed and the current varies with the load.

Let's examine a single LED fed from a constant-current source. Suppose you have a red LED and want to drive it. The LED typically has a forward voltage of 1.8 V and operates at currents up to 20 mA. By setting the supply to deliver 20 mA, it will provide whatever voltage is needed to light that LED at 20 mA, constantly. Hence, constant current.

On the other hand, suppose you have the same LED but the supply is set for constant voltage. If you set it to 1.8 V, that's just enough to light the LED, but you have no current regulation: with essentially no headroom, any series resistor would have to be tiny (on the order of 1 Ω) and would barely regulate at all. That's why it's not practical to drive an LED from a bare 1.8 V source; LEDs are typically driven from a higher voltage, with a series resistor taking up the difference.

Remember, we want 20 mA through our LED, so we have to account for all the voltage drops. Suppose you use a 5 V source. The forward voltage (Vf) is 1.8 V, so the resistor sees 5 V - 1.8 V = 3.2 V, and you want 20 mA (0.02 A): 3.2 V ÷ 0.02 A = 160 Ω. You would place a 160 Ω resistor in series with the LED. This way your LED is driven at 20 mA as long as your supply stays at 5 V DC. But if you're running from batteries, the voltage drops off over time; if it falls to 4 V, you're no longer getting 20 mA but rather (4 V - 1.8 V) ÷ 160 Ω = 13.75 mA. Since a battery is not a constant-voltage source but the resistor-plus-LED is a fixed load, the current changes with the available voltage.
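The same arithmetic as a small sketch, using the numbers from the post (5 V supply, red LED with Vf = 1.8 V, 20 mA target):

```python
# Series-resistor sizing and the current droop as a battery sags.

V_F = 1.8          # red LED forward voltage, volts
I_TARGET = 0.020   # desired LED current, amps

def led_current(v_supply: float, r_series: float) -> float:
    """Current through the LED + series resistor at a given supply voltage."""
    return (v_supply - V_F) / r_series

r = (5.0 - V_F) / I_TARGET                             # (5 - 1.8) / 0.02 = 160 ohms
print(f"resistor: {r:.0f} ohms")
print(f"at 5 V: {led_current(5.0, r) * 1000:.2f} mA")  # 20.00 mA
print(f"at 4 V: {led_current(4.0, r) * 1000:.2f} mA")  # 13.75 mA as the battery sags
```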

That's a lot of words to explain it; sorry, it can be confusing. Simply put, driving an LED from a constant-voltage source requires a specific resistor, chosen from the voltage available and the desired current through the LED. A constant-current source, on the other hand, can drive an LED at a specific current "constantly" because the load (in theory) does not change. In this case the load is the LED.
 