LED Driver voltage question

Thread Starter

steveparrott

Joined Feb 14, 2006
36
Trying to understand an aspect of LED technology. There are many bright LED lamps on the market that incorporate the optics, chips, board, driver and heat sink into a single module. I have a question about the driver.

My understanding of how drivers work is that they convert a supply voltage (say, 12 volts AC) into the proper forward voltage and current (amps) according to the LED specs. These drivers also regulate the current, so (according to the manufacturers) the source voltage can be anywhere between 9 volts and 15 volts.

It seems to me that any excess voltage (over the nominal 12v rating) must be converted to heat. Heat is a big issue for these newer bright white LEDs, and it is especially important in outdoor lighting where designers may daisy-chain the fixtures, with 15v at the first fixture and 9v at the last. If the 15v fixture runs hotter, it will lose brightness sooner, eventually leading to a poor display of mixed-brightness lighting.

Am I correct in thinking that sourcing an LED lamp at 15v will put more heat into the module compared to a 12v source?

And, if that's true can anyone direct me to a formula where I might predict the amount of excess heat produced? Even a ballpark percentage would be useful.

Thanks.
 

SgtWookie

Joined Jul 17, 2007
22,230
Hi Steve
My understanding of how drivers work is that they convert a supply voltage (say, 12 volts AC) into the proper forward voltage and current (amps) according to the LED specs. These drivers also regulate the current, so (according to the manufacturers) the source voltage can be anywhere between 9 volts and 15 volts.

It seems to me that any excess voltage (over the nominal 12v rating) must be converted to heat. Heat is a big issue for these newer bright white LEDs, and it is especially important in outdoor lighting where designers may daisy-chain the fixtures, with 15v at the first fixture and 9v at the last. If the 15v fixture runs hotter, it will lose brightness sooner, eventually leading to a poor display of mixed-brightness lighting.
LEDs need to have their current controlled. An LED manufacturer will generally give a maximum continuous current specification, along with a typical Vf (forward voltage) and a maximum Vf. The "typical Vf" is an average of a statistical sample; your mileage will vary. I have found the Vf of LEDs (even from the same lot) to vary by over 10% when supplied with the same current. That is why you should never wire LEDs in parallel without a current-limiting resistor in series with each one. The Vf of an LED changes very little over a broad current range, much like a Zener diode.
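To put rough numbers on that, here is a quick Python sketch (the supply, Vf and target current are assumed values for illustration, not from any datasheet) showing how a series resistor swamps a 10% Vf spread:

# Assumed numbers only: a series resistor swamps the Vf spread.
v_supply = 12.0       # supply voltage, assumed DC for simplicity
vf_nom = 3.2          # nominal white-LED forward voltage (assumed)
i_target = 0.020      # 20 mA target current (assumed)

r = (v_supply - vf_nom) / i_target   # ballast resistor, about 440 ohms

for vf in (vf_nom * 0.9, vf_nom, vf_nom * 1.1):   # +/-10% Vf spread
    i = (v_supply - vf) / r
    print(f"Vf = {vf:.2f} V -> I = {i * 1000:.1f} mA")

# A 10% swing in Vf only moves the current by about 4% here, because
# the resistor drops most of the supply voltage.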

Am I correct in thinking that sourcing an LED lamp at 15v will put more heat into the module compared to a 12v source?
It depends upon the technology used in the module. If it is a "buck" converter, it will be very efficient (as high as 97%), so very little heat is generated and very little power is lost. If it is an older "linear" style regulator or a simple fixed resistor, all of the excess voltage times the LED current will be dissipated as heat.
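To make the difference concrete, here is a rough Python sketch. Only the 97% figure comes from above; the supply voltage, LED string voltage and current are assumed for illustration:

# Rough comparison with assumed LED numbers: buck converter vs. linear regulator heat.
v_supply = 15.0      # DC supply into the driver (assumed)
vf_total = 9.6       # total forward voltage of the LED string (assumed)
i_led = 0.35         # regulated LED current in amps (assumed)

p_led = vf_total * i_led                        # power actually delivered to the LEDs

eff = 0.97                                      # buck efficiency quoted above
p_heat_buck = p_led * (1 / eff - 1)             # loss inside the buck converter

p_heat_linear = (v_supply - vf_total) * i_led   # everything above Vf is burned off

print(f"LED power:            {p_led:.2f} W")
print(f"Buck converter heat:  {p_heat_buck:.2f} W")
print(f"Linear/resistor heat: {p_heat_linear:.2f} W")

With these assumed numbers the buck converter wastes about 0.1 W while a linear regulator or resistor wastes nearly 2 W on the same 15 V supply.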

And, if that's true can anyone direct me to a formula where I might predict the amount of excess heat produced? Even a ballpark percentage would be useful.
If it is a buck DC-DC converter/regulator, you would have to consult the documentation for the particular unit.

If it is a linear regulator or a simple fixed resistor, just calculate the power dropped across it:
P = (Vsupply - Vftot(LED)) x I(LEDsupply)
where:
P = power in Watts
Vsupply = the DC voltage being supplied to the regulator. If the supply is specified in VAC, divide the RMS value by .707107 (giving the approximate peak, i.e. the rectified and filtered DC level, ignoring rectifier diode drops) before subtracting Vftot(LED)
Vftot(LED) = The total of the forward voltages of all LEDs in the string.
I(LEDsupply) = The current being supplied to the LEDs.
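
As a worked example of the same formula (the LED string voltage and current are assumed for illustration), swept over a 9 to 15 volt AC supply range in Python:

# The formula above, with assumed LED numbers, swept over the 9-15 VAC range.
def linear_heat_w(v_supply_dc, vf_total, i_led):
    """Power dissipated in a linear regulator or series resistor, in watts."""
    return (v_supply_dc - vf_total) * i_led

vf_total = 9.6    # total Vf of the LED string (assumed)
i_led = 0.35      # LED current in amps (assumed)

for vac in (9.0, 12.0, 15.0):
    v_dc = vac / 0.707107   # approximate rectified/filtered DC, ignoring diode drops
    p = linear_heat_w(v_dc, vf_total, i_led)
    print(f"{vac:4.1f} VAC -> {v_dc:5.2f} VDC -> {p:.2f} W of heat")

With a linear driver and these assumed numbers, the 15 volt source wastes several times the heat of the 9 volt source, which is the effect the original question is about.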
 

Thread Starter

steveparrott

Joined Feb 14, 2006
36
Thanks for the info. I went to those links and learned a lot.

From looking at driver efficiency charts based on supply voltage (http://www.maxim-ic.com/appnotes.cfm/appnote_number/3532), it seems there could be an efficiency loss of about 4% when comparing a 15 volt supply to a 9 volt supply (using an inductive buck-boost type).

Then, looking at a Lumileds junction temperature vs. lifetime chart (http://www.lumileds.com/technology/lumenmaintenance.cfm), if I assume this 4% loss roughly corresponds to a 4% gain in heat, I can make the following guess: if the LED junction temperature is well below the critical value, the added 4% heat would have little or no effect on lifetime; if, however, the junction temperature is at or above the critical value, the added 4% could result in about a 30% drop in lifetime.
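
Putting rough numbers on it (all of these values are assumed, just to see the scale):

# Back-of-envelope, all numbers assumed: extra heat from ~4% lower efficiency.
p_led = 3.0                     # power delivered to the LEDs, watts (assumed)
eff_9v, eff_15v = 0.90, 0.86    # driver efficiency at the supply extremes (assumed, ~4% apart)

heat_9v = p_led * (1 / eff_9v - 1)
heat_15v = p_led * (1 / eff_15v - 1)
extra = heat_15v - heat_9v

r_theta = 10.0                  # module-to-ambient thermal resistance, degC/W (assumed)
print(f"Driver heat at 9v:  {heat_9v:.2f} W")
print(f"Driver heat at 15v: {heat_15v:.2f} W")
print(f"Extra heat: {extra:.2f} W, roughly {extra * r_theta:.1f} degC warmer if the whole module shares one heat sink")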

Am I making any seriously wrong assumptions in my rough analysis?
 

SgtWookie

Joined Jul 17, 2007
22,230
No, now you're talking about two different devices.
The 4% efficiency loss is in the inductive buck/boost DC-DC converter; 96% efficiency is pretty darn good. That 4% loss will be dissipated as heat inside the buck/boost DC-DC converter, not the LEDs. Unless the LEDs and the buck/boost converter are in the same enclosure, it will have no effect on the LEDs.
 

Thread Starter

steveparrott

Joined Feb 14, 2006
36
Thanks, but that's the whole concern: the driver is in the same enclosure as the LEDs. These are very compact modules, and manufacturers are pushing the currents to the limit so they can achieve lumens equivalent to tungsten halogen lamps.
 

Dipyaman

Joined Apr 11, 2012
1
I have two 5 m LED strips. They are SMD3528 strips with 60 LEDs/m, driven by 12V DC, with 0.08A on each LED.
I want to connect both strips in series to a DC power supply. Now my questions are:

1. Can I use a 24V DC supply for this 12V strip (2x5m in series) and still get proper results without damaging the strip?

2. Can I use a 12V DC supply for this 10m strip (2x5m in series) and still get proper brightness from each LED on it?

Many thanks in advance for the guidance.

Dipyaman
 