Limiting current on a high-powered LED

Thread Starter

h2opolo

Joined Aug 2, 2013
38

shteii01

Joined Feb 19, 2010
4,644
I lack the experience for a sure answer.

The LED you linked is 30 volts, 100 watts. 100 watts / 30 volts = 3.33 amperes.
It seems to me that if you have a power supply that can deliver 30 volts and 3 amperes, then you pretty much meet that LED's specification, and you don't need a resistor because you are supplying the "exact" current the LED needs.

Generally speaking, we use a resistor to control the amount of current the LED will receive. So if the LED receives the current it needs directly from the power supply, then there is no need for a resistor.
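
For what it's worth, that arithmetic as a quick Python sketch (the 30 V / 100 W figures come from the listing; the rest is just power = voltage × current rearranged):

# Nominal current of a 100 W, 30 V LED module at its rated operating point.
rated_power_w = 100.0      # from the listing
forward_voltage_v = 30.0   # from the listing

rated_current_a = rated_power_w / forward_voltage_v
print(f"rated current ~ {rated_current_a:.2f} A")   # ~3.33 A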
 

Thread Starter

h2opolo

Joined Aug 2, 2013
38
I lack the experience for a sure answer.

If the LED receives the current it needs directly from the power supply, then there is no need for a resistor.
I don't know if I agree with the above statement. Would you mind clarifying that for me? This power supply isn't current regulated; it supplies a constant voltage. The relationship between current and voltage in an LED is not linear. As voltage increases, current through the LED increases exponentially. This makes it difficult to regulate the current through an LED simply by adjusting the voltage. A small change in voltage can easily fry the LED. This is why a resistor is used in series with an LED. This power supply can exceed the current limitations of the LED, so why isn't a resistor used?
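
To make the "exponential" point concrete, here is a minimal sketch using the ideal Shockley diode equation. The saturation current and emission coefficient are made-up illustrative values, not data for this particular module; the point is only the ratio, i.e. how much the current multiplies for a small voltage step. For the 30 V module, which is roughly ten such junctions in series, the same jump happens over about a 1 V change at the terminals.

import math

# Ideal Shockley diode equation for a single ~3 V white LED junction:
#   I = Is * (exp(V / (n * Vt)) - 1)
# Is and n are assumed illustrative values, NOT datasheet values for this LED.
Is = 2e-26      # saturation current, A (assumed)
n = 2.0         # emission coefficient (assumed)
Vt = 0.02585    # thermal voltage at ~25 C, V

def diode_current(v):
    return Is * (math.exp(v / (n * Vt)) - 1)

# Each 0.1 V step multiplies the current by roughly exp(0.1 / (n * Vt)), about 7x.
for v in (2.9, 3.0, 3.1):
    print(f"V = {v:.1f} V -> I = {diode_current(v):.2f} A")   # ~0.05 A, ~0.32 A, ~2.25 A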
 

shteii01

Joined Feb 19, 2010
4,644
I don't know if I agree with the above statement. Would you mind clarifying that for me? This power supply isn't current regulated; it supplies a constant voltage. The relationship between current and voltage in an LED is not linear. As voltage increases, current through the LED increases exponentially. This makes it difficult to regulate the current through an LED simply by adjusting the voltage. A small change in voltage can easily fry the LED. This is why a resistor is used in series with an LED. This power supply can exceed the current limitations of the LED, so why isn't a resistor used?
Eh? Are you sure?
An LED is a diode. The way a diode works is that it takes a specific voltage to turn it "on", to start conducting current. There is no point in increasing the voltage.

Like I said, I lack the experience, but here is what I expect to happen. Once you apply 30 volts to that LED, you turn the diode on, and then the LED will draw whatever current it needs.
 

crutschow

Joined Mar 14, 2008
34,470
Yes, LEDs need some way to control the current, either a resistor in series or a constant-current power supply (the most efficient way).
It would seem to be bad form to connect the power supply you posted directly to the LED you posted since a small change in voltage would likely result in a large change in current.
I wouldn't do that.
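
For completeness, here is how the series-resistor option would typically be sized if you insisted on a plain constant-voltage supply. The 36 V supply setting and 3 A target are assumptions for illustration, not values from either listing:

# Sizing a series ballast resistor for an LED module on a constant-voltage supply.
supply_v = 36.0          # assumed constant-voltage setting
led_vf = 30.0            # nominal forward voltage of the module (from the listing)
target_current_a = 3.0   # a bit under the 3.3 A rating, for margin

r_series = (supply_v - led_vf) / target_current_a     # resistor drops the excess voltage
p_resistor = (supply_v - led_vf) * target_current_a   # power burned in the resistor

print(f"R ~ {r_series:.1f} ohm, dissipating ~ {p_resistor:.0f} W")   # ~2 ohm, ~18 W

That roughly 18 W of waste heat is exactly why the constant-current supply is called the more efficient option.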
 

Thread Starter

h2opolo

Joined Aug 2, 2013
38
Eh? Are you sure?
An LED is a diode. The way a diode works is that it takes a specific voltage to turn it "on", to start conducting current. There is no point in increasing the voltage.

Like I said, I lack the experience, but here is what I expect to happen. Once you apply 30 volts to that LED, you turn the diode on, and then the LED will draw whatever current it needs.
I am sure that as voltage increases across an LED, the current increases exponentially. The large change in current for a small change in voltage makes it difficult to control an LED with voltage. You can easily burn an LED out by varying the voltage only slightly, which is why we use resistors. I just don't understand why a resistor isn't used in this situation.
 

Thread Starter

h2opolo

Joined Aug 2, 2013
38
Yes, LEDs need some way to control the current, either a resistor in series or a constant-current power supply (the most efficient way).
It would seem to be bad form to connect the power supply you posted directly to the LED you posted since a small change in voltage would likely result in a large change in current.
I wouldn't do that.
Ok that makes sense. It looks like this power supply has a current regulator.

https://www.amazon.com/DROK-Convert...F8&qid=1467182576&sr=1-5&keywords=dc+dc+boost

Do you think a current-controlled source is better, or a resistor in series?
 

Techno Tronix

Joined Jan 10, 2015
139
I guess there are two reasons. First, a resistor wastes energy because it converts electrical energy into heat; second, leaving it out reduces the component count, which saves space.
 

dannyf

Joined Sep 13, 2015
2,197
This may help you understand it.

https://dannyelectronics.wordpress.com/2016/05/24/repurposing-smps-modules-as-led-drivers/

I wrote it to talk about ways to turn an SMPS power supply into a constant-current driver for power LEDs. Essentially, you sense the current through the LEDs and adjust the SMPS output voltage to achieve the desired current.

In my case, the current sensing is done via a small resistor.
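
As a rough sketch of that idea (the numbers here are hypothetical and not taken from the linked write-up): pick a small sense resistor, measure the voltage across it, and nudge the supply's output until that voltage corresponds to the current you want.

# Constant-current control by sensing the LED current with a small series resistor.
# All values are hypothetical illustrations, not the circuit from the blog post.
r_sense = 0.1            # ohms; kept small so it wastes little power
target_current_a = 3.0   # desired LED current

target_sense_v = target_current_a * r_sense       # 0.3 V across the sense resistor
sense_loss_w = target_current_a ** 2 * r_sense    # only ~0.9 W lost in sensing

def adjust_output(measured_sense_v, output_v, step=0.05):
    # One iteration of the feedback idea: raise or lower the supply voltage
    # until the sensed voltage (i.e. the LED current) reaches the target.
    if measured_sense_v < target_sense_v:
        return output_v + step
    if measured_sense_v > target_sense_v:
        return output_v - step
    return output_v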

This notion that "you have to have a series resistor with an LED" reflects nothing but a lack of understanding of the underlying reason for that resistor: to limit current. If you have other ways to limit the current, the resistor is not needed.

You will find that there are far more LEDs running without resistors than with them.

Hope it helps.
 

AnalogKid

Joined Aug 1, 2013
11,056
An LED is a diode. The way a diode works is that it takes a specific voltage to turn it "on", to start conducting current. There is no point in increasing the voltage.
Like I said, I lack the experience, but here is what I expect to happen. Once you apply 30 volts to that LED, you turn the diode on, and then the LED will draw whatever current it needs.
Nope. First, while a single LED behaves very much like a single rectifier diode with an extra-high Vf, a 30 V 3 A device is an array of many diodes, and depending on how it is designed internally, might appear to the outside world as a single diode with Vf = 30 V. If so, then it most certainly does *not* "draw whatever current it needs." It has no current limiting, so just like a rectifier or signal diode, it will burn up if its current is not limited by external circuit conditions. Second, as with all diodes, the voltage across an LED or a large array of LEDs increases as the forward current increases. To correct a common misconception: from the start of conduction at low currents to its max rated current, the forward voltage of a 1N4004 rectifier increases by over 100%.
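
A quick way to see that second point numerically is the diode equation plus a bit of bulk resistance. The parameters below are rough assumptions for a small rectifier, not measured 1N4004 data:

import math

# Forward voltage rises roughly logarithmically with current, plus an ohmic term
# from bulk resistance. Parameters are rough assumptions, not measured values.
Is = 1e-9      # saturation current, A (assumed)
n = 1.8        # emission coefficient (assumed)
Vt = 0.02585   # thermal voltage at ~25 C, V
Rs = 0.12      # bulk series resistance, ohms (assumed)

def vf(i):
    return n * Vt * math.log(i / Is + 1) + i * Rs

print(f"Vf at 100 uA: {vf(1e-4):.2f} V")   # ~0.54 V near the start of conduction
print(f"Vf at 1 A   : {vf(1.0):.2f} V")    # ~1.08 V, i.e. roughly a 100% increase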

ak
 

merts

Joined Apr 1, 2016
8
To the best of my knowledge, an LED bulb has a built-in resistor to limit its current consumption.
A regular LED does not, and will conduct all the current that is delivered to it.
If that is more than the rating of the diode, it will be destroyed.
 

shteii01

Joined Feb 19, 2010
4,644
Nope. First, while a single LED behaves very much like a single rectifier diode with an extra-high Vf, a 30 V 3 A device is an array of many diodes, and depending on how it is designed internally, might appear to the outside world as a single diode with Vf = 30 V. If so, then it most certainly does *not* "draw whatever current it needs." It has no current limiting, so just like a rectifier or signal diode, it will burn up if its current is not limited by external circuit conditions. Second, as with all diodes, the voltage across an LED or a large array of LEDs increases as the forward current increases. To correct a common misconception: from the start of conduction at low currents to its max rated current, the forward voltage of a 1N4004 rectifier increases by over 100%.

ak
Thank you. I am educated.
 

Bernard

Joined Aug 7, 2008
5,784
" LED bulb ", post # 12, very likely has a resistor built in & a bunch of other goodies to make a SMPS
constant current driver operating on some standard V like 120 V 60 HZ.
 

eLabElec

Joined Aug 27, 2015
2
If I am looking at the correct schematic from the first link the OP posted, T2 and R2 form a current-limit circuit. When enough current flows through R2 to create a Vbe of ~0.7 V on T2, T2 pulls the base of T1 toward ground and starts to turn T1 off. So the current limit is approximately Ilimit = 0.7 V / R2 (with R2 in ohms).
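
Plugging numbers into that relationship (the R2 value below is an assumption for illustration; check the actual schematic for the real value):

# Two-transistor current limiter: Ilimit ~ Vbe / R2.
vbe = 0.7     # volts, typical base-emitter turn-on voltage
r2 = 0.22     # ohms; an assumed example value, not necessarily what the Instructable uses

i_limit = vbe / r2             # ~3.2 A limit
p_r2 = i_limit ** 2 * r2       # ~2.2 W dissipated in R2 at the limit

print(f"current limit ~ {i_limit:.1f} A, R2 dissipates ~ {p_r2:.1f} W")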
www.elabelectronics.com
 

apqo1

Joined Oct 5, 2008
52
To answer the OP's question, a resistor is not used because the Instructable uses the two-transistor constant current circuit to regulate current through the LED module. If that circuit were absent, and the same constant-voltage power supply were used, a resistor would be required.

WRT post #14 above, there are no resistors integrated in the LED package. It's an array of 100 nominally 1-watt LED chips arranged as ten parallel strings of ten chips in series. Ten series LEDs at Vf = 3V each account for the Vf = 30V for the module. Ten parallel strings drawing 330mA each account for the 3.3A total current. If too high a voltage is applied, the module will draw much higher current, rapidly overheat and self-destruct.
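
Those figures tie together neatly; a quick sanity check using only the numbers quoted above:

# Sanity check on the 100 W module's internal arrangement described above.
chips_per_string = 10
strings = 10
vf_per_chip = 3.0     # volts, nominal
i_per_string = 0.33   # amps, nominal

module_vf = chips_per_string * vf_per_chip    # 30 V
module_current = strings * i_per_string       # 3.3 A
module_power = module_vf * module_current     # ~99 W, i.e. the "100 W" rating

print(f"{module_vf:.0f} V, {module_current:.1f} A, {module_power:.0f} W")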

The power supply linked in OP's first post is inappropriate for driving LEDs, as it cannot regulate current. The second linked supply, with a constant current control, is a better choice, and will permit experimenting with many different LEDs by adjusting its current output. As stated above, the two-transistor circuit in the linked Instructable is a simple constant current regulator that is necessary to overcome the first power supply's lack of current control. It's unnecessary if a current controlled supply is used.

If you experiment with these high power LEDs, here are a few tips:

1. Protect your eyes. These things are astonishingly bright and looking at them can do permanent retinal damage. Really, don't blow this off. Best case, you'll have spots in your vision for hours or days. Worst case, permanently.

2. Set the power supply to a low constant current before you connect the LED in circuit, then sneak up on the desired operating current. If you mistakenly turn it on with the supply set at 40V 12A, your LED is gone in a bright flash and a puff of smoke.

3. It's not necessary to run these LED modules at their full rated power to get lots of light from them. You'll detect very little difference in light output between half power and full power, but your heat load will be greatly reduced at lower current. I've done several lighting projects using 10W LED strips, and I've always run them at 300-350mA instead of their rated 720mA. The difference in light output is minuscule, and the LEDs run nice and cool with a small piece of 1" x 1/8" aluminum bar as a heatsink.

Speaking of heatsinks...

4. Don't forget heatsinking. LEDs are very efficient compared to earlier lighting technologies, but they still convert the majority of consumed power into heat. A surplus computer CPU heatsink with attached fan is a good choice, but monitor temperature until you're sure it's stable.
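
As a rough sanity check on the heat problem in tip 4 (the 70% heat fraction and the thermal resistance are assumed ballpark figures, not measured values):

# Rough temperature-rise estimate for a heatsinked 100 W LED module.
electrical_power_w = 100.0
heat_fraction = 0.7           # assume ~70% of the input power ends up as heat
theta_sink_c_per_w = 0.5      # assumed heatsink-plus-fan thermal resistance, C/W
ambient_c = 25.0

heat_w = electrical_power_w * heat_fraction
sink_temp_c = ambient_c + heat_w * theta_sink_c_per_w   # ~60 C at the heatsink

print(f"~{heat_w:.0f} W of heat -> heatsink at roughly {sink_temp_c:.0f} C")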
 

tonyStewart

Joined May 8, 2012
131
http://www.instructables.com/id/100w-Led/

I have seen many people use this LED and power supply together. I was wondering if someone could explain why a resistor in series with the LED is not used. Is the power supply just maxing out? Would a resistor be beneficial?

Here is the LED:
https://www.amazon.com/LOHAS®-White-Power-Energy-Saving/dp/B00CZ75TWA/ref=sr_1_1?ie=UTF8&qid=1467177481&sr=8-1&keywords=100+watt+led+chip

Here is the power supply:
https://www.amazon.com/Geeetech-Con...g_3?ie=UTF8&psc=1&refRID=8P7BX5QY1DKS5RMZPS2X
A resistor is not necessary, but a great heatsink, such as a fan-powered CPU cooler or a larger passive cooler, is critical with 100 W in a dollar-coin-sized chip. I'll explain where your built-in series resistance comes from.

There is a risk of thermal runaway only when poor heat dissipation causes the temperature to rise. The LED has a threshold voltage and a bulk resistance I call ESR, such that when it is on, the junction is saturated and the voltage rise is linear with current well above the threshold current of, say, 10% of the max current at 25°C.

However, all diodes have an NTC characteristic, a -ΔVf vs. +ΔT [°C], called the Shockley effect. This diode array has all of its voltage rise due to this bulk ESR, so while it gets hotter the internal junction voltage drops slightly, but the voltage rise is just the heat loss in the internal diode resistance. If the junction threshold voltage (like a diode thermometer) drops at only a fraction of the rise of the external adjustable CV, then it will settle at a stable operating point.

This depends on the thermal resistance in [°C/W], the NTC in [mV/°C], and the ESR of the LED array, ΔV/ΔI, which happens to be related to the max wattage rating of the chips, such that if we model it as ten 10 W LED arrays in series, I know it will have an ESR of 10 × 1/(10 W) = 1 Ω at max power.

If we assume the PSU has a load regulation of 0.5% at 35 V and is rated for 150 W, that translates to a linear load of 8.2 Ω, so 0.5% gives an effective series R (ESR) of 42 mΩ, which we can neglect even if it were 1%.
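
Spelling out that arithmetic (the 35 V / 150 W / 0.5% figures are the assumptions stated above):

# Effective source resistance implied by the PSU's load-regulation spec.
v_out = 35.0        # volts (assumed above)
p_rated = 150.0     # watts (assumed above)
load_reg = 0.005    # 0.5% load regulation (assumed above)

r_load_full = v_out ** 2 / p_rated    # ~8.2 ohm equivalent full-load resistance
r_source = load_reg * r_load_full     # ~0.041 ohm, the ~42 mOhm quoted above

# Compare with the LED string's internal ESR estimated above (~1 ohm):
print(f"PSU ESR ~ {r_source * 1000:.0f} mOhm vs LED string ESR ~ 1000 mOhm")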

Thus the LED string at 30-34 V has enough internal resistance to compensate for a large rise in temperature, say 60-85°C worst case.

If the LED array were 100 W at 3 V, it would be a problem, since the LED ESR would now be 1/(100 W) or 10 mΩ.

So all you need to do is adjust the PSU voltage to reach a stable operating temperature that won't quickly burn your finger. The better you design the heatsink, or the cooler its operation, the more power you can drive it with: perhaps 100 W with a good heatsink, only 50 W with a poor one, and maybe 120 W with liquid CPU cooling.

Tony Stewart
EE since 1975
 

EM Fields

Joined Jun 8, 2016
583
A resistor is not necessary, but a great heatsink, such as a fan-powered CPU cooler or a larger passive cooler, is critical with 100 W in a dollar-coin-sized chip. I'll explain where your built-in series resistance comes from.

There is a risk of thermal runaway only when poor heat dissipation causes the temperature to rise. The LED has a threshold voltage and a bulk resistance I call ESR, such that when it is on, the junction is saturated and the voltage rise is linear with current well above the threshold current of, say, 10% of the max current at 25°C.

However, all diodes have an NTC characteristic, a -ΔVf vs. +ΔT [°C], called the Shockley effect. This diode array has all of its voltage rise due to this bulk ESR, so while it gets hotter the internal junction voltage drops slightly, but the voltage rise is just the heat loss in the internal diode resistance. If the junction threshold voltage (like a diode thermometer) drops at only a fraction of the rise of the external adjustable CV, then it will settle at a stable operating point.

This depends on the thermal resistance in [°C/W], the NTC in [mV/°C], and the ESR of the LED array, ΔV/ΔI, which happens to be related to the max wattage rating of the chips, such that if we model it as ten 10 W LED arrays in series, I know it will have an ESR of 10 × 1/(10 W) = 1 Ω at max power.

If we assume the PSU has a load regulation of 0.5% at 35 V and is rated for 150 W, that translates to a linear load of 8.2 Ω, so 0.5% gives an effective series R (ESR) of 42 mΩ, which we can neglect even if it were 1%.

Thus the LED string at 30-34 V has enough internal resistance to compensate for a large rise in temperature, say 60-85°C worst case.

If the LED array were 100 W at 3 V, it would be a problem, since the LED ESR would now be 1/(100 W) or 10 mΩ.

So all you need to do is adjust the PSU voltage to reach a stable operating temperature that won't quickly burn your finger. The better you design the heatsink, or the cooler its operation, the more power you can drive it with: perhaps 100 W with a good heatsink, only 50 W with a poor one, and maybe 120 W with liquid CPU cooling.

Tony Stewart
EE since 1975
So you're advocating using a raw voltage source to drive the array?
 

Plamen

Joined Mar 29, 2015
101
http://www.instructables.com/id/100w-Led/

I have seen many people use this LED and power supply together. I was wondering if someone could explain why a resistor in series with the LED is not used. Is the power supply just maxing out? Would a resistor be beneficial?

Here is the LED:
https://www.amazon.com/LOHAS®-White-Power-Energy-Saving/dp/B00CZ75TWA/ref=sr_1_1?ie=UTF8&qid=1467177481&sr=8-1&keywords=100+watt+led+chip

Here is the power supply:
https://www.amazon.com/Geeetech-Con...g_3?ie=UTF8&psc=1&refRID=8P7BX5QY1DKS5RMZPS2X

Petkan:
The LED current increases very sharply with voltage, i.e. we need to limit the current. A resistor could do that at a fixed input voltage, although there would still be some thermal variation in current. The resistor will eat the difference between the input voltage and the LED voltage, i.e. it is suitable for low-power LEDs. The common way is to use a constant-current source. Any linear or switching-mode voltage regulator is normally set up as a constant-voltage source by simple negative voltage feedback through a resistive divider. If you make the feedback current-based, not voltage-based, the regulator becomes a current source. The simplest way is a small current-sense resistor in series with the LED. At higher currents its voltage could be amplified by a current-sense amplifier to reduce the losses in the resistor. Regulators with a lower internal reference are more suitable for this role.
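
A minimal numeric sketch of that last point (the reference voltages are generic examples, not tied to any particular regulator):

# Turning a voltage regulator into a current source: choose the sense resistor so the
# regulator's feedback reference voltage appears across it at the target LED current.
# The reference voltages below are generic examples, not from a specific part.
target_current_a = 3.0

for v_ref in (1.25, 0.2):                  # a classic 1.25 V reference vs. a low-reference part
    r_sense = v_ref / target_current_a     # resistor that makes the feedback "see" the current
    loss_w = v_ref * target_current_a      # power burned in the sense resistor
    print(f"Vref = {v_ref} V -> Rsense = {r_sense:.3f} ohm, loss = {loss_w:.2f} W")

The lower the reference (or with a current-sense amplifier), the less power is lost in the sense resistor, which is the point made above.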
 