Discussion in 'General Electronics Chat' started by gasket, Feb 2, 2013.

1. ### gasket
I have 2x 10 W LEDs running at 10 V, 1100 mA. The driver I have is 12 V, 40 W, 3 A. What sort of resistors would I need? Am I right in thinking 1 ohm 20 W, or would it be 2 ohm 20 W, or something else?

2. ### Salaja New Member

I'm pretty sure that for high-power LEDs like these, you really want to be driving them with a current source, rather than a voltage source and a resistor.

3. ### gasket
So I need to make sure I get a driver with the right mA rating?

4. ### gasket
Can you explain that in simple terms for me? lol, I'm really new to this sort of thing. What did you mean by current rather than voltage?

5. ### thatoneguy AAC Fanatic!

LEDs "like" constant-current drivers rather than constant-voltage ones. As they heat up, their forward voltage drop changes, which doesn't matter with a current driver, but does if you are using a voltage source. The hotter the LED gets, the more current it will draw from a voltage source, until it burns out.

This is the point of the current-limiting resistor in series with the LED: to prevent that from happening. The downside is that the resistor wastes a lot of power as heat, so a constant-current driver is more efficient.
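To put rough numbers on that waste, here is a quick back-of-the-envelope sketch using the figures from this thread (12 V supply, 10 V LED, 1 A target current); the numbers are illustrative, not from any particular datasheet:

```python
# Illustrative numbers from this thread: 12 V supply, 10 V LED, 1 A target.
v_supply = 12.0   # supply voltage, volts
v_led = 10.0      # LED forward voltage, volts
i_led = 1.0       # target LED current, amps

v_resistor = v_supply - v_led        # voltage dropped across the resistor
r = v_resistor / i_led               # required series resistance (Ohm's law)
p_resistor = v_resistor * i_led      # power burned as heat in the resistor
p_led = v_led * i_led                # power delivered to the LED

print(r)                             # 2.0 ohms
print(p_resistor)                    # 2.0 W wasted in the resistor
print(p_led / (p_led + p_resistor))  # efficiency ~0.83, i.e. ~17% lost
```

So even with only 2 V of headroom, about a sixth of the driver's output goes into heating the resistor instead of lighting the LED.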

Here is one site that sells them. There are many others around if you search for "LED driver" on Google.

6. ### #12 Expert

It all comes down to the fact that a batch of LEDs will have a "range" of operating voltages, one for each individual LED; that range shifts with temperature; and resistors are only guaranteed to a certain percentage of accuracy. If you have enough extra voltage to "waste" in the resistor, you can guarantee that an LED that lights up anywhere in a range of 9.8 to 10.2 volts will draw less than enough current to make it smoke, no matter which exact voltage it needs or what the temperature is today, and still be pretty close to the maximum brightness it was designed for.

Your example of 2 volts of "headroom" is so small that you can't guarantee that kind of accuracy unless you run the LEDs at lower than their maximum rated current. Suppose your LED lights up at 9.8 volts: the resulting current would be 1.1 amps, which puts that LED in short-lifetime territory. Now say another LED in the batch lights up at 10.2 volts: you will have 1.8 volts dropped across the resistor and the LED will get 0.9 amps. (These examples assume the resistor is exactly 2.00 ohms.) All things considered, an operating range of +/- 10% is considered "good" for an electronic design, but running the LED at 0.8 to 1 amp would be the more proper answer, refusing to overdrive the LED and shorten its lifetime.
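The spread described above can be sketched in a few lines, computing the current a fixed 2.00 ohm resistor delivers as the LED's forward voltage varies across the 9.8 to 10.2 volt batch range:

```python
v_supply = 12.0  # volts
r = 2.0          # series resistor, assumed exactly 2.00 ohms as in the example

def led_current(v_forward):
    """Series-resistor current from Ohm's law: I = (Vs - Vf) / R."""
    return (v_supply - v_forward) / r

for vf in (9.8, 10.0, 10.2):
    print(vf, round(led_current(vf), 3))
# 9.8 V -> 1.1 A (overdriven), 10.0 V -> 1.0 A, 10.2 V -> 0.9 A
```

A 0.4 V spread in forward voltage turns into a 0.2 A spread in current, exactly because the headroom is so small compared to the LED voltage.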

Let's calculate that: 1 amp through a 9.8 volt LED from a 12.00 volt source, with a resistor that is exactly what the label says, would require 2.2 ohms. Then there is the fact that a rechargeable 12 volt lead-acid battery will start out at 12.6 volts when fully charged, and during charging the charger voltage might go as high as 14.5 to 16 volts. See how picky it gets? That's why it is difficult to get a straight answer here. The answer is not simple.
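Those calculations can be wrapped in a small helper that sizes the resistor for a target current and then shows what the same resistor does as a lead-acid battery's voltage moves around (LED at 9.8 V and 1 A target, per the post above):

```python
def series_resistor(v_supply, v_forward, i_target):
    """Resistor needed for a target LED current: R = (Vs - Vf) / I."""
    return (v_supply - v_forward) / i_target

r = series_resistor(12.0, 9.8, 1.0)
print(round(r, 2))  # 2.2 ohms, matching the calculation above

# The same 2.2 ohm resistor across a lead-acid battery's operating range:
for vs in (12.0, 12.6, 14.5):
    print(vs, round((vs - 9.8) / r, 2))
# at 14.5 V during charging the LED would see ~2.14 A,
# roughly double its rating
```

This is the point of the post: a resistor sized for one exact supply voltage gives wildly different currents once the real supply starts moving.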

7. ### #12 Expert

Continuing...

There is a circuit called a constant-current generator. You can design one to allow 1 amp whether the voltage source is 15 volts or 25 volts, or the LED is 9.8 volts or 10.2 volts. Still, it's tight trying to make a current generator with less than 2 volts of headroom, and you still have to expect that temperature changes will shift the current. Trying to run an LED with that little headroom is a difficult job. You are going to end up with a compromise.

8. ### gasket
Thank you for taking the time to reply. Now I understand. Thanks again!

9. ### takao21203 Distinguished Member

You cannot use a 40 W driver for a 20 W load. Nobody wants resistors for power LEDs.