Power LED Specs

Thread Starter

hazim

Joined Jan 3, 2008
435
Hi all.

I've bought some 3W power LEDs and one 5W power LED. All are white. I thought that what defines the voltage of an LED is its light color... I learned that at university too... So I expected these LEDs to work well at approximately 3.3V. But no: at 3V the LED didn't work, so I increased the voltage until the LED started to give some light, at about 5V.
I found this page: http://www.hebeiltd.com.cn/?p=led.power
At the 5W LED section they list two types: 540mA/9.2V and 700mA/6.0V.

Testing mine, I found that the LED draws 540mA at 7V, so it matches neither of the above.
What do you recommend I do? And what about the 3W LEDs?

I bought them without any specifications.

Regards,
Hazim
 

bertus

Joined Apr 5, 2008
22,270
Hello,

There are power LEDs that are made of several smaller LEDs inside one housing.
That way the forward voltage can be higher than the voltage needed for a single LED.
Take a look at the datasheet for more details on the LED you are trying to use.

Bertus
 

mcgyvr

Joined Oct 15, 2009
5,394
Any LED should be driven with a constant-current source. Please do not use a regular constant-voltage power supply without some way to ensure the proper current (a resistor, a constant-current source, etc.).

You must know the specifications of an LED to build a proper circuit. If you don't... don't use it. Crap in = crap out.
 

Thread Starter

hazim

Joined Jan 3, 2008
435
But I don't have the datasheet and don't know the specifications of the LED. This is the main problem. What can I do to find the LED's voltage?
 

Thread Starter

hazim

Joined Jan 3, 2008
435
Can I do the following: start increasing the voltage across the LED until the current times the voltage reaches 5W?
 

SgtWookie

Joined Jul 17, 2007
22,230
I'd be willing to bet that the "5 Watt" specification was a marketing ploy, and if you try to operate the LED at the claimed wattage, you will burn it up. At 540mA and 7V, you're already at 3.78 Watts. You might be OK there.

You really need to use a constant current driver for LEDs that size. Resistors and linear regulators will simply waste too much power.

In the future, I suggest that you don't buy LEDs without knowing the manufacturer's part number and having a datasheet available. If the seller won't supply you with a datasheet, they are not reputable.
 

Thread Starter

hazim

Joined Jan 3, 2008
435
I'd be willing to bet that the "5 Watt" specification was a marketing ploy, and if you try to operate the LED at the claimed wattage, you will burn it up. At 540mA and 7V, you're already at 3.78 Watts. You might be OK there.
But why? It's a 5W LED; can't I let it consume, say, 4.5W? By the way, these LED types are meant to be mounted on a heatsink. I used one, but the LED isn't getting hot enough to really need it... so I think I can increase the voltage/power even more.

You really need to use a constant current driver for LEDs that size. Resistors and linear regulators will simply waste too much power.
I could do that simply with an LM317 current-limiting circuit... right?
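For reference, the usual LM317 constant-current sizing would look like this (a rough sketch, assuming the LM317's nominal 1.25V reference and taking 700mA as the target, which is only a guess for an unmarked LED):

Code:
# LM317 wired as a current source: resistor R between OUT and ADJ,
# LED from ADJ to ground.  I = Vref / R, with Vref nominally 1.25V.
v_ref = 1.25                    # nominal LM317 reference voltage
i_target = 0.7                  # assumed 700mA target (unconfirmed for this LED)
r = v_ref / i_target            # ~1.79 ohms -> nearest standard value ~1.8 ohms
p_resistor = v_ref * i_target   # ~0.88W burned in the sense resistor
print(round(r, 2), round(p_resistor, 2))
# The LM317 itself also dissipates (V_in - V_led - 1.25V) * I,
# which is why linear regulation gets called wasteful above.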

In the future, I suggest that you don't buy LEDs without knowing the manufacturer's part number and having a datasheet available. If the seller won't supply you with a datasheet, they are not reputable.
This seller is the biggest one selling electronic components here in Lebanon; neither he nor the others give datasheets or part numbers for LEDs. Anyway, I'm not buying a big enough quantity to care about that, and buying a 5W LED for experimenting and hobby practice is better than not buying it because there is no datasheet... if the LED burns out, it's not the end of the universe :)

Regards,
Hazim
 

iONic

Joined Nov 16, 2007
1,662
A 9V supply and 540mA brings you to 4.86W. A 6V supply and 700mA brings you to 4.2W. To be safe I wouldn't go beyond 4.5W.

BUT, you must regulate the current. One example (and probably the least efficient) would be a 12V supply with a 6.8 ohm 2W-3W resistor and the LED attached to a suitable heatsink. The LED will then see a fairly constant current of roughly 500mA and consume about 4.5W.

The other option is a 9V supply with a 4.7 ohm 3W resistor and the LED attached to a suitable heatsink. The LED will then see a fairly constant current of roughly 700mA and consume about 4.2W.

Without current regulation your LEDs will live a short life.
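Spelling out the arithmetic behind those two examples (a rough sketch; the helper below is just for illustration, the ~9V and ~6V forward voltages are the two variants from the hebeiltd page rather than measured values for this LED, and the 6.8/4.7 ohm figures above are the nearest standard resistor values):

Code:
def series_resistor(v_supply, v_forward, i_target):
    # Ohm's law on the voltage left over after the LED's drop.
    r = (v_supply - v_forward) / i_target
    p_resistor = (v_supply - v_forward) * i_target  # heat in the resistor
    p_led = v_forward * i_target                    # power in the LED
    return r, p_resistor, p_led

print(series_resistor(12.0, 9.0, 0.5))  # ~6.0 ohms, 1.5W resistor, 4.5W LED
print(series_resistor(9.0, 6.0, 0.7))   # ~4.3 ohms, 2.1W resistor, 4.2W LED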
 

Thread Starter

hazim

Joined Jan 3, 2008
435
A 9V supply and 540mA brings you to 4.86W. A 6V supply and 700mA brings you to 4.2W. To be safe I wouldn't go beyond 4.5W.
As I understand it, a load draws as much current as it requires at its rated voltage. As I increase the voltage, the load (the LED) draws more current, so at a specific voltage, its rated voltage, the LED will be drawing its rated current and hence its rated power. For example, a 12V 12W light bulb draws 1A; as I increase the voltage toward 12V, the current increases toward 1A, and at 12V the bulb draws 1A... I know that when using several LEDs in series a resistor is a must, and I understand why, but I don't understand why I should use a resistor or current regulation for a single LED when I'm powering it at its rated voltage directly from the source.

I have an explanation for this, but I'm not sure of it: as the LED stays on for a long time, the semiconductor material starts to degrade, i.e. the LED's life starts to decline. This makes the LED draw more current, which in turn shortens its life even further, like a spiral... The resistor's role is to limit that increase in current.

Regards,
Hazim
 

Wendy

Joined Mar 24, 2008
23,415
The thermal shift in Vf of an LED is real, but it is exaggerated by many folks. If an LED with a 3.6V Vf drops to 3.4V, the current is simply not going to increase that much.

Let's use an example. A 3.6V-Vf white LED that draws 700mA is a 2.5W LED. If I were to use a regulated 6V power supply, the resistor needed to provide 700mA is 3.43Ω. If the LED shifts down to 3.3V (because it gets hot), the current will increase to 787mA. That isn't so bad, and with higher power-supply voltages the shift is even smaller. For example, a 9V power supply feeding the same LED would need a 7.71Ω resistor; if the LED drops down to 3.3V, the current would go up to 739mA.
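Spelling that out (same assumed 3.6V-nominal / 3.3V-hot figures as above; the helper name is just for illustration):

Code:
def current_after_vf_shift(v_supply, vf_nominal, i_target, vf_hot):
    r = (v_supply - vf_nominal) / i_target  # resistor picked at nominal Vf
    return (v_supply - vf_hot) / r          # current once the LED warms up

print(current_after_vf_shift(6.0, 3.6, 0.7, 3.3))  # ~0.787A from a 6V supply
print(current_after_vf_shift(9.0, 3.6, 0.7, 3.3))  # ~0.739A from a 9V supply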

Constant-current supplies overcome this nicely, and there are a lot of ways to do it.

The thing that causes thermal runaway is heat. Power LEDs get hot; a heatsink is mandatory when using them anyhow. The cooler you keep them, the longer they last.

There is another thread you ought to pay attention to, where the OP was talking about using a linear regulator somewhat like you were. Linear regulators get hot all by themselves and are pretty wasteful with battery juice.

An SMPS regulator converts the extra energy instead of wasting it. A 9V power supply feeding a 3.6V LED at 0.7A may only have to provide about 0.3A (and the LED will still get its 0.7A). They run much cooler, too.
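That input-current figure is just conservation of power, assuming a near-ideal converter (real ones lose a bit to efficiency):

Code:
p_led = 3.6 * 0.7         # 3.6V LED at 0.7A = 2.52W delivered
i_input = p_led / 9.0     # drawn from the 9V supply at ~100% efficiency
print(round(i_input, 2))  # ~0.28A, roughly the 0.3A mentioned above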

700mA LED driver, LM317 or FET ?

This has been an ongoing subject on AAC for quite a while; it looks like another cycle is beginning. If this looks too complex to build, you can buy an off-the-shelf version. Google "buck puck".
 

mcgyvr

Joined Oct 15, 2009
5,394
As I understand it, a load draws as much current as it requires at its rated voltage.
Hazim
This is true for a resistive load, but an LED needs to be fed the correct amount of current or you will destroy it. As already stated, you should NEVER feed an LED straight from a regular constant-voltage power supply without some form of current limiting (like a series resistor, a BuckPuck, etc.).

In very basic terms, an LED doesn't care what voltage is being delivered to it. But it DOES care what current is being fed through it.
For example, I often use a 48V supply to power an LED... BUT I have the correct resistor in series so that the LED is fed only 20mA. The Vf of an LED is not its "rated voltage"; it is the voltage drop that the circuit will see across the LED. So if your supply is 100V and your LED has a Vf of 3V, the rest of your circuit sees 97V.
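And the arithmetic for that 48V example (the 3V Vf is the illustrative figure above, not a measured one):

Code:
v_supply, v_f, i = 48.0, 3.0, 0.020
r = (v_supply - v_f) / i    # 2250 ohms -> a standard 2.2k works
p_r = (v_supply - v_f) * i  # 0.9W -> use a 1W-or-larger resistor
print(r, round(p_r, 2))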
 