light emitting diode question

Thread Starter

MrSoftware

Joined Oct 29, 2013
2,202
I want to drive a high output LED from a 14V LiPo battery and I just want to check that I've got the most efficient way to do this.

This is the LED; current is 700 mA @ 2.9 V.

My battery will give me about 14.8V when fully charged

If I use only a current limiting resistor (18 ohm), I will be losing about 9 watts across the resistor, ouch! But if I instead use this voltage regulator to get 3.3V, then I'll only need about 0.57 ohm of resistance and I'll lose less than 1/2 watt to the resistor. However I see that this regulator can dissipate up to 12W, but I'm unsure how much power the regulator will dissipate under the load I'm proposing.

Will the regulator most likely run cool and be the most efficient way to go, or will I be better with a simple big hot resistor? Or maybe a PWM solution of some sort would be better?
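The arithmetic in the post above can be sketched out to see where the watts go. This is just a back-of-envelope check using the numbers given (14.8 V battery, 700 mA @ 2.9 V LED, a 3.3 V linear regulator); the part numbers themselves aren't specified in the thread.

```python
# Back-of-envelope check of the two options described above.
# Assumed values from the post: 14.8 V battery, LED at 700 mA / 2.9 V,
# and a 3.3 V linear regulator (no specific part assumed).
V_BATT = 14.8
I_LED = 0.7
V_LED = 2.9
V_REG = 3.3

# Option 1: single series resistor straight off the battery
r1 = (V_BATT - V_LED) / I_LED      # ~17 ohm (18 ohm nearest standard value)
p1 = (V_BATT - V_LED) * I_LED      # ~8.3 W burned in the resistor

# Option 2: 3.3 V linear regulator plus a small ballast resistor
r2 = (V_REG - V_LED) / I_LED       # ~0.57 ohm
p_res = (V_REG - V_LED) * I_LED    # ~0.28 W in the resistor
p_reg = (V_BATT - V_REG) * I_LED   # ~8.05 W now dissipated in the regulator

print(f"resistor only: {r1:.1f} ohm, {p1:.2f} W wasted")
print(f"regulator:     {r2:.2f} ohm ballast, {p_res:.2f} W + {p_reg:.2f} W wasted")
```

Note that the heat doesn't disappear in option 2; it just moves from the resistor into the regulator's tab.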
 

ErnieM

Joined Apr 24, 2011
8,377
Pay me now or pay me later but pay it all to me.

Either way you slice it, you are still burning off 9 W in a resistor, or 1/2 W in a resistor and 8 1/2 W in a regulator... so you add more parts to do the same thing.

If you want better efficiency, look for a switching regulator: a current-regulated output would be ideal, though I don't have anything offhand to offer as a part number (they are ubiquitous across several manufacturers, but more complex than just a 3-terminal voltage regulator).
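To put numbers on why a switcher helps, here's a rough comparison. The 90% converter efficiency is an illustrative assumption (typical of small buck converter datasheet curves), not a figure for any specific part:

```python
# Rough efficiency comparison: linear drop vs. an assumed-90%-efficient buck.
V_BATT = 14.8
I_LED = 0.7
V_LED = 2.9
P_LED = V_LED * I_LED             # ~2.03 W actually delivered to the LED

# Any linear approach: the LED current flows straight from the battery,
# so everything above the LED voltage becomes heat somewhere in the chain.
p_in_linear = V_BATT * I_LED      # ~10.36 W drawn from the battery
eff_linear = P_LED / p_in_linear  # ~20% efficient

# Buck converter (assumed 90% efficient): it trades voltage for current
# instead of burning the difference, so input power shrinks accordingly.
p_in_buck = P_LED / 0.90          # ~2.25 W drawn from the battery

print(f"linear: {eff_linear:.0%} efficient, buck input: {p_in_buck:.2f} W")
```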
 

Thread Starter

MrSoftware

Joined Oct 29, 2013
2,202
Thank you very much, I will check out the buck current regulators! I knew there must be a way to do this without giving up most of your power to heat! ;)
 

ian field

Joined Oct 27, 2012
6,536
MrSoftware said:
"Will the regulator most likely run cool and be the most efficient way to go, or will I be better with a simple big hot resistor? Or maybe a PWM solution of some sort would be better?"
As others have pointed out, that's a linear regulator that still dissipates more power than ends up in the load; it's an electronically controlled resistance that gives tight regulation. It's also worse from the LED's point of view: since your ballast resistor will now be much lower, the change in Vf as the LED warms up will have more effect on the current draw.

For efficiency you need switch mode, in this case a buck regulator. The best of both worlds might be a 12V (nominal) to 5V switcher; that way you can use a large enough series ballast resistor with your LED to swamp temperature variations in Vf.

If you can get a 12V-in buck with an adjustable current output, that would be ideal.
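Ian's point about ballast headroom can be put in numbers. A quick sketch (the 0.1 V warm-up shift in Vf is an illustrative assumption, not a datasheet figure):

```python
# How much the LED current shifts when Vf drops as the LED warms up,
# for two ballast resistor sizes. DVF = 0.1 V is an assumed shift.
I_LED = 0.7
V_LED = 2.9
DVF = 0.1

# Ballast from a 5 V rail: lots of headroom, so a bigger resistor
r_5v = (5.0 - V_LED) / I_LED   # 3 ohm
di_5v = DVF / r_5v             # ~33 mA change for a 0.1 V Vf shift

# Ballast from a 3.3 V rail: barely any headroom, tiny resistor
r_33 = (3.3 - V_LED) / I_LED   # ~0.57 ohm
di_33 = DVF / r_33             # ~175 mA change for the same shift

print(f"5 V rail: {di_5v*1000:.0f} mA shift; 3.3 V rail: {di_33*1000:.0f} mA shift")
```

The larger resistor swamps the Vf variation, which is why the 5 V switcher plus ballast is more forgiving than the 3.3 V option, even though both are "regulated".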
 