How to determine the needed voltage of a power supply

Thread Starter

salty joe

Joined Dec 14, 2010
43
I want to run a string of eight 1W LEDs using a Meanwell LDD-350 driver.
The LDD-350 has input voltage of 9-36V, output voltage of 2-32V and max output current of 350mA.
The LEDs have forward voltage of 2-2.2V and forward current of 280-350mA.
For power consumption, I multiplied 2.2V times 350mA to get 0.77W, then multiplied by eight and added 20% to get 7.4W.

Assuming I only need at least 7.4W, how do I determine the proper voltage for the power supply?
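(A quick sketch of that arithmetic, using the worst-case 2.2V forward voltage, the 350mA drive current, and the 20% margin from the post:)

```python
# Power estimate for the LED string (values from the post above).
vf_max = 2.2        # worst-case forward voltage per LED, volts
i_led = 0.350       # drive current, amps
num_leds = 8
margin = 1.20       # 20% headroom

p_led = vf_max * i_led          # ~0.77 W per LED
p_string = p_led * num_leds     # ~6.16 W for the whole string
p_budget = p_string * margin    # ~7.4 W with margin

print(f"Per LED: {p_led:.2f} W, string: {p_string:.2f} W, with margin: {p_budget:.1f} W")
```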
 

KeithWalker

Joined Jul 10, 2017
3,063
How are you planning on controlling the current to the LEDs? The required supply voltage will be the voltage drop across the current limiting circuit plus the maximum possible voltage drop across the LEDs (8 x 2.2V = 17.6V).
Regards,
Keith
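(Putting that rule into numbers, as a sketch; the limiter drop here is a placeholder, since it depends on the driver or limiting circuit chosen:)

```python
# Required supply voltage = worst-case LED string drop + drop across the current limiter.
def required_supply_voltage(num_leds, vf_max, limiter_drop):
    return num_leds * vf_max + limiter_drop

# 8 LEDs at a worst-case 2.2 V each; limiter_drop=0.0 is a placeholder value.
print(required_supply_voltage(8, 2.2, limiter_drop=0.0))  # 17.6 V for the LEDs alone
```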
 

Audioguru again

Joined Oct 21, 2019
6,673
The sales sheet for the Meanwell LDD-350 says that it has a constant current output of 350mA.
The 350mA figure for the LEDs is likely an absolute-maximum, do-not-exceed current rating, and they will die at that current if they are not cooled properly.
Eight 2.2V LEDs in series need 17.6V which can easily be supplied by the Meanwell LDD-350.
The input voltage needs to be at least 4V higher than the output voltage, so 17.6V + 4V = 21.6V minimum, up to the maximum allowed input of 36V.
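(Put numerically, as a small sketch using the 4V headroom figure from this post; the exact headroom the LDD-350 needs should be checked against its datasheet:)

```python
# Input-voltage window check for the LDD-350 driving the 8-LED string.
led_string_v = 8 * 2.2      # 17.6 V worst-case LED string drop
headroom = 4.0              # headroom above the output, per this post (assumption)
vin_min_driver = 9.0        # LDD-350 minimum input, volts
vin_max_driver = 36.0       # LDD-350 maximum input, volts

vin_min = max(led_string_v + headroom, vin_min_driver)   # 21.6 V
print(f"Supply must be between {vin_min:.1f} V and {vin_max_driver:.0f} V")
print("24 V supply OK?", vin_min <= 24.0 <= vin_max_driver)
```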
 
24 VDC is a common power supply voltage, and there's an efficiency curve to consider. An HDR-15-24 (https://www.trcelectronics.com/mean-well-din-rail-power-supply-hdr-15) would work. The supply's current rating doesn't really matter as long as it covers the load after accounting for efficiency: a 0.63A supply would be OK, and so would a 10A one, and you might be able to use the 24 VDC for something else as well. Efficiency is better than 90%, so anything greater than 0.350 A * 1/0.90, or about 0.39 A, is enough. The 0.63A and 1A models satisfy that.
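(As a rough check of that current figure, a sketch that conservatively treats the required supply current as the LED current divided by efficiency, as in the post above:)

```python
# Conservative supply-current estimate: output current divided by converter efficiency.
i_out = 0.350       # LED string current, amps
efficiency = 0.90   # assumed driver efficiency (per the post)

i_supply_min = i_out / efficiency      # ~0.39 A
print(f"Supply should provide at least {i_supply_min:.2f} A")
for rating in (0.63, 1.0, 10.0):       # candidate supply ratings, amps
    print(f"  {rating} A supply OK? {rating >= i_supply_min}")
```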
 