How does the LED driver figure it out?

Thread Starter

thedoc8

Joined Nov 28, 2012
162
Say I have an LED driver with a DC output of 25 to 90 volts at 300 mA. How does this driver know which output voltage to use on a given string of LEDs?
 

Ian0

Joined Aug 7, 2020
9,677
It switches the output on. The current ramps up (because it is going through an inductor). When it reaches 400 mA it switches off and the current ramps down. When it reaches 200 mA it switches on again.
It doesn't even know what the voltage is.
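If it helps to see why that works for any string voltage, here is a rough Python sketch of that bang-bang scheme. The input voltage, inductance, time step, and thresholds are made-up values, not taken from any real driver; the point is only that the average current lands near 300 mA whether the string drops 27 V or 54 V:

```python
# Hysteretic (bang-bang) current control: integrate the inductor current and
# flip the switch at 400 mA / 200 mA. All component values are illustrative.
def average_led_current(v_led, v_in=100.0, L=1e-3, dt=1e-7, t_end=5e-3):
    i = 0.0              # inductor/LED current (A)
    switch_on = True
    total, steps = 0.0, 0
    t = 0.0
    while t < t_end:
        # Inductor voltage sets the current slope: (Vin - Vled) with the switch
        # on, -Vled while the current free-wheels through the diode.
        v_L = (v_in - v_led) if switch_on else -v_led
        i += (v_L / L) * dt
        if i >= 0.400:
            switch_on = False    # upper threshold reached: switch off
        elif i <= 0.200:
            switch_on = True     # lower threshold reached: switch back on
        total += i
        steps += 1
        t += dt
    return total / steps

for n_leds in (9, 18):
    v_string = n_leds * 3.0      # ~3 V per LED, an assumption
    i_avg = average_led_current(v_string)
    print(f"{n_leds} LEDs (~{v_string:.0f} V): average current {i_avg * 1000:.0f} mA")
```

Both strings come out close to 300 mA; the controller only ever looks at the current.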
 

Thread Starter

thedoc8

Joined Nov 28, 2012
162
It raises the output voltage until it senses the output is at 300 mA.

There are probably a million ways to do that... well, maybe less. :)
That doesn't work out: push 300 mA into 18 LEDs, then switch to 9 LEDs and the driver gets the correct voltage at half the mA. I thought the same, run the voltage up until 300 mA was drawn and stop. I've got to do some more simple tests; this is too easy.
 

Ian0

Joined Aug 7, 2020
9,677
18 LEDs is about 54 V, 9 LEDs is about 27 V. The correct voltage is anything between 25 V and 90 V, so they are both the "correct" voltage.
Switching from 18 LEDs to 9 LEDs just halves the voltage; the current remains at 300 mA.
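A quick sanity check of those numbers, assuming roughly 3 V of forward drop per white LED (the real figure depends on the part and the current):

```python
V_F = 3.0  # assumed forward voltage per LED; real parts vary
for n in (9, 18):
    v_string = n * V_F
    print(f"{n} LEDs ~ {v_string:.0f} V, inside the 25-90 V window: {25 <= v_string <= 90}")
```

Both strings sit comfortably inside the driver's 25-90 V output range.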
 

Thread Starter

thedoc8

Joined Nov 28, 2012
162
I have a universal LED driver; no matter what string of LEDs I connect it to, it sets itself to the correct voltage and mA draw. I'm just trying to figure out how.
 

AnalogKid

Joined Aug 1, 2013
10,987
A "standard" power supply maintains a constant voltage no matter how much or how little current is drawn from it by the load (within its ratings). If a supply is rated for 12 V at 2 A, it will keep its output at 12 V whether the load draws 0.1 A, 0.5 A, 1 A, or 1.999 A. This is a "constant voltage" supply. Internally, it has a feedback loop from the output to the control electronics. The electronics adjust the drive signals to the output stage, turning them up if the output sags below 12 V, and turning them down if the output is above 12 V. This is what most people are familiar with. It is why a USB charger can charge anything from a little headset to a tablet PC: 5 V is 5 V.
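As a toy illustration of that constant-voltage loop (the gain, the internal resistance, and the one-line output-stage model below are all made up for the example, not taken from any particular supply):

```python
# Toy constant-voltage feedback loop: nudge the drive up or down until the
# output sits at 12 V, whatever the load draws. All values are illustrative.
def regulate_voltage(r_load, v_set=12.0, r_internal=1.0, gain=0.2, steps=500):
    drive = 0.0
    v_out = 0.0
    for _ in range(steps):
        # The raw output sags under load through the stage's internal resistance.
        v_out = drive * r_load / (r_load + r_internal)
        error = v_set - v_out          # feedback: compare output to the setpoint
        drive += gain * error          # turn the drive up if low, down if high
    return v_out, v_out / r_load

for r_load in (120.0, 24.0, 12.0, 6.0):   # roughly 0.1 A ... 2 A loads
    v, i = regulate_voltage(r_load)
    print(f"load {r_load:5.1f} ohm -> {v:5.2f} V, {i:4.2f} A")
```

Every load ends up at 12 V; only the current changes.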

A "constant current" supply works the exact same way, only the regulated property is the output current, not the output voltage. It senses the output current and adjusts the output voltage (with an almost identical feedback loop) to maintain it. So as the number of LEDs increases, so does the output voltage: whatever it takes to maintain 300 mA. Your supply has a maximum compliance of 90 V. If the load requires more than 90 V to pass 300 mA, the supply output will max out at 90 V and the current will begin to decrease.
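The constant-current counterpart, with that 90 V compliance limit included (the ~3 V per LED, the series resistance, and the loop gain are assumptions made for the example, not data from a real driver):

```python
# Constant-current feedback loop: adjust the *voltage* until 300 mA flows,
# but never exceed the 90 V compliance limit. The LED string is modeled
# crudely as a knee at n * 3 V followed by a series resistance.
def regulate_current(n_leds, i_set=0.300, v_max=90.0, gain=5.0, steps=2000):
    v_f = 3.0           # assumed forward drop per LED
    r_series = 20.0     # assumed series resistance of the whole string
    v_out, i_out = 0.0, 0.0
    for _ in range(steps):
        i_out = max(0.0, (v_out - n_leds * v_f) / r_series)
        error = i_set - i_out                 # feedback on current this time
        v_out += gain * error                 # raise/lower the voltage to fix it
        v_out = min(max(v_out, 0.0), v_max)   # 90 V compliance limit
    return v_out, i_out

for n in (9, 18, 29):
    v, i = regulate_current(n)
    print(f"{n:2d} LEDs -> {v:5.1f} V, {i * 1000:5.0f} mA")
```

The 9- and 18-LED strings both settle at 300 mA at different voltages; the 29-LED string needs more than 90 V, so it runs out of compliance and gets less current.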

The core circuit concept in both cases is an amplifier with negative feedback. In effect, it looks at what it is being told to produce, looks at the output to see whether that is in fact happening, and if not, adjusts itself to bring the two into agreement.

https://en.wikipedia.org/wiki/Negative-feedback_amplifier

ak
 

Ya’akov

Joined Jan 27, 2019
9,072
I think the TS question is: How can the driver provide the correct current for 1, 10, and 100 LEDs without prior knowledge of the number of LEDs connected? He is saying that his driver works no matter how many LEDs are connected.
 

Ya’akov

Joined Jan 27, 2019
9,072
As long as the LED string has an operating voltage in the range of "25 to 90 volts" it will work.
I have built drivers that will produce "300 mA" down to 0 V.
The problem is, if the LEDs need 30 mA each, then 10 will work properly and 1 will let out the magic smoke.
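To put numbers on that, assuming the LEDs are wired in parallel and share the driver's 300 mA evenly (which real parallel LEDs only do if they are well matched):

```python
I_DRIVER = 0.300    # constant-current output (A)
I_LED_MAX = 0.030   # what each LED can handle (A), the 30 mA figure above
for n_parallel in (10, 1):
    i_each = I_DRIVER / n_parallel
    verdict = "fine" if i_each <= I_LED_MAX else "magic smoke"
    print(f"{n_parallel:2d} LED(s): {i_each * 1000:3.0f} mA each -> {verdict}")
```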
 