Current generator as driver for LED

Thread Starter

xxxyyyba

Joined Aug 7, 2012
289
Hello.
I have current generator with output current 200mA and voltage range 22V-43V, 10W power. I have LED SMD chip, 10W power, current 800mA-900mA, voltage 9V-12V. Can I use this current generator to drive this LED, I mean is it possible to add some external circuitry to this current generator to transform it to appropriate voltage source so I can drive this LED? Any idea? Best regards
 

Tonyr1084

Joined Sep 24, 2015
7,905
LEDs are current-driven devices. But we need to know more about your "generator". Is it an AC or DC source? Is it adjustable? Is it adjustable voltage or adjustable current? Or both?

You can drive an LED at any voltage above its forward voltage rating. You can drive it from 1000 volts AC or DC. For the moment let's assume it's DC. You would need a series resistor to limit the current so that the LED is not overdriven. Sticking with the 1000 volt example and assuming a 900 mA target, you would need a resistor value between 1.1 kΩ and 1.2 kΩ, which would give you roughly 833 mA to 909 mA through your LED. In that example, with 1000 volts pushing 900 mA through the LED, the resistor would have to dissipate about 900 watts. Highly impractical, but the point being made is that your supply SHOULD be able to drive your LED. However, as I opened with, we need more information about your power supply and your LED.
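As a rough sketch of that arithmetic (purely illustrative numbers: the 1000 V DC thought experiment and an assumed LED forward voltage of about 10 V):

```python
# Series-resistor sizing for the 1000 V thought experiment above.
# V_f = 10 V is an assumed typical forward voltage, for illustration only.
V_supply = 1000.0   # supply voltage, volts
V_f = 10.0          # assumed LED forward voltage, volts
I_led = 0.9         # target LED current, amps

R = (V_supply - V_f) / I_led     # required series resistance, ohms
P_r = (V_supply - V_f) * I_led   # power dissipated in the resistor, watts

print(f"R ≈ {R:.0f} ohms")   # ≈ 1100 ohms
print(f"P ≈ {P_r:.0f} W")    # ≈ 890 W wasted as heat
```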
 

BobaMosfet

Joined Jul 1, 2009
2,113
Thanks for the reply. The current generator is DC. The input to the generator is mains voltage and the output is DC current. It is not adjustable. The parameters printed on it are: 200 mA DC, 22 V-43 V DC, 10 W. Here is the SMD LED chip:

https://www.aliexpress.com/item/10p...pm=a2g0s.13010208.99999999.259.6f033c00fPTwCi
I don't know why people can't answer your question simply. The answer is NO.

Your LED, you say requires 800-900mA.
Your generator (DC) outputs max 200mA.

So no, your generator is not powerful enough to power the LED.
 

Tonyr1084

Joined Sep 24, 2015
7,905
I guess I failed to ask if it's a constant current power source. If it is constant current and it is capable of 200 mA then I don't think this supply will power a 900 mA chip LED.

[edit] looks like Boba beat me to it.
 

Uilnaydar

Joined Jan 30, 2008
118
I don't know why people can't answer your question simply. The answer is NO.

Your LED, you say requires 800-900mA.
Your generator (DC) outputs max 200mA.

So no, your generator is not powerful enough to power the LED.
Because if we say "no", they just go ask mother and she'll say yes.
 

Thread Starter

xxxyyyba

Joined Aug 7, 2012
289
Thanks for the reply. It is obvious that the generator's current is smaller than the current required by the LED, but it has the same power rating as the LED, right? I know that I can't connect the current generator directly to the LED, but maybe I can add some circuitry between the current generator and the LED to enable it to drive the LED?
 

Norfindel

Joined Mar 6, 2008
326
Just look at the numbers:

PSU = Constant current (you sure about that?) 200mA. Voltage range 22V-43V
LED = current 800mA-900mA, voltage 9V-12V

It should be fairly obvious that:
  • the current provided by the PSU is too low
  • the minimum voltage provided by the PSU is too high
Power is just voltage multiplied by current. But the fact that the PSU has the same power rating as the LED doesn't mean they can be used together. All the specs have to be compatible. In the case of the LED, the best option is a constant current source with exactly the same current rating as the LED, and with a compliance voltage range that covers the LED's forward voltage drop (Vf) with some room to spare (in other words, the bottom of the PSU's voltage range should be lower than the minimum Vf listed on the LED's datasheet, and the top should be higher than the maximum Vf).
Another way is to use a constant voltage PSU with a voltage larger than the LED's maximum Vf, and calculate the appropriate resistor to use in series with the LED, so that the current through the LED is the required current.
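As a minimal sketch of that compatibility check, using the numbers from this thread (and assuming the PSU really is constant-current):

```python
# Rough compatibility check: constant-current driver vs. LED, thread numbers.
psu_current = 0.200                  # driver output current, amps
psu_v_min, psu_v_max = 22.0, 43.0    # driver compliance voltage range, volts

led_i_min, led_i_max = 0.800, 0.900  # LED rated current range, amps
led_vf_min, led_vf_max = 9.0, 12.0   # LED forward voltage range, volts

current_ok = led_i_min <= psu_current <= led_i_max
voltage_ok = psu_v_min <= led_vf_min and psu_v_max >= led_vf_max

print("current match:", current_ok)  # False: 0.2 A is far below 0.8-0.9 A
print("voltage match:", voltage_ok)  # False: 22 V minimum is above the 12 V max Vf
```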
 

Thread Starter

xxxyyyba

Joined Aug 7, 2012
289
Just look at the numbers:

PSU = Constant current (you sure about that?) 200mA. Voltage range 22V-43V
LED = current 800mA-900mA, voltage 9V-12V

It should be fairly obvious that:
  • the current provided by the PSU is too low
  • the minimum voltage provided by the PSU is too high
Power is just voltage multiplied by current. But the fact that the PSU has the same power rating as the LED doesn't mean they can be used together. All the specs have to be compatible. In the case of the LED, the best option is a constant current source with exactly the same current rating as the LED, and with a compliance voltage range that covers the LED's forward voltage drop (Vf) with some room to spare (in other words, the bottom of the PSU's voltage range should be lower than the minimum Vf listed on the LED's datasheet, and the top should be higher than the maximum Vf).
Another way is to use a constant voltage PSU with a voltage larger than the LED's maximum Vf, and calculate the appropriate resistor to use in series with the LED, so that the current through the LED is the required current.
Thanks for the reply. Here are my thoughts.
An ideal constant current source generates a fixed current that is completely independent of the load connected to it; the voltage on its terminals depends only on that load.
A practical constant current source holds its current constant only over some range of load values. Given the power rating of this source, 10 W, and its current rating, 200 mA, the maximum voltage on its terminals is U = P/I = 10 W / 200 mA = 50 V. On the other side, we have a load which needs 900 mA to function properly, and when it draws 900 mA it dissipates 10 W. Because the current required by the LED doesn't match the current generator's rating, it is obvious that the generator cannot drive the LED directly, but since it can deliver the right amount of power, I would say that with some additional circuitry between the current generator and the LED it would be possible to drive the LED. I mean, no physical law would be violated :)
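A quick sketch of that arithmetic (assuming the 10 W / 200 mA nameplate ratings are accurate and any converter in between is lossless, which a real one won't be):

```python
# Compliance voltage of the source, and the current an ideal (lossless)
# converter could deliver at the LED's voltage. Real converters lose power.
P_source = 10.0    # source power rating, watts
I_source = 0.200   # source current rating, amps
V_led = 12.0       # LED forward voltage at full power (upper spec), volts

V_max = P_source / I_source   # maximum compliance voltage: 50 V
I_ideal = P_source / V_led    # ideal converted current at 12 V: ~833 mA

print(f"max source voltage ≈ {V_max:.0f} V")
print(f"ideal current at {V_led:.0f} V ≈ {I_ideal * 1000:.0f} mA")
```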
 

Tonyr1084

Joined Sep 24, 2015
7,905
To reduce the voltage and up the amperage you'd need a transformer. You can't do that with a DC source.

Now - you COULD build an inverter so you can drop that voltage while increasing the amperage. Highly impractical. Cheaper, faster and more reliable to get the appropriate supply.

Remember - WATTS is WATTS. 43 volts at 10 watts is 232 mA. 22 volts at 10 watts is 455 mA. 12 volts at 10 watts is 833 mA. 9 volts at 10 watts is 1100 mA (rounded). So if your supply were pushing 43 volts at 10 watts and outputting AC, then you could use a transformer to drop the voltage and up the amperage. In a perfect world, if you put 10 watts into a transformer you get 10 watts out. In the real world you have losses, depending on the transformer. Since this is a DC source, by the time you invert the output to AC and transform the voltage down to 9 volts, you're going to lose a whole lot of wattage; and it remains doubtful your supply will handle the load.
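A minimal sketch of that I = P / V arithmetic, taking the 10 W figure at face value:

```python
# Current available from 10 W at each of the voltages mentioned above.
P = 10.0  # watts
for v in (43.0, 22.0, 12.0, 9.0):
    print(f"{v:>4.0f} V -> {P / v * 1000:.0f} mA")
# 43 V -> 233 mA, 22 V -> 455 mA, 12 V -> 833 mA, 9 V -> 1111 mA
```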

In short - as @BobaMosfet said - "No." Simple as that. It's not going to happen. Not unless you have access to superconductors and ultra high efficiency equipment. We're talking thousands of dollars to make this happen. In reality - it's not going to. In theory - yes - there's a way. But in the practical world - "No."
 

Tonyr1084

Joined Sep 24, 2015
7,905
I don't know how the '10W' rating of that power supply was arrived at. 200mA x 43V = 8.6W. :rolleyes:
I had the same thought. However, 200 mA is fairly close to 232 mA, and whoever sold the supply wanted to round the wattage up in favor of their product. Just like Briggs and Scrapiron gasoline motors were over-rated. Claims of 6.6 HP gas motors were really pushing around 5 HP. They got in trouble for that. And that's a blatant example of exaggeration.

As regards this supply - we know nothing about its manufacturer. I've seen blogs that do teardowns of cheap Chinese supplies claiming a particular output, and upon examination it is clear the supply cannot perform as advertised. Methinks this supply may be bogus in its ratings.
 

Thread Starter

xxxyyyba

Joined Aug 7, 2012
289
I don't know how the '10W' rating of that power supply was arrived at. 200mA x 43V = 8.6W. :rolleyes:
It is about 200 mA, and the voltage range may be a little bit different. I don't have the generator in front of me to check the exact values, but they are very close to what I wrote in the first post.
 

Tonyr1084

Joined Sep 24, 2015
7,905
It would take 50 volts at 200 mA to give 10 watts. I wonder if that's the unloaded voltage of the supply under best circumstances.
 

Norfindel

Joined Mar 6, 2008
326
If it were a constant voltage supply, you could use a DC-DC converter to lower the voltage while providing a higher current (a buck converter). But even then, as they're NOT 100% efficient, you wouldn't have 10 watts available at the output, but maybe 7 or 8. And you would still need a resistor to limit the current. And the converter isn't free, either. So it's probably better to just buy the correct LED driver.
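As a rough sketch of that power budget, under an assumed converter efficiency (80% here is just a guess for a cheap module, not a measured figure):

```python
# Rough estimate of what a buck converter stage could deliver.
# efficiency = 0.80 is an assumption for illustration only.
P_in = 10.0        # power available from the supply, watts (nameplate)
efficiency = 0.80  # assumed converter efficiency
V_out = 12.0       # LED forward voltage at full power, volts

P_out = P_in * efficiency   # ~8 W available at the output
I_out = P_out / V_out       # ~667 mA, still short of the 800-900 mA needed

print(f"P_out ≈ {P_out:.1f} W, I_out ≈ {I_out * 1000:.0f} mA")
```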
 

BobaMosfet

Joined Jul 1, 2009
2,113
Thanks. I understand. I will probably find another source.
If you have the ability to put a circuit in between, I'm wondering if a BUCK circuit would work - lower voltage, but increased amperage. Thoughts from anyone else? (I've done lots of BOOST circuits, but never BUCK.)
 