LED constant current driver... voltage matter at all?

Discussion in 'General Electronics Chat' started by s_mack, May 14, 2015.

  1. s_mack

    Thread Starter Member

    Dec 17, 2011
    187
    5
    I know I could teach myself the answer to this by experimentation, but I think this is a simple question that can get a simple answer faster than me cleaning off my very messy desk :)


    Rather than a simple current limiting resistor, I'm using a constant current driver IC for an application. The datasheet says input voltage can be 4.5v to 28v with output voltage "up to 40v". The circuit currently uses a 12v input, but in a redesign I'm considering grabbing power from a regulated 5v rail instead of directly from the 12v (ish) battery. Can I expect brightness to remain the same (given all other parameters stay the same)? I think so.

    Sorry, it's been a while since I dusted off a textbook.

    Thanks.
     
  2. crutschow

    Expert

    Mar 14, 2008
    13,018
    3,235
    What output voltage do you need?

    You stated the output voltage can be greater than the input voltage. Is this a switching type constant current driver?
     
  3. cmartinez

    AAC Fanatic!

    Jan 17, 2007
    3,573
    2,542
    What IC constant current driver are you going to use?
     
  4. AnalogKid

    Distinguished Member

    Aug 1, 2013
    4,542
    1,251
    Datasheet!!!
     
  5. crutschow

    Expert

    Mar 14, 2008
    13,018
    3,235
    Yes. It's not fair when you know the part you are using and we don't. :rolleyes:
     
  6. s_mack

    Thread Starter Member

    Dec 17, 2011
    187
    5
    Bah... fine, I'll stop being so lazy. But then I have to admit I made up the figures* in my OP <grin>

    It's been a long time since I really looked at this circuit. I didn't think the particulars mattered so much wrt the question; I was asking more about the theory.

    datasheet

    * Considering I was going by memory... I wasn't TOO far off :) input 5v to 24v (I said 4.5 to 28) and output 0.8V to 30V (I said up to 40V).
     
  7. bertus

    Administrator

    Apr 5, 2008
    15,648
    2,347
    Hello,

    The supply voltage must be at least 1 volt higher than the voltage needed across the LEDs.
    The current measurement resistor will drop about 0.6 volts, and the transistor will need some room to regulate.
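
    As a rough sketch in Python (the LED voltage is just an example value; the 0.6 V sense drop and ~0.4 V of transistor margin follow the rule of thumb above):

        # Minimum supply estimate for a linear constant current driver.
        # Illustrative figures only, not from any particular datasheet.
        v_led = 3.2     # forward voltage of the LED (example value)
        v_sense = 0.6   # drop across the current measurement resistor
        v_margin = 0.4  # room the pass transistor needs to regulate
        v_supply_min = v_led + v_sense + v_margin
        print(f"Supply must be at least {v_supply_min:.1f} V")  # 4.2 V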

    Bertus
     
    mcgyvr likes this.
  8. s_mack

    Thread Starter Member

    Dec 17, 2011
    187
    5
    Sure... and I can use a different IC (another reason I hesitated to post particulars). The question remains... does input voltage affect brightness with a constant current driver? Again, I wouldn't think so (given "constant current"), but I just wanted to check in case I'm missing something in the theory.
     
  9. dl324

    Distinguished Member

    Mar 30, 2015
    3,246
    622
    A constant current driver will work as long as the voltage is high enough. The datasheet should give the minimum voltage requirement.
     
  10. Stuntman

    Active Member

    Mar 28, 2011
    181
    47
    I believe I may know what you are asking. Realize this:

    Let's assume you use an LED with a 2V Vf and want 20mA of current. You use a 20mA constant current sink (linear) to achieve this.

    You supply the LED with 10V (anode), then run the cathode to the output of the sink. The LED illuminates and you check the voltage across the LED: hmm, 2V. You look at the voltage at the cathode (the output from the current sink) with respect to ground... 8V. So from our 10V supply to ground, the LED has a 2V drop and the current sink an 8V drop.

    Let's do some math: a 10V supply supplying 20mA is .2W (10V * .02A). The LED is consuming .04W (2V * .02A) and your current sink is dissipating the remaining .16W (8V * .02A). All is well.

    Now you swap the supply for a 20V supply. Again, you check the voltage across the LED... again 2V. Now you check the voltage between the current sink and ground; this time it's 18V.

    Again, the math tells it all. 20V * .02A = .4W. The power consumed by the LED is again .04W (2V * .02A). However, the power dissipated by the sink is now .36W (18V * .02A). So the LED will see no difference in voltage, current, and therefore power consumption. What you will notice is that the current sink may start getting warm from all the extra power it has to dissipate (waste).

    Finally, let's swap the power supply again, this time for a 5V unit. Again, the LED will see 2V across it, but the current sink will show 3V WRT ground. The math: PSU 5V * .02A = .1W. The LED is again .04W, and the current sink will be 3V * .02A = .06W. If you are following along, you will also notice that this is the most efficient way of driving the LEDs, as only .06W is being dissipated (heat) from the current driver (less than half the power dissipated in the 10V circuit).

    The key here is that when you regulate the current, you are regulating the voltage drop to do so. When switching to a 20V supply, the current sink has simply adjusted its impedance to match your current spec (.02A). In doing so, the change in voltage drop across that sink ensures the LED sees no change in performance.
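
    The arithmetic above in a few lines of Python, assuming the same 2V Vf and 20mA from the example:

        # Replays the worked example above: a linear 20mA current sink
        # driving a 2V-Vf LED from three different supplies.
        V_F, I = 2.0, 0.020  # LED forward voltage (V), regulated current (A)
        for v_supply in (10.0, 20.0, 5.0):
            p_led = V_F * I                # power in the LED (unchanged)
            p_sink = (v_supply - V_F) * I  # power wasted in the sink
            p_total = v_supply * I         # power drawn from the supply
            print(f"{v_supply:4.0f}V supply: LED {p_led:.2f}W, "
                  f"sink {p_sink:.2f}W, total {p_total:.2f}W")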
     
    Last edited: May 14, 2015
  11. ian field

    Distinguished Member

    Oct 27, 2012
    4,415
    784
    Brightness is more or less proportional to current - but efficiency (and brightness) tail off if you force too much current.

    There is a small spread of Vf between devices due to tolerances; as long as you stay within the specified current, the LED will find its own level.

    The theoretical ideal constant current source is an infinite voltage dropped by an infinite resistance - a variation in the voltage developed across the load would have to be pretty large to change the current at all.

    At a supply voltage as low as 4.5V, if you were using a current limiting resistor, a very small variation in load voltage would have a significant effect on current.

    IMO: even a made-for-the-job current control chip will struggle with a very small differential between input and output.

    Look up the Vf for the LED you're driving and subtract that from the supply voltage you intend to use - then make sure the chip can work with that little voltage difference.
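
    That check as a quick Python sketch (the Vf and dropout numbers are made-up examples; use the ones from your datasheets):

        # Headroom check described above (hypothetical example values).
        v_supply = 4.5   # intended supply voltage
        v_f = 3.4        # LED forward voltage from its datasheet (example)
        v_dropout = 1.0  # minimum in/out differential the chip needs (example)
        headroom = v_supply - v_f
        print(f"Headroom: {headroom:.1f} V ->",
              "OK" if headroom >= v_dropout else "too little")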
     
  12. mcgyvr

    AAC Fanatic!

    Oct 15, 2009
    4,770
    970
    This is what's important here:
    Your Vin must be greater than the sum of the Vf of the LEDs (in series).
    Example: if you have 5 LEDs (each with a 2V Vf), then 5 x 2 = 10V, plus 1V of headroom = an 11V input power supply.

    Brightness is dictated by the current the LED sees. But you must have enough voltage to overcome the forward voltage of the LEDs for them to even light up in the first place.
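
    The same rule as a quick sketch (series string assumed; the 1V of headroom is the margin from the example above):

        # Minimum input voltage for a series LED string, per the rule above.
        n_leds, v_f = 5, 2.0  # example from this post: 5 LEDs at 2V each
        headroom = 1.0        # extra volt so the driver can regulate
        v_in_min = n_leds * v_f + headroom
        print(f"Minimum Vin: {v_in_min:.0f} V")  # 11 V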
     
  13. crutschow

    Expert

    Mar 14, 2008
    13,018
    3,235
    An opamp-transistor current source can operate with a voltage drop equal to the saturation voltage of the transistor plus the voltage across the shunt resistor. This total voltage can be less than 100mV for a properly designed circuit.
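
    As a rough illustration of that budget in Python (example numbers, not a specific design):

        # Dropout budget for an opamp-transistor current source (example values).
        i_led = 0.020  # regulated current (A)
        r_shunt = 2.5  # shunt resistor (ohms) -> 50mV at 20mA
        v_sat = 0.040  # transistor saturation voltage (V), design dependent
        v_dropout = i_led * r_shunt + v_sat
        print(f"Dropout: {v_dropout * 1000:.0f} mV")  # 90 mV, under 100 mV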
     
  14. AnalogKid

    Distinguished Member

    Aug 1, 2013
    4,542
    1,251
    No.

    ak
     