Powering LEDs

Discussion in 'The Projects Forum' started by jj_alukkas, Mar 24, 2009.

  1. jj_alukkas

    Thread Starter Well-Known Member

    Jan 8, 2009
    751
    5
    I have a simple question. If I power a 2.1V LED directly from a regulated 2V source that is capable of supplying, say, 3A, will the LED burn out from overcurrent? Or do I need a resistor no matter how well the voltage is limited?
    Will an LED draw the maximum current the supply can deliver?
     
  2. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    If you measure your LED and its voltage is exactly 2.1V at 20mA, then powered from a regulated 2.0V supply its current will be about 10mA to 14mA.
    Look at its datasheet.
    Here is the voltage/current graph for some LEDs I use:
     
  3. thatoneguy

    AAC Fanatic!

    Feb 19, 2009
    6,357
    718
    An LED is a regular diode, but with a higher forward-voltage threshold because of the materials used to emit visible light. Above the turn-on voltage it behaves almost like a short circuit, so some form of current limiting is needed.

    If you connected a standard 1A diode across your power supply's + and - terminals so that it was forward biased, you would get about a 0.6V output, until the diode released the magic smoke that makes it work.
     
  4. jj_alukkas

    Thread Starter Well-Known Member

    Jan 8, 2009
    751
    5
    So why can't I light a 3.4V blue LED at anything more than 3V from an LM317 regulator in voltage-limiting mode? When I increase the voltage above 2.9V, the LED lights for only a few minutes. I asked the same thing some time ago in this forum, and everybody said I should use current-limiting mode, and that with its current limited it would light at 3.4V without stress. I am totally confused.
     
  5. thatoneguy

    AAC Fanatic!

    Feb 19, 2009
    6,357
    718
    The voltage drop across a diode increases with the current through it. This effect is more pronounced in LEDs.

    When you measure the minimum turn-on voltage with a meter, the diode glows faintly; so if the meter reads 2.1V, that is the turn-on voltage. Over the next few tenths of a volt it gets much brighter as the current rises, up to the maximum rated current. Above the maximum current, the LED will fail, either from the bond wire melting or from the bond itself cratering from heat.

    With a "normal" diode, the voltage drop may be 0.5V at 1mA but rise to 0.7V at 1A. Beyond that initial drop, the diode behaves almost like a short circuit.

    Look at the chart Audioguru posted above; it may help explain this visually. Notice the drastic change in current for minor changes in voltage once the turn-on threshold is reached. Since it isn't common to have a supply regulated to hundredths of a volt, a current-limiting resistor keeps the operating point in the correct range: if more current is drawn, more voltage is dropped across the resistor, so the circuit is self-correcting with a variable supply, up to a point.
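The self-correcting effect described above can be sketched with simple Ohm's-law arithmetic. The supply, LED, and resistor values below are assumed for illustration only, not taken from the thread:

```python
# Series-resistor current limiting: I = (Vs - Vf) / R.
# Assumed values: a 2.1 V LED fed from a nominal 5 V supply
# through a 150 ohm series resistor.
VF = 2.1    # LED forward voltage in volts (treated as roughly constant)
R = 150.0   # series resistor in ohms

def led_current_ma(vs):
    """Approximate LED current in mA for a given supply voltage vs."""
    return (vs - VF) / R * 1000

# A +/-5% supply swing moves the current only modestly:
for vs in (4.75, 5.0, 5.25):
    print(f"Vs = {vs:.2f} V -> I = {led_current_ma(vs):.1f} mA")
```

With these numbers a ±5% supply swing moves the current by only about ±8%, which is why a plain series resistor is usually enough for small indicator LEDs.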
     
  6. jj_alukkas

    Thread Starter Well-Known Member

    Jan 8, 2009
    751
    5
    So that's why I need a current limiter. And my LED won't light safely at 3.4V without a resistor, even though it's rated for that.
     
  7. GioD

    Active Member

    Mar 20, 2009
    30
    0
    LEDs are current-sensitive. You have to watch the current: put a meter in series with the LED and measure it. The voltage is a consequence; it varies due to several factors. When using LEDs, the aim is to provide the correct current. If you have a 20mA LED, a simple resistor will do; if you have a 350mA LED, a current regulator is necessary. Never power an LED without some current-limiting circuit (a resistor or a regulator). Note that the supply voltage must be greater than the LED voltage for the current limiter to work.
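The resistor case boils down to one formula, R = (Vs − Vf) / I. A minimal sketch with assumed example values (12 V supply, 2.1 V red LED, 20 mA target), rounding up to a standard value so the current errs on the low side:

```python
# Size a series resistor for a small LED: R = (Vs - Vf) / I_target.
# All values below are assumed for illustration.
def series_resistor(vs, vf, i_ma):
    """Ideal series resistance in ohms for a target current in mA."""
    return (vs - vf) / (i_ma / 1000)

r_ideal = series_resistor(12.0, 2.1, 20)     # 495 ohms ideal
# Round UP to the next standard E12 value so the current comes out low:
e12 = [470, 560, 680, 820, 1000]
r_std = min(v for v in e12 if v >= r_ideal)  # 560 ohms
i_actual = (12.0 - 2.1) / r_std * 1000       # roughly 17.7 mA
print(r_ideal, r_std, round(i_actual, 1))
```

Dissipation in the resistor (I²R, under 0.2 W here) should also be checked against the resistor's rating; a quarter-watt part is fine for this example.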


     
  8. jj_alukkas

    Thread Starter Well-Known Member

    Jan 8, 2009
    751
    5
    I get the point. Thanks, guys.
     
  9. Audioguru

    New Member

    Dec 20, 2007
    9,411
    896
    How do you know that your LED needs 3.4V?
    The forward voltage of an LED is specified as a range, not a single value. It might be 3.0V or it might be 3.6V.

    Since you didn't limit the current and your LED stopped working when you increased the supply voltage, I think you damaged your LED.
     
  10. Wendy

    Moderator

    Mar 24, 2008
    20,765
    2,536
  11. thatoneguy

    AAC Fanatic!

    Feb 19, 2009
    6,357
    718
  12. jj_alukkas

    Thread Starter Well-Known Member

    Jan 8, 2009
    751
    5
    Everywhere on the web it says blue LEDs are 3.4V, but I can't light mine at that voltage. We don't get part numbers for these LEDs, so there are no datasheets to check; bench-testing one or two from a set is the only solution. And then I find it isn't 3.4V but ~2.9V.

    Yes, I damaged them. Sorry I didn't mention that earlier.

    How can I find the exact voltage for an LED at its rated brightness with some circuit, other than from datasheets and bench-testing?

    Bill, I liked that one very much. You are a real hard worker; it's worth being published as a book. The only thing is the blue LED voltage it specifies. Are ones with lower voltages available?
     
    Last edited: Mar 26, 2009
  13. SgtWookie

    Expert

    Jul 17, 2007
    22,182
    1,728
    You can use an LM317 as a current regulator.
    Connect your power supply (I suggest around 7 or 8 volts) to the IN terminal.
    Connect a resistor (subsequently called R1) from the OUT terminal to the ADJ terminal.
    The ADJ terminal is where you get your current source.

    R1 = Vref / Desired Current, where 10mA <= Desired Current <= 1.5A.
    Equivalently:
    Output Current = Vref / R1
    Vref is the voltage between the OUT and ADJ terminals when 10mA or more current is flowing through the regulator. You can measure Vref by connecting the ADJ pin to GND, a 100 Ohm resistor from OUT to ADJ, 4 to 6 volts (roughly) to the IN terminal, and then measure from OUT to ADJ. It should be in a range of 1.2v to 1.3v, nominally 1.25v.

    Note that when used as a current regulator, the LM317 drops at least 3v across itself. So, if you want to get the Vf of blue LEDs, you should use at least 7v for a voltage supply.

    Note also that as an LED warms up, its Vf will decrease. If the current is not regulated/limited, this can lead to a melt-down condition.
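The formula above is easy to check numerically (Vref is assumed at its nominal 1.25 V; the datasheet range is roughly 1.2 V to 1.3 V):

```python
# LM317 as a current source: Iout = Vref / R1, with R1 from OUT to ADJ.
VREF = 1.25  # nominal reference voltage in volts (assumed)

def r1_for_current(i_ma):
    """Resistor (ohms) from OUT to ADJ for a desired output current in mA."""
    assert 10 <= i_ma <= 1500, "range given in the post: 10 mA to 1.5 A"
    return VREF / (i_ma / 1000)

def output_current_ma(r1):
    """Output current in mA for a given programming resistor in ohms."""
    return VREF / r1 * 1000

print(r1_for_current(20))     # 62.5 ohms for 20 mA
print(output_current_ma(62))  # roughly 20.2 mA with the nearest standard value
```

A 62 ohm standard resistor overshoots the 20 mA target by only about 1%, well inside the tolerance of Vref itself.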
     
  14. jj_alukkas

    Thread Starter Well-Known Member

    Jan 8, 2009
    751
    5

    Thanks SgtWookie.. This is one thing I needed for sometime..

    And that one, the most important of all.. I didnt realize it.. Thats my solution.. Thanks again..
     
  15. jj_alukkas

    Thread Starter Well-Known Member

    Jan 8, 2009
    751
    5
    With the LM317 in current-regulation mode, what will the output voltage be? 1.25V? Is that enough for an LED to light up?
     
  16. thatoneguy

    AAC Fanatic!

    Feb 19, 2009
    6,357
    718
    The voltage varies between manufacturers, even for the same color. Use the "Diode Check" function of your multimeter to get the value. Straying very far above the minimum voltage usually leaves you with a DED (Dark-Emitting Diode), as you've noticed.

    Some cheaper multimeters won't read above 3V in Diode Check. If yours won't, switch it to voltage mode and measure across the LED while powering it through a 1k resistor from a 9V battery; read the voltage when the LED is dimly lit. The 1k keeps the current well under 10mA with a new battery (around 6mA with a ~3V LED, and no more than 9mA even if the LED were shorted). The reading may be slightly above the minimum turn-on voltage, but within about a tenth of a volt.
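As a sanity check on the 9 V battery + 1 k resistor trick, the test current is easy to estimate; the ~3 V forward voltage below is an assumed value for a blue or white LED:

```python
# Current in the Vf-measurement rig: 9 V battery -> 1 kOhm resistor -> LED.
# The resistor sets the current; the meter across the LED reads Vf.
def test_current_ma(v_batt=9.0, vf=3.0, r=1000.0):
    """Approximate test current in mA."""
    return (v_batt - vf) / r * 1000

print(test_current_ma())        # 6.0 mA with a ~3 V blue/white LED
print(test_current_ma(vf=0.0))  # 9.0 mA worst case (LED shorted)
```

A few milliamps is enough to get a stable, readable Vf without stressing the LED.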

    To be safe, I usually run the colored-lens LEDs at around 2-5mA as indicators, and the clear-lens super-bright ones at 20mA.

    All LEDs dim over their lifetime, but white LEDs are the worst. Running white LEDs at their maximum current (~23mA) makes them dim and shift toward yellow much faster, noticeably within a few hundred hours of continuous use. The reason is that the white light is produced by a phosphor coating on top of a blue (or UV) die, much like the coating in a fluorescent bulb, and the phosphor degrades with heat and drive current.
     
    Last edited: Mar 26, 2009
  17. SgtWookie

    Expert

    Jul 17, 2007
    22,182
    1,728
    In current regulation mode, it outputs current. :)

    It outputs whatever current you have "programmed" it for (within limits: 10mA <= Iout <= 1.5A) by connecting a resistor from OUT to ADJ. The regulator adjusts its output so that Vref (about 1.25v) is maintained between the OUT and ADJ terminals.

    If you've programmed a regulator that has a Vref of 1.25v to output 20mA with a 62.5 Ohm resistor from OUT to ADJ, and your LED will drop 2.323v across itself with a 20mA current, then you will measure 2.323v from the anode to the cathode of the LED.

    Don't forget that the LM317 in current regulation mode has a "dropout" of about 3v.
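The dropout note translates into a minimum-supply check. A sketch using the numbers from this post (the ~3 V overhead is the post's estimate for the LM317 in current-regulator mode, not a datasheet guarantee):

```python
# Minimum supply for an LM317 current source driving one LED:
# Vin_min ~= Vf(LED) + ~3 V overhead (Vref across R1 plus regulator dropout).
def vin_min(vf_led, overhead=3.0):
    """Rough minimum input voltage in volts."""
    return vf_led + overhead

print(vin_min(2.323))  # ~5.3 V for the 2.323 V example LED
print(vin_min(3.4))    # ~6.4 V for a 3.4 V blue LED, hence the 7-8 V advice
```

This is why SgtWookie suggested a 7 to 8 volt input earlier in the thread: it leaves headroom even for a high-Vf blue or white LED.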
     
  18. jj_alukkas

    Thread Starter Well-Known Member

    Jan 8, 2009
    751
    5
    If it sets the voltage automatically, then it's cool. Will try it out.
     
  19. jj_alukkas

    Thread Starter Well-Known Member

    Jan 8, 2009
    751
    5
    OK, I checked the drop across the LEDs with a 1k resistor and a 9v battery.
    White gave 3.1~3.3V at 6mA; blue gave 3.0V at 6mA, which I had mistaken for 3.4V.
    thatoneguy, thanks for your great idea. It's a good and easy trick.

    I have some doubts: the white LED lit up fully bright at 3.1V at 6mA. Is 6mA enough for this one, or is my meter reading wrong? The blue also gave good light at 6mA. I have used about 25 of the same blue LEDs on my car as under-neons with an LM317 at 2.9V, but each draws 12mA. Is that OK?
     