LED connection question

Discussion in 'The Projects Forum' started by charlietm, Feb 20, 2007.

  1. charlietm

    Thread Starter New Member

    Feb 20, 2007
    3
    0
    Hi.
    I made a little lamp with bright LEDs using a 110 V to 12 V transformer and the right resistors for my LED setup. After a while some of the LEDs stopped working. Am I missing something? Why might this have happened?
     
  2. hgmjr

    Moderator

    Jan 28, 2005
    9,030
    214
    It sounds like the LED devices were overstressed.

    Can you provide us with a copy of the circuit that you have so that we can assist you with diagnosing the problem?

    hgmjr
     
  3. n9352527

    AAC Fanatic!

    Oct 14, 2005
    1,198
    4
    Did you rectify the AC?
     
  4. beenthere

    Retired Moderator

    Apr 20, 2004
    15,815
    282
    n9352527 is correct - LEDs do not make good rectifier diodes. You have to place a "real" diode, like a 1N4001, in series to protect the LEDs.
     
  5. charlietm

    Thread Starter New Member

    Feb 20, 2007
    3
    0
    Power source: AC Adapter
    Input: AC120V Output: DC12V 200mA
    Loop 1: 82 Ohm resistor, 3 White super bright LEDs (3.6V forward @20mA) in series.
    Loop 2: 270 Ohm resistor, 2 White super bright LEDs (3.6V forward @20mA) in series.

    Loop 1 stopped working but Loop 2 is still working.

    To rectify the AC should I place a "real" diode connected in series after the power source? One diode per loop?

    Thanks guys!
     
  6. nomurphy

    AAC Fanatic!

    Aug 8, 2005
    567
    12
    The negative excursions of the AC waveform (17 Vpk = 12 Vrms * 1.414) may exceed the reverse voltage rating of the LEDs. Because the LEDs are in series, the reverse voltage divides among them, so you may be getting away with it working (somewhat).

    Adding a Schottky diode such as a B340A in series with each loop would resolve the problem with the least voltage drop. However, any old 1N400x or 1N4148 / 1N914 from a junk drawer would work as well -- but you may need to adjust your resistor values to compensate for the diode's voltage drop in order to maintain the present brightness level.
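    To see how much the resistor values would need to change, here's a minimal sketch of the arithmetic. The 0.7 V silicon and 0.3 V Schottky drops are typical assumed values, not measurements from the parts in this thread:

```python
# Sketch: recompute the series (ballast) resistor once a protection
# diode is added.  LED values are from this thread; the diode drops
# (0.7 V silicon, 0.3 V Schottky) are typical assumptions.

def ballast_resistor(v_supply, led_vf, n_leds, i_led, v_diode=0.0):
    """Resistance needed to set i_led through n_leds series LEDs."""
    v_across_r = v_supply - n_leds * led_vf - v_diode
    if v_across_r <= 0:
        raise ValueError("supply too low for this LED string")
    return v_across_r / i_led

# Loop 1: three 3.6 V LEDs at 20 mA from a nominal 12 V supply
print(round(ballast_resistor(12.0, 3.6, 3, 0.020)))        # no diode
print(round(ballast_resistor(12.0, 3.6, 3, 0.020, 0.7)))   # with a 1N4001
print(round(ballast_resistor(12.0, 3.6, 3, 0.020, 0.3)))   # with a Schottky
```

    Note how little headroom loop 1 has: with only 1.2 V across the resistor, even a 0.7 V diode drop changes the required resistance substantially.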
     
  7. wireaddict

    Senior Member

    Nov 1, 2006
    133
    0
    Since you're using a wall wart 12 VDC power supply, reverse voltage across the LEDs can't be your problem. Measure your DC output voltage; it's probably over 12 V for one thing. I've seen them run as high as 17 V, open circuit. Another thing that's probably happening: with the LEDs in series the same current flows through all of them, but their forward voltages won't match exactly, so one LED may drop less voltage than rated. That imposes higher voltage across the other LEDs in the string.

    Try putting a 510 ohm, 1/2 W resistor in series with every LED & connect each LED-resistor pair in parallel across the 12 V. Depending on the output voltage the dropping resistance value may need to be increased.
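    As a rough sketch of how the per-LED current in that parallel arrangement varies with the actual wall-wart voltage (the 3.6 V forward drop is the thread's figure; the supply voltages below are illustrative):

```python
# Sketch of the parallel suggestion: one 510 ohm resistor per LED,
# each LED-resistor pair across the supply.  Supply voltages are
# illustrative, since unregulated wall warts drift well above nominal.

def led_current_ma(v_supply, r_ohms, led_vf=3.6):
    """Current (mA) through one LED behind its own series resistor."""
    return max(0.0, (v_supply - led_vf) / r_ohms) * 1000

for v in (12.0, 14.0, 17.0):
    print(f"{v:4.1f} V supply -> {led_current_ma(v, 510):.1f} mA per LED")
```

    At 17 V open-circuit the current climbs past 26 mA, which is why the dropping resistance may need to be increased.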
     
  8. hgmjr

    Moderator

    Jan 28, 2005
    9,030
    214
    A recent thread on LEDs might give you a few ideas on driving LEDs using a constant current source.

    hgmjr
     
  9. nomurphy

    AAC Fanatic!

    Aug 8, 2005
    567
    12
    I ignorantly jumped on the bandwagon regarding the AC voltage, but wireaddict was astute enough to point out that your output is rated as 12VDC. This makes adding any diodes nearly pointless (although one could drop the voltage enough to improve performance, see below).

    Use a volt meter or DMM to measure the actual voltage output, if it's as high as 17V then consider:

    LOOP1:
    1.2V = 12V - (3.6V x 3)
    14.6mA = 1.2V / 82 ohms

    LOOP2:
    4.8V = 12V - (3.6V x 2)
    17.8mA = 4.8V / 270 ohms

    However:

    LOOP1:
    6.2V = 17V - 10.8V
    75.6mA = 6.2V / 82 ohms

    LOOP2:
    9.8V = 17V - 7.2V
    36.3mA = 9.8V / 270 ohms


    Depending on the output voltage, you may be burning up your LEDs -- especially in loop 1. You need to check the forward current spec for the LEDs, and adjust the ballast resistor value accordingly for the ACTUAL output voltage.
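    The two scenarios above can be reproduced in a few lines, so whatever supply voltage you actually measure can be plugged in (resistor and forward-voltage values are from this thread):

```python
# Reproduce the loop calculations above for any measured supply
# voltage.  Resistor values and the 3.6 V forward drop are the
# figures given earlier in the thread.

def loop_current_ma(v_supply, r_ohms, n_leds, led_vf=3.6):
    """Current (mA) through a string of n_leds LEDs and one resistor."""
    return (v_supply - n_leds * led_vf) / r_ohms * 1000

for v in (12.0, 17.0):
    print(f"supply {v:4.1f} V: "
          f"loop1 {loop_current_ma(v, 82, 3):.1f} mA, "
          f"loop2 {loop_current_ma(v, 270, 2):.1f} mA")
```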
     
  10. charlietm

    Thread Starter New Member

    Feb 20, 2007
    3
    0
    Thank you guys.

    I was also wondering: how does the current rating (mA) of a power source (like the 12 VDC "wall wart" I'm using) affect an LED circuit?
    I know that when determining the resistor values, the only current used in the calculation is the LED's forward current, not the source's.
    Thanks.
     
  11. kubeek

    AAC Fanatic!

    Sep 20, 2005
    4,670
    804
    The source's current rating is its maximum capability to provide current; it doesn't say anything about the current that actually flows.
     
  12. Ron H

    AAC Fanatic!

    Apr 14, 2005
    7,050
    657
    The current rating of the source tells you the maximum current you can draw from it without damaging it. The load (your LEDs) will determine the actual current.
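    A quick sanity check of that point, using the thread's nominal figures (and assuming the adapter really delivers 12 V):

```python
# The LED strings set the actual draw; the adapter's 200 mA rating
# is only a ceiling.  Values are the thread's nominal figures.

loop1_ma = (12.0 - 3 * 3.6) / 82 * 1000   # loop 1: ~14.6 mA
loop2_ma = (12.0 - 2 * 3.6) / 270 * 1000  # loop 2: ~17.8 mA
total_ma = loop1_ma + loop2_ma

print(f"total draw ~{total_ma:.1f} mA of the 200 mA available")
```

    Both loops together draw well under the 200 mA rating, so the adapter itself is not the limiting factor here.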
     