Question about LEDs and Current Limiting Resistors

Discussion in 'General Electronics Chat' started by cyberalchemist, Aug 15, 2010.

  1. cyberalchemist

    Thread Starter New Member

    Aug 12, 2010
    I was under the impression that when using a current-limiting resistor in series with an LED, it should go between the positive side of the voltage source and the anode of the LED. Recently I've seen some schematics where the resistor was placed between the cathode and ground. I'm curious which is considered the better approach and why.
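    Either way, the resistor value comes from the same Ohm's law calculation: the voltage left over after the LED's forward drop, divided by the desired current. A quick sketch with assumed example values (5 V supply, 2.0 V forward drop, 20 mA target — not figures from this thread):

    ```python
    # Series resistor for an LED: R = (V_supply - V_f) / I_f.
    # The result is the same whichever side of the LED the resistor sits on.
    V_SUPPLY = 5.0   # supply voltage, volts (assumed example)
    V_F = 2.0        # LED forward voltage, volts (assumed example)
    I_F = 0.020      # desired LED current, amps (assumed example)

    r = (V_SUPPLY - V_F) / I_F   # voltage across the resistor / current through it
    print(f"Series resistor: {r:.0f} ohms")  # prints "Series resistor: 150 ohms"
    ```

    In practice you'd round to the nearest standard value at or above the result (here, 150 Ω is already a standard E12 value).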
  2. hgmjr


    Jan 28, 2005
    The order of the LED and its series current-limiting resistor is not critical. If the series resistor was connected to ground, then I would expect that the LED was being driven from its anode rather than its cathode end.

  3. Externet

    AAC Fanatic!

    Nov 29, 2005
    It works equally well at either end of the LED because it is in series anyway.
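    That "in series anyway" point can be made explicit: Kirchhoff's voltage law around the loop gives the same equation regardless of component order, so the current is identical either way. A minimal sketch (same assumed example values as above):

    ```python
    # In a single series loop the same KVL equation applies no matter
    # where the resistor sits: V_supply = I*R + V_f.
    def led_current(v_supply, v_f, r):
        """Loop current for a supply, one LED, and one series resistor."""
        return (v_supply - v_f) / r

    # Resistor on the anode side or the cathode side -- same loop,
    # same equation, same current:
    i_anode_side = led_current(5.0, 2.0, 150.0)
    i_cathode_side = led_current(5.0, 2.0, 150.0)
    print(i_anode_side == i_cathode_side)  # prints "True"
    ```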
  4. marshallf3

    Well-Known Member

    Jul 26, 2010
    Matters not. I tend to put the resistor at the Vcc supply (or most positive point) in a circuit, but may change that if it makes the PC board layout better.

    I was trying to figure out a joke about which side depends on what hemisphere you're in due to the Coriolis effect, but I'm terrible at writing jokes.
  5. Wendy


    Mar 24, 2008
  6. SgtWookie


    Jul 17, 2007
    I prefer to have the resistor on the side of the LED that is electrically furthest from ground. This is because I'm somewhat of a klutz. If I happen to drop a pair of tweezers, piece of wire, or otherwise manage to short the junction of the LED and resistor to ground, it is unlikely that the resistor will be damaged since I always at least double the wattage requirement.

    However, if the limiting resistor were between the cathode and ground and the LED/resistor junction got shorted to ground, the resistor would be bypassed entirely and the LED could be blown instantly; at the very least its service life would be diminished.
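    The numbers make this fault scenario vivid. A sketch with assumed example values (same 5 V / 2 V / 150 Ω figures as above, plus an assumed half-ohm of residual wiring resistance):

    ```python
    # Shorting the LED/resistor junction to ground bypasses whatever
    # sits between that junction and ground.  With the resistor on the
    # cathode side, that's the resistor itself.
    V_SUPPLY, V_F, R_LIMIT = 5.0, 2.0, 150.0   # assumed example values
    R_WIRING = 0.5   # assumed residual wiring/supply resistance, ohms

    i_normal = (V_SUPPLY - V_F) / R_LIMIT      # resistor in circuit
    i_fault = (V_SUPPLY - V_F) / R_WIRING      # resistor bypassed by the short
    print(f"{i_normal*1000:.0f} mA normal vs {i_fault:.0f} A fault")
    # prints "20 mA normal vs 6 A fault"
    ```

    Several amps through a small indicator LED, even briefly, is more than enough to destroy it, which is why SgtWookie's habit of keeping the resistor on the hot side pays off.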