Help with Ohm's Law & LED

Discussion in 'General Electronics Chat' started by c0de3, Jan 11, 2012.

  1. c0de3

    Thread Starter Active Member

    May 1, 2009
    50
    1
    I've seen several posts that come very close to this, but none quite answer my question. I have a circuit with a 3v source, a single LED with a drop of 1.65v, and a 2k resistor. The LED lights up fine.

    Question: when calculating Ohm's law in this circuit I would have guessed the circuit was drawing 1.5mA (3v/2k = 1.5mA). When I checked with a meter it was only pulling 0.6mA!!! So I was completely lost. I tested the resistance in the circuit, double checked the voltage of the battery... everything. Then I did a Google search and found that you have to subtract the Vf of the LED, in my case around 1.65V. So now I get 1.35/2000 = 0.675mA!!! Got it! BUT... why do I subtract the LED's voltage? Shouldn't my first calc have been correct? Is this something special just for LEDs, or what? When do I know to subtract a Vf value in a series circuit and when not to? I hope my question makes sense.
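
    A quick numeric sketch of both calculations, assuming the nominal values above (3V supply, 1.65V LED drop, 2k resistor), in Python:

    # Series LED + resistor: the LED's forward drop must be subtracted first.
    v_supply = 3.0      # battery voltage (nominal)
    v_led = 1.65        # measured LED forward drop
    r = 2000.0          # series resistance in ohms

    naive = v_supply / r                  # ignores the LED drop
    corrected = (v_supply - v_led) / r    # subtracts the LED drop first

    print(f"naive:     {naive * 1000:.3f} mA")      # 1.500 mA
    print(f"corrected: {corrected * 1000:.3f} mA")  # 0.675 mA, close to the ~0.6mA measured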

    Hope you can help me understand! Looking forward to any pointers/tips!
     
  2. w2aew

    Member

    Jan 3, 2012
    219
    64
    The way to think about this is that the sum of the voltages around the loop will always equal zero. Starting at the negative end of the battery... you go up to 3V at the + end of the battery, then you drop 1.65 volts across the LED, which leaves the remaining 1.35 volts to appear across the resistor. It's a series circuit, so the current has only one path to follow, and it creates voltage drops across the components it flows through (the LED and the resistor). Make sense?
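
    A small sketch of that loop walk with the same nominal numbers (just a Python illustration, keeping a running potential as we go around):

    # Walk the loop from the battery's - terminal, keeping a running potential.
    v = 0.0
    v += 3.0            # rise through the battery:   v = 3.00 V
    v -= 1.65           # drop across the LED:        v = 1.35 V left over
    v_resistor = v      # ...and that 1.35 V is what sits across the resistor
    v -= v_resistor     # drop across the 2k resistor: v = 0.00 V, back where we started

    current = v_resistor / 2000.0       # Ohm's law applied to the resistor only
    print(f"{current * 1000:.3f} mA")   # 0.675 mA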
     
  3. c0de3

    Thread Starter Active Member

    May 1, 2009
    50
    1
    W2AEW,
    Thanks for responding. I understand everything you said... but I still don't know why the voltage of the LED is removed. Why isn't the drop of the resistors removed?

    To be honest my circuit is:

    P+---LED---R1k---R1k---N-


    So my voltage drops are measuring 1.65, 0.63, 0.63 (the battery is only at 2.91, I'd say). How do I know that I only subtract the LED voltage drop? Why not the R voltages also? I guess it is hard to explain my question. Is it because LEDs are constant voltage? So anything in a series circuit that is constant voltage must be subtracted?
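
    Plugging the measured numbers into that exact circuit (2.91V battery, 1.65V LED, two 1k resistors) seems to line up with the 0.63V readings, as a quick Python check:

    # P+ --- LED --- 1k --- 1k --- N-
    v_batt = 2.91               # measured battery voltage
    v_led = 1.65                # measured LED drop
    r_total = 1000.0 + 1000.0   # the two series resistors

    i = (v_batt - v_led) / r_total     # one current flows through everything (series)
    v_each_1k = i * 1000.0             # V = I*R across each 1k resistor

    print(f"I = {i * 1000:.2f} mA")              # ~0.63 mA
    print(f"each 1k drops {v_each_1k:.2f} V")    # ~0.63 V, matching the meter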

    Thanks again for help! I hope I understand soon!
     
  4. w2aew

    Member

    Jan 3, 2012
    219
    64
    The voltage drop across a resistor is DIRECTLY proportional to the current flowing through it - remember V=I*R. A diode is different. The voltage across a diode is NOT linearly related to the current flowing through it. It can be "approximated" as a fixed voltage drop for reasonable forward current levels. Since it is in series with the resistors, the voltage drop across the diode PLUS the drop across the resistors MUST equal the battery voltage.

    If the circuit only had a 3V supply and a 2K resistor, then you'd know that the resistor has 3V across it, and the current must be 3V/2000 = 1.5mA. But, since the diode (LED) is in series with the resistors, that takes away from the voltage that will appear across the resistors (because the SUM of all of the voltage drops equals the supply voltage). Since the voltage dropped across the LED is 1.65V, that leaves the remaining voltage across the resistors...

    It really comes down to this simple fact - that the SUM of all of the voltages around any given closed loop will always equal zero.

    V(bat) + V(resistors) + V(LED) = 0

    <be mindful of polarity> As you go around the loop, you could say that when you pass from the - to the + terminal of the battery, the voltage increases by 3V (positive), then the voltage decreases by 1.65V (negative) across the LED, leaving 1.35 for the resistors...
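
    The same sign convention, written out as a quick check (a sketch with the nominal 3V / 1.65V values):

    # Sum the voltages around the loop: + for rises, - for drops.
    rise_battery   = +3.0
    drop_led       = -1.65
    drop_resistors = -(3.0 - 1.65)   # whatever is left over lands on the resistors (1.35 V)

    loop_sum = rise_battery + drop_led + drop_resistors
    print(loop_sum)   # 0.0 - the sum around the closed loop is zero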
     
  5. c0de3

    Thread Starter Active Member

    May 1, 2009
    50
    1
    Thanks, it is making sense now. Seems to me a safe thing to do is solve all your voltages first. Then if you have a resistor in the circuit (or a group of them) you can solve I (current) for that resistor. Since the circuit is series you'll then know the current for all parts of the circuit. Not sure if I explained that right but it is making sense to me now. Thanks for the help.
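
    That procedure, written out as a small helper function (the name is just illustrative):

    def series_led_current(v_supply, v_led, r_total):
        """Current in a series LED + resistor string: subtract the (roughly
        fixed) LED drop, then apply Ohm's law to the resistance that the
        remaining voltage appears across."""
        return (v_supply - v_led) / r_total

    # Series circuit, so this one current applies to every part of it:
    print(series_led_current(3.0, 1.65, 2000.0) * 1000)    # ~0.675 mA (nominal battery)
    print(series_led_current(2.91, 1.65, 2000.0) * 1000)   # ~0.63 mA (measured battery)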
     
  6. w2aew

    Member

    Jan 3, 2012
    219
    64
    Yep - sounds like you're "getting it".
     
  7. MrChips

    Moderator

    Oct 2, 2009
    12,449
    3,368
    The straightforward answer is... apply Ohm's Law... I = V/R
    V is the voltage across the resistor R, not the voltage of the battery.
     
  8. Adjuster

    Well-Known Member

    Dec 26, 2010
    2,147
    300
    The voltages around any loop must sum to zero (if we take their directions into account). Formally, this is known as Kirchhoff's Voltage Law.

    If it were as you first thought, and the battery voltage and resistor defined the current, then the whole of the battery voltage would have to appear across the resistor. If the LED voltage were added on to it, the total would come to more than the battery voltage we started with, which is clearly impossible.
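
    A quick way to see that contradiction with the numbers from this thread:

    # If the full 3 V appeared across the resistors AND the LED still dropped 1.65 V,
    # the loop would have to contain more voltage than the battery supplies:
    total_drops = 3.0 + 1.65
    print(total_drops)   # 4.65 V, but the battery only provides 3 V,
                         # so the resistors cannot see the full 3 V.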
     
  9. c0de3

    Thread Starter Active Member

    May 1, 2009
    50
    1
    Wow, guys thanks a lot. I'm down with it now. Not sure why it was hard for me to understand that the 3v isn't all at the resistor. Some is over on the LED! I think I was looking at ohms law from the point of view of the entire series circuit, rather than individual components... not sure. Either way I get it now. Thanks for the help! Tonight it is on to Parallel!!! :rolleyes:
     
  10. hhhunzai

    Member

    Nov 14, 2011
    72
    0
    This is about the knee voltage of a diode.
    Every diode has a particular knee voltage:
    silicon diodes have a 0.7V knee voltage and germanium diodes have 0.3V.
    In this case, I think it is about 1.65V, so the remaining voltage (what's left after subtracting the knee voltage) appears across the resistors.
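
    A small sketch of how the assumed knee (forward) voltage changes the result, reusing the 3V supply and 2k resistor from the first post:

    # Same 3 V supply and 2 k resistor, different knee voltages (values quoted above).
    for name, v_knee in [("silicon diode", 0.7), ("germanium diode", 0.3), ("this LED", 1.65)]:
        i_ma = (3.0 - v_knee) / 2000.0 * 1000
        print(f"{name}: {i_ma:.3f} mA")
    # silicon diode: 1.150 mA, germanium diode: 1.350 mA, this LED: 0.675 mA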
     