I've seen several posts that come very close to this, but don't quite answer my question. I have a circuit with a 3 V source, a single LED with a forward drop of 1.65 V, and a 2 kΩ resistor in series. The LED lights up fine.
Question: when applying Ohm's law to this circuit, I would have guessed it was drawing 1.5 mA (3 V / 2 kΩ = 1.5 mA). When I checked with a meter it was only pulling 0.6 mA! I was completely lost. I tested the resistance in the circuit, double-checked the battery voltage... everything. Then I searched around and found that you have to subtract the Vf of the LED, in my case around 1.65 V. Now I get 1.35 V / 2000 Ω = 0.675 mA! Got it! BUT... why do I subtract the LED's drop? Shouldn't my first calculation have been correct? Is this something special to LEDs, or what? When do I know to subtract a Vf value in a series circuit and when not to? I hope my question makes sense.
Hope you can help me understand! Looking forward to any pointers/tips!
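To make the comparison concrete, here's a minimal sketch of the two calculations using the numbers from my circuit (3 V supply, 1.65 V LED drop, 2 kΩ resistor), treating the LED's forward drop as roughly constant once it's lit:

```python
# Numbers from the circuit above. Kirchhoff's voltage law says the
# resistor only sees the supply voltage minus the LED's forward drop.
V_SUPPLY = 3.0   # battery voltage (V)
V_F = 1.65       # LED forward drop (V), roughly constant once lit
R = 2000.0       # series resistor (ohms)

naive_ma = V_SUPPLY / R * 1000            # ignores the LED entirely
actual_ma = (V_SUPPLY - V_F) / R * 1000   # subtracts Vf first

print(f"naive:  {naive_ma:.3f} mA")   # 1.500 mA
print(f"actual: {actual_ma:.3f} mA")  # 0.675 mA
```

The second number matches what my meter read (within the tolerance of the components).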