Several days ago I read this post:
http://forum.allaboutcircuits.com/threads/why-does-the-current-get-low.127064/page-2#post-1035886
This part is what I want to talk about:
I told one of my friends that we can turn on an LED with 100 V as long as we keep the current constant, even without a resistor! I said it doesn't matter how much voltage you use. He said you can't. I gave him the example above (I mean this part: (120 - 3) / 0.01 = 11,700), and he said you drop 117 V across the resistor. He also told me about something like a "gamma voltage" and said I can't turn on an LED with just any voltage. Is he right?

The earlier reply the question refers to said:

LEDs are not voltage devices, they are current devices. If you had 120 volts and wanted 10 mA through the LED, then (120 - 3) / 0.01 = 11,700 ohms (11.7 kΩ) sets the current at 0.01 amps. The same LED would be just as bright in either circuit. With such a high supply voltage, subtracting the forward voltage of the LED isn't going to make much difference: if I ignored the 3 V forward drop and just put 120 V across that 11,700 Ω resistor, I'd get about 0.010256 amps, a difference of 256 microamps. Certainly nothing that's going to blow up anybody's circuit.
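To make that arithmetic easy to check, here is a minimal Python sketch of the same resistor calculation. The 120 V supply, 3 V forward drop, and 10 mA target are the values assumed in the quoted post; nothing else is implied about any particular LED.

# Series-resistor calculation from the quoted post.
SUPPLY_V = 120.0   # supply voltage in volts
LED_VF = 3.0       # assumed LED forward voltage in volts
TARGET_I = 0.010   # desired LED current in amps

# Resistor sized with the forward drop taken into account:
r_exact = (SUPPLY_V - LED_VF) / TARGET_I   # (120 - 3) / 0.01 = 11,700 ohms

# Current through that same resistor if the 3 V drop is ignored:
i_ignoring_vf = SUPPLY_V / r_exact         # 120 / 11,700 ≈ 0.010256 A

print(f"Resistor: {r_exact:.0f} ohms ({r_exact / 1000:.1f} kOhm)")
print(f"Current ignoring Vf: {i_ignoring_vf * 1000:.3f} mA")
print(f"Difference: {(i_ignoring_vf - TARGET_I) * 1e6:.0f} uA")

Running it prints 11,700 ohms, 10.256 mA, and a 256 uA difference, which matches the figures in the quote: at a high supply voltage the LED's forward drop barely changes the current.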