Question regarding Ohm's law

Thread Starter

sebastianpatten

Joined Nov 28, 2010
18
This should be a simple one for you guys :)

I was looking at the first diagram on Chapter 1 here http://forum.allaboutcircuits.com/blog.php?b=378

And it has a source voltage of 9 V, and he wants to drive a 2.5 V LED that has a current requirement of 20 mA. Now I thought you had to use a voltage divider (two resistors in series) to get down to 2.5 volts, but he uses just one resistor.

How does it work? I don't understand.
 

Wendy

Joined Mar 24, 2008
23,408
LEDs drop a roughly fixed voltage, similar to zeners, only forward biased. I have used them as a constant voltage source for some applications. You have to figure out how much voltage the LED will drop, then work from there. The exact voltage drop depends strongly on the color of the LED.

I gave a set of approximate voltages for various colors, but the ultimate authority is the data sheet of course. If you are designing on the fly it will get you into the ballpark, which is good enough for most cases.

This is what figure 1.2 was trying to explain. If you vary the current between 1 mA and 30 mA, there won't be much change in the voltage drop (there will be a little, but for the sake of argument I treat it as negligible).



The math variable for this is Vf, the LED forward voltage drop.
 

Von

Joined Oct 29, 2008
65
Biasing LEDs is pretty straightforward.

Subtract the LED's Vf from the source voltage (Vsource).

Use Ohm's law (R = E/I) to determine R.

Choose the next highest commonly available resistor value (330 Ω for the 9 V example; see the sketch below).

Enjoy.
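
To make that concrete, here's a minimal Python sketch of those steps using the 9 V supply and the 2.5 V / 20 mA LED from the linked article (the variable names are just for illustration):

```python
v_source = 9.0   # supply voltage, volts
vf = 2.5         # LED forward voltage drop (Vf), from the data sheet
i_led = 0.020    # desired LED current: 20 mA

# Subtract Vf from the source, then apply Ohm's law: R = E / I
r = (v_source - vf) / i_led
print(r)  # 325.0 ohms -> pick the next standard value, 330
```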
 

Thread Starter

sebastianpatten

Joined Nov 28, 2010
18
Ohhhhh - I think I had the basics wrong in my head. I thought you had to get the voltage of the wire to 2.5 volts and the current to 0.02 amps or the LED would pop :) I guess it doesn't work that way.

Is it correct that this is what happens, then? If you need to power an LED, it might require 0.02 amps (depending on its data sheet), so you use a resistor to get 0.02 amps from the 9 V power source. Then the LED just sorts itself out with regard to voltage. I'm assuming this magical appearance of voltage on the LED is due to some internal resistance within the LED?

So could you just put an LED onto a 100 V power supply, assuming you used the correct resistor to get the current to 0.02 A?

Thanks!
 

#12

Joined Nov 30, 2010
18,224
Yes. As long as you subtract the "magical" voltage of the LED, then use the remaining voltage to calculate the resistor for 0.02 amps.

100 V - 2.5 V = 97.5 V
97.5 V / 0.02 A = 4875 ohms
5.1 k is the next larger value in the 5% category of resistors.
5.6 k is the next higher value in the 10% category.
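
In case it helps, here's a rough Python sketch of that last step: rounding a computed resistance up to the next standard value. The E24 list is the usual 5% series and E12 the 10% series; the function name is just illustrative:

```python
import math

# Standard resistor decade values: E24 (5% parts) and E12 (10% parts).
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_standard(r_ohms, series):
    """Round a computed resistance up to the next standard value."""
    decade = 10 ** math.floor(math.log10(r_ohms))
    mantissa = r_ohms / decade
    for v in series:
        if v >= mantissa:
            return v * decade
    return series[0] * 10 * decade  # roll over into the next decade

r = (100 - 2.5) / 0.02         # 4875 ohms
print(next_standard(r, E24))   # 5100.0 -> 5.1 k (5% series)
print(next_standard(r, E12))   # 5600.0 -> 5.6 k (10% series)
```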
 

Thread Starter

sebastianpatten

Joined Nov 28, 2010
18
Ok I see - two more questions and one follow up:

1) Does this logic apply to anything else other than an LED? If so, how would I tell/find out?

2) So does this mean I can get a max of 40 LEDs on my circuit in series? 100 V / 2.5 V = 40

Follow up: This magical voltage is from the LED's resistance, right?

Thanks
 

#12

Joined Nov 30, 2010
18,224
1) This logic applies to any semiconductor junction that "breaks over". There is a threshold voltage that must be overcome to get current to flow. It is not a resistance, and the junction will not protect itself from excessive current; something else has to limit it.

2) You cannot depend on LEDs to have exactly the same breakover voltage as every other LED in the bag. If all 40 LEDs were magically 2.500001 volts, no current would flow at all. If they were all magically 2.499999 volts, there would be no resistance to limit the current and they would all burn up in less than a second.
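
A rough back-of-the-envelope illustration of why the resistor matters, assuming a hypothetical string of 36 LEDs (not 40) with a nominal Vf of 2.5 V, which leaves 10 V across a 500 Ω resistor for a nominal 20 mA:

```python
v_supply, n_leds, r = 100.0, 36, 500.0  # hypothetical 36-LED string

for vf in (2.45, 2.50, 2.55):           # roughly a +/-2% spread in Vf
    i = (v_supply - n_leds * vf) / r
    print(f"Vf = {vf:.2f} V -> {i * 1000:.1f} mA")  # 23.6, 20.0, 16.4 mA
```

With the resistor in place, a 2% spread in Vf only moves the current by about 20%. With 40 LEDs and no resistor, that same spread swings the string between no current at all and a near short circuit.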
 