Beginner question about resistance

Thread Starter

phase ghost

Joined Mar 3, 2010
14
I'm trying to understand how to calculate resistance for a basic circuit, but I'm confused about how to get from one voltage/current to another. For instance:

I have a (a) 9v/650ma source and I want to get it to (b) 2v/20ma before an led.

Using Ohm's law I get 13.846 ohms from (a) and 100 ohms from (b). I'm not sure what size resistor to use to get the input voltage and current to the desired state though. Would a 100 ohm resistor do it? I've done a bunch of exercises to find ohms from voltage and current and vice versa, but getting from point a to point b is baffling. Thanks!
 

Thread Starter

phase ghost

Joined Mar 3, 2010
14
Pardon my extreme beginner question, but I think I may be getting somewhere.

If I want 20 mA to be the result, do I use Ohm's law and divide 9 V by 20 mA to get the proper resistance needed? I got 450Ω as my answer.
 

bertus

Joined Apr 5, 2008
22,278
Hello,

The values you calculated are the minimum load resistance for the power supply
and the equivalent resistance of the LED at 20 mA.

To calculate the resistor needed to connect the LED to the power supply, use the following calculation:
Series resistor = (supply voltage - LED voltage) / LED current.
In your case it would be (9 - 2) / 0.02 = 7 / 0.02 = 350 Ohms.
The nearest standard values are 360 or 390 Ohms.
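The same calculation in a short Python sketch (the function names and the E24 value list are my own additions, not from this thread; rounding up to the next standard value keeps the current at or below the target):

```python
# Common E24 standard resistor values in the 100-910 ohm decade.
E24 = [100, 110, 120, 130, 150, 160, 180, 200, 220, 240, 270, 300,
       330, 360, 390, 430, 470, 510, 560, 620, 680, 750, 820, 910]

def led_series_resistor(v_supply, v_led, i_led):
    """Exact resistance that drops (v_supply - v_led) volts at i_led amps."""
    return (v_supply - v_led) / i_led

def nearest_standard(r, series=E24):
    """Smallest standard value >= r, so the LED current stays at or below target."""
    return min(v for v in series if v >= r)

r = led_series_resistor(9.0, 2.0, 0.020)   # about 350 ohms
print(nearest_standard(r))                 # 360
```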

Here is a drawing of how things are connected:



Greetings,
Bertus
 


beenthere

Joined Apr 20, 2004
15,819
Here is a link to the ebook chapter that covers LEDs - http://www.allaboutcircuits.com/vol_3/chpt_3/12.html

Briefly, you use the source voltage and the LED conduction drop. 9 volts (source) minus 2 volts (conduction drop of LED) leaves you 7 volts to consider. Limiting the current to 20 mA needs a resistance, given by R = E/I. 7 volts divided by 0.02 amps is 350 ohms. The closest safe standard value is 360 ohms, so that is the resistor to use.
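A quick way to sanity-check the 360 ohm pick (a sketch of my own, not from the thread) is to compute the current that actually flows with the standard value in place:

```python
# Actual LED current with the chosen 360 ohm standard resistor.
v_supply, v_led, r = 9.0, 2.0, 360.0
i = (v_supply - v_led) / r
print(round(i * 1000, 2))  # prints 19.44 (mA) - a bit under the 20 mA target, which is safe
```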
 