# Resistor Voltage Limitation

Discussion in 'Homework Help' started by Advaka, Aug 26, 2016.

1. ### Advaka Thread Starter
Hi, I have read about how to use a resistor in series with an LED. However, my understanding was that the values plugged into Ohm's law (I and V) are the current and voltage you want to flow, with the resistance as the result. For example, if I want 2 amps to flow through and out of my resistor with an input of 10 volts, the equation says I should place a 5 ohm resistor. But if that understanding is true, I have come upon a problem.

An LED has a given forward voltage, n, and a required current, b. The voltage fed to it should be at least n, I assume. However, all tutorials say that to get this, I must subtract the required forward voltage from the supply voltage and then divide by the current b. That gives the resistor to place between my supply and my LED.

My confusion is this: that calculation gives a resistance that makes the voltage after the resistor equal to the supply minus n, with current b. But n does not equal the difference between the supply and n! I do not understand why this is used. Unless my understanding of the resistor and its values is incorrect, and the voltage and current in the calculation are what the resistor dissipates (or takes from the supply) — but if that's the case, we are only taking current b from the supply, and we need that as the minimum current. I hope someone can sort out my confusion and teach me a solution. Thank you.

2. ### JoeJester AAC Fanatic!

You have a Vs (Voltage source). You know the Vf (forward voltage of the LED), you know the current you want to flow in the LED.

Example:

Vs = 10V
Vf = 1.5V
I = 20 mA

Series resistor would be [ (10V - 1.5V) / 20 mA ] = 425 ohms. The nearest E24 series resistor is 430 ohms.

That reduces your current slightly: (10V - 1.5V) / 430 ohms ≈ 19.8 mA, just under the design current. I doubt you would perceive the difference between those two currents in most LEDs. The next E24 value down, 390 ohms, would slightly exceed your design current of 20 mA.

You CHOOSE the resistor based on the required current and the voltage drop across the resistor.
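The arithmetic above can be sketched in a few lines (a sketch only; the component values are the ones from this post, and the resistor values tried are standard E24 values near the ideal result):

```python
def series_resistor(vs, vf, i_target):
    """Ideal series resistance by Ohm's law: R = (Vs - Vf) / I."""
    return (vs - vf) / i_target

def led_current(vs, vf, r):
    """Actual LED current once a standard resistor value is picked."""
    return (vs - vf) / r

r_ideal = series_resistor(10.0, 1.5, 0.020)      # 425.0 ohms
for r in (430.0, 470.0):                         # nearby E24 standard values
    print(f"{r:.0f} ohms -> {led_current(10.0, 1.5, r) * 1000:.1f} mA")
```

Either standard value lands within a couple of milliamps of the 20 mA target.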

3. ### Advaka Thread Starter
Why is it the supply minus the forward drop? Isn't the voltage in the formula the voltage you want to flow with this resistance, along with the current you want?

I had originally assumed it's the voltage you want the resistor to consume, but if that were the case, it would only consume at the small current provided. It's as if the voltage and current are there for two different reasons (flow control vs. consumption by the resistor).

4. ### Advaka Thread Starter
I figured resistance was (voltage you want to flow) / (current you want to flow), but I knew that wasn't right. Then I assumed it was resistance = supply voltage over the current you want to flow. So why do they put the difference over the current we want to flow? Is the dissipation what flows through the resistor at the supply voltage? I thought the current through the resistor times the supply voltage was the dissipation, since that is released as heat in the effort to lower the current.

5. ### DickCappels Moderator

Voltage does not flow; it pushes. In a series-connected circuit the current is the same through all components, but each component has its own voltage drop. The sum of the voltage drops across the individual components in a series circuit is equal to the voltage of the voltage source.

In JoeJester's example in post #2, you have a 10 volt voltage source (such as a battery) and these 10 volts are distributed over the resistor and the LED.

LEDs have the property of having a nearly constant voltage across them over a wide range of currents. That is why JoeJester was able to go with the assumption that the LED will have 1.5 volts across it. That leaves the remaining 8.5 volts to be across the resistor.

The resistor only sees this remaining 8.5 volts; it has no way of knowing about the other 1.5 volts, because that voltage is dropped across the LED.

The rest is Ohm's law, which you know.
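The series-circuit bookkeeping can be checked numerically (a sketch; the 10 V and 1.5 V figures are the ones from the example above):

```python
vs = 10.0                  # source voltage
v_led = 1.5                # nearly constant LED forward voltage
v_resistor = vs - v_led    # the 8.5 V "left over" for the resistor

# Kirchhoff's voltage law: the drops around the loop sum to the source voltage
assert abs(v_led + v_resistor - vs) < 1e-12

# The resistor only "sees" its own 8.5 V, so Ohm's law uses that, not 10 V
i = v_resistor / 470.0     # about 18 mA with a 470 ohm resistor
print(f"resistor drop = {v_resistor} V, current = {i * 1000:.1f} mA")
```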

6. ### Advaka Thread Starter
So the difference between the supply and the forward drop is covered by the resistor? Whereas if the resistor dropped one volt more than that difference, it would pull that volt from the LED? Or would the LED receive less current while maintaining its forward voltage?

7. ### Advaka Thread Starter
So the voltage in the resistance calculation is the voltage drop across the resistor? The current being what will flow, and the resistance being the value in ohms that causes that current?

8. ### MrAl AAC Fanatic!

Hi,

The voltage across the LED is assumed to be nearly constant, within reason.
When we connect a resistor in series, we are hoping to drop some voltage that the supply has. If it has 10v and the LED has 3v, then we want to drop 7v. If we increase the 10v supply to 11v, then we want it to drop 8v. If we increase the supply to 12v, then we want it to drop 9v. We always want to have 3v for the LED.

Knowing this, we use the supply voltage minus the LED voltage divided by the desired current to get the right resistor value. But if we already chose the resistor, then we use the supply voltage minus the LED voltage divided by the known resistor value to get the current. So if the supply goes up, the current goes up.

The two equations used are both just Ohm's Law:

R=(Vcc-Vled)/i
i=(Vcc-Vled)/R

or we could just call Vcc-Vled the difference Vd, then:
R=Vd/i
i=Vd/R
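The two forms of Ohm's law above can be written out directly (a sketch; the 10 V supply and 3 V LED are the values used in this post):

```python
def resistor_for_current(vcc, vled, i):
    """Design direction: R = (Vcc - Vled) / i, choosing R for a desired current."""
    return (vcc - vled) / i

def current_for_resistor(vcc, vled, r):
    """Analysis direction: i = (Vcc - Vled) / R, once R is already chosen."""
    return (vcc - vled) / r

r = resistor_for_current(10.0, 3.0, 0.020)   # 350 ohms drops 7 V at 20 mA
# If the supply rises but R stays fixed, the LED current rises with it:
for vcc in (10.0, 11.0, 12.0):
    print(f"Vcc={vcc:.0f} V -> {current_for_resistor(vcc, 3.0, r) * 1000:.1f} mA")
```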

9. ### JoeJester AAC Fanatic!

Here is the example I used. You will see that as the source voltage increases, the current increases. If you use a lower source voltage, compute the proper resistor for the current you wish to flow through the LED. I varied the voltage source from zero to ten volts.
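A sweep like the one described can be reproduced roughly (a sketch assuming an idealized LED that passes no current below its 1.5 V forward voltage and holds that voltage above it; real LEDs turn on gradually):

```python
vf, r = 1.5, 470.0                        # forward voltage and series resistor
for vs in range(0, 11):                   # source swept from 0 to 10 volts
    i_ma = max(vs - vf, 0.0) / r * 1000   # no current until Vs exceeds Vf
    print(f"Vs={vs:2d} V  I={i_ma:5.2f} mA")
```

The printed current climbs linearly with the source voltage once it passes the forward drop, which is the behavior the sweep demonstrates.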

Resistors are not limited by voltage or current. They are rated by the power they will dissipate.