Importance of Resistors

Thread Starter

mageshbilingual

Joined Mar 18, 2022
3
Hello connections!

We use resistors to control the current flow in a circuit and prevent damage to the electronic components.
Is it a good approach to remove the resistor from a circuit and instead reduce the voltage source from 12V to, say, 5V (example case)?
Example:
Option A : 12V DC supply + Resistor + LED
Option B : 5V DC supply + LED (resistor removed; I have reduced the voltage source from 12V to 5V).
Which one would be the better option? Please share your thoughts to clarify my basic doubt.
Please excuse me, as it's a very basic doubt.
 

Papabravo

Joined Feb 24, 2006
22,058
Hello connections!

We use resistors to control the current flow in a circuit and prevent damage to the electronic components.
Is it a good approach to remove the resistor from a circuit and instead reduce the voltage source from 12V to, say, 5V (example case)?
Example:
Option A : 12V DC supply + Resistor + LED
Option B : 5V DC supply + LED (resistor removed; I have reduced the voltage source from 12V to 5V).
Which one would be the better option? Please share your thoughts to clarify my basic doubt.
Please excuse me, as it's a very basic doubt.
It is rarely a good idea to connect a voltage source of any magnitude to a current device like an LED. The purpose of the resistor is to turn a voltage source into a limited current source. It may not be the only way to do it, but it is simple, effective, and cheap. Why do you have a problem with this? Do you know something the rest of us have missed?
 

Alec_t

Joined Sep 17, 2013
15,105
Welcome to AAC!
LEDs should almost always have a resistor in series to control the LED current. LEDs are current-driven devices.
Connecting 5V directly to a typical LED will usually result in a bright flash and a dead LED.
 

crutschow

Joined Mar 14, 2008
38,325
An LED is a diode with a nominal forward voltage when operating. That voltage differs somewhat between LED colors, but it varies only slightly with a change in current, so any small change in applied voltage results in a large change in current.
That's why you need a resistor (or other type of current limiter) to control the current through an LED.
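
To put a number on how steep that is, here's a tiny Python sketch of the ideal-diode (Shockley) equation. The saturation current and ideality factor below are made-up illustrative values chosen to land near 20mA at 3.0V, not the parameters of any real LED (real parts also have some internal series resistance that softens the curve):

```python
import math

# Shockley diode equation: I = Is * (exp(V / (n*Vt)) - 1).
# Is and n are illustrative guesses, not data for a real LED.
IS = 1.8e-27   # saturation current, amps (assumed)
N = 2.0        # ideality factor (assumed)
VT = 0.026     # thermal voltage at room temperature, volts

def diode_current(v_forward):
    """Ideal-diode current at a given forward voltage."""
    return IS * (math.exp(v_forward / (N * VT)) - 1)

for v in (3.0, 3.1, 3.2):
    print(f"Vf = {v:.1f} V -> I = {diode_current(v) * 1000:8.1f} mA")
```

In this model each extra 0.1V multiplies the current by roughly 7x, which is why the series resistor, not the supply voltage, has to set the current.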
 

Audioguru again

Joined Oct 21, 2019
6,826
Nobody makes a 5V LED; they make simple 5V incandescent light bulbs instead.
A white LED's forward voltage ranges from 2.8V for some parts to 3.6V for others, and when you buy one you do not know which voltage it needs. Its required voltage also changes as the current and temperature change.

An LED must be fed a current that is limited by a series resistor or a current-limiting circuit.

With a 5V supply and a typical 3.2V LED, the resistor to limit the current to 20mA is (5V - 3.2V)/20mA = 90 ohms.
If the LED is actually 2.8V, then the 90 ohm resistor limits the current to (5V - 2.8V)/90 ohms = 24.4mA.
If the LED is actually 3.6V, then the 90 ohm resistor limits the current to (5V - 3.6V)/90 ohms = 15.6mA.
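
A quick sanity check of those numbers in Python (the function name is just mine, for illustration):

```python
def led_current(v_supply, v_forward, r_ohms):
    """Series-resistor LED current: I = (Vsupply - Vf) / R."""
    return (v_supply - v_forward) / r_ohms

# Resistor sized for a nominal 3.2 V LED at 20 mA from a 5 V supply:
R = (5.0 - 3.2) / 0.020   # = 90 ohms

for vf in (2.8, 3.2, 3.6):
    print(f"Vf = {vf} V -> {led_current(5.0, vf, R) * 1000:.1f} mA")
```

The current stays in a safe 15-25mA window across the whole Vf spread, which is the whole point of the resistor.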
 

MrChips

Joined Oct 2, 2009
34,629
An ideal voltage source has zero internal resistance. It can supply a constant voltage independent of the load resistance and current.
An ideal current source has infinite internal resistance. It can supply a constant current independent of the load resistance.

An LED requires a constant current.
Hence the closer you can come to an ideal current source, the better it is for the LED.
Thus one design criterion is to make the series resistance as high as possible within the limits imposed by other constraints.
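
A rough sketch of why that criterion works: a shift dVf in the LED's forward voltage shifts the current by dI = dVf/R, so a larger series resistance (fed from a proportionally higher voltage) holds the current steadier. Illustrative numbers only:

```python
# If the LED forward voltage shifts by dVf, the series-resistor
# current shifts by dI = dVf / R: larger R gives a "stiffer" current.
D_VF = 0.2   # example Vf shift, volts (illustrative)

for r in (90.0, 450.0):   # small vs. large series resistance
    print(f"R = {r:5.0f} ohms -> current shift = {D_VF / r * 1000:.2f} mA")
```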
 

Tonyr1084

Joined Sep 24, 2015
9,744
I would also like to welcome you to AAC.

In your question you focus on the importance of resistance, and you gave an LED example; many here have already clarified the reason for using a resistor with an LED. While it is true there are other means of controlling current to LEDs and other components, the basic reasons for using resistors are many.

Depending on how you design your circuit, resistors can be used to steer voltages in different amounts in different directions (as in voltage dividers) or to steer currents to devices that may require limited amounts of current. A circuit can also use two different types of LEDs (as in your example) running at different levels of brightness. A super-bright LED used as a warning light and a much smaller, simpler indicator such as a common green LED will need different levels of current: the super-bright LED might run at 25mA whereas the indicator may run at 10mA. Running from the same power source, one can indicate a condition you want to be alerted to while the other simply indicates the system is operating.

Resistors have many different applications. They can be used to bias a transistor or used as part of a feedback amplifier circuit. They can be used in conjunction with a capacitor to form part of a timing circuit. Without resistors electronics might not exist beyond the simple electric circuit.

Also, and you didn't ask but this is important too, resistors have limits to how much power they can handle, so they are rated in watts. Power is the voltage across the resistor multiplied by the current through it (P = V x I). A resistor dropping 12 volts at 30mA (0.03A) dissipates (12 x 0.03 =) 360mW (or 0.36 watts), so a resistor rated for a quarter watt (0.25W) would burn up. But if the resistor only dropped 6 volts at 30mA, the dissipation would be 0.18W, and the quarter-watt resistor would be suitable in such an application.
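
A minimal sketch of that power check (the helper name is mine):

```python
def resistor_power(v_drop, i_amps):
    """Power in a resistor: P = V * I, with V measured across the resistor."""
    return v_drop * i_amps

for v_drop in (12.0, 6.0):
    p = resistor_power(v_drop, 0.030)   # 30 mA through the resistor
    verdict = "within" if p <= 0.25 else "exceeds"
    print(f"{v_drop:.0f} V drop at 30 mA -> {p * 1000:.0f} mW "
          f"({verdict} a 0.25 W rating)")
```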

And there is a lot more to resistors and resistance than I could ever cover. But the important thing is asking questions and learning. Even if you feel like you're asking a dumb question, the dumbest thing you could do is not ask and remain uneducated. That's not to say that if you don't know something you're dumb, no. We all had to learn something at some point; before learning whatever it was, we just didn't know. Before learning the speed of light, were you dumb? No. You just didn't know before then. This is why I like AAC so much. It's a great place to get ideas, learn new things, and ask questions to facilitate your learning. I respect the guys here very much, as they know far more than I do. So when I come to the table with a thought and someone schools me, I'm happy to yield to their superior experience and knowledge and willing to learn and apply new ideas. Knowing less doesn't make me dumb.

Again, welcome to AAC.
 
ElectricSpidey nails it from a power-consumption point of view. Power dissipated in a resistor is converted to heat and is wasted (unless you wanted a heater). I think your question was general, not just about LEDs. Using a series resistor to reduce the voltage to a device is common and simple, but it has drawbacks.

First, it wastes power. If we are only talking about wasting a small amount of power, the simplicity may justify it. Even high-tech electronic devices are loaded with resistors but generally they dissipate very little power. Decades ago, huge power resistors were used to control power in a circuit and had huge losses but we didn't have the technology (or couldn't justify the cost) to do it in a better way.

Second, using a resistor to control power to something may not be successful if the load changes. If the load tries to draw more current, this increases the voltage drop across the resistor leaving less voltage at the load. If the load decreases, the voltage to the load goes up, possibly resulting in damage.

If you had a 12V source and wanted to power something that required 5 volts, you could determine the resistor value based on the current the device draws. If the load is 0.2 amps at 5 volts, we need to drop 7 volts across the resistor at 0.2 amps. Using Ohm's law (R = E/I), R = 7/0.2 says we need 35 ohms. The resistor drops 7 volts, the load drops 5 volts, and all is right in the world. The power dissipated (wasted) in the resistor is 7 x 0.2 or 1.4 watts, while the power actually used by the load is 5 x 0.2 or 1 watt. Total power consumed is 2.4 watts to power a 1-watt load - rather wasteful.
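
The same arithmetic as a short sketch, using the 12V / 5V / 0.2A figures from the paragraph above:

```python
V_IN, V_LOAD, I_LOAD = 12.0, 5.0, 0.2   # figures from the example above

r = (V_IN - V_LOAD) / I_LOAD            # dropping resistor: 35 ohms
p_resistor = (V_IN - V_LOAD) * I_LOAD   # 1.4 W wasted as heat
p_load = V_LOAD * I_LOAD                # 1.0 W delivered to the load

print(f"R = {r:.0f} ohms, wasted = {p_resistor:.1f} W, "
      f"efficiency = {p_load / (p_load + p_resistor):.0%}")
```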

If you had a 5V source, connect it directly to the load. It will draw 0.2A for a total power consumption of 1 watt. Perfect.

If you are starting with 12V and feed it through a voltage regulator (perhaps something like a 7805 fixed 5V regulator), 12V goes in, 5V comes out and feeds the load. The regulator drops the extra 7 volts at 0.2 amps, so it now dissipates 1.4 watts; you have wasted power again, but the benefit is that even if the load varies, you still have 5V at the load. If you have to drop 12V to 5V, the ideal thing to use would be a switching regulator, which would be much more efficient (less power wasted as heat), but now you have more complex circuitry and more cost.
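
For comparison, a back-of-envelope sketch of the linear-versus-switching trade-off; the 90% switcher efficiency is an assumed round number, not a datasheet figure:

```python
V_IN, V_OUT, I_OUT = 12.0, 5.0, 0.2
p_load = V_OUT * I_OUT                  # 1.0 W of useful output

# Linear regulator (7805-style): the load current flows through it,
# so it dissipates (Vin - Vout) * I as heat on top of the load power.
p_linear_in = V_IN * I_OUT              # 2.4 W drawn from the supply

# Switching regulator: assume 90% efficiency (illustrative only).
p_switch_in = p_load / 0.90             # ~1.1 W drawn from the supply

print(f"linear:    {p_linear_in:.1f} W in for {p_load:.1f} W out")
print(f"switching: {p_switch_in:.1f} W in for {p_load:.1f} W out")
```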

LEDs are an odd animal because of how we are accustomed to thinking. Most devices are rated to operate from a specified voltage and due to their design, draw an appropriate amount of current when fed the proper voltage. You must think of LEDs differently; feed it the appropriate amount of current and it will drop its specified voltage.

In the case of LEDs, let's say that some particular LED drops exactly 3.0 volts at 20mA. In theory, you could connect it directly to an exact 3.0V supply. In the real world, this is going to fail. If the power supply put out more than 3.0V, the current into the LED is going to go up - probably by a lot! 3.1V would probably burn up the LED, since its current rises dramatically with just a small increase in voltage. Components also change in value with temperature: the power supply output could drift, and so could the LED. As an LED warms up, its Vf (forward voltage drop) goes down, so maybe now it is a 2.9V drop at 20mA, and a 3.0V supply would destroy it as well.

For small LEDs (like an indicator, not something that you are trying to light up a room with), using resistors is very common, cheap and works fine. Start with a voltage higher than the LED needs and drop the rest in a resistor. If you start with 5v and the LED is rated 20mA with a 3.0v drop, you need to drop 2v at 20mA so you use a (2/0.02) 100 ohm resistor which will dissipate (2x0.02) 0.04 watts. If the power supply voltage or the LED Vfd should change by some small amount, let's say 0.1v, the current would change slightly and could range from 19 to 21mA. Not a big deal in most cases. If the supply voltage were higher, let's say 12v, you would need 450 ohms and it would dissipate 0.18 watts - lots of waste but the regulation would be better. A 0.1v change in either the supply voltage or LED Vfd now results in a current range of 19.777 to 20.222mA.

Don't forget that resistors are not exact either! In the old days, a 20% tolerance was not unheard of. Today, 5% tolerance resistors are common. A 12V supply to a 3.0V LED at 20mA needs 450 ohms. Allowing for ±5% on the resistor, you could have a range of 427.5 to 472.5 ohms, resulting in a current of 19 to 21mA. Again, such a small shift would likely not be an issue, but it is there.
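
Here's a small worst-case sketch combining both effects, a ±0.1V shift in the supply or LED Vf plus a ±5% resistor (the function is mine, for illustration only):

```python
def current_band(v_supply, v_f, r_nom, r_tol=0.05, dv=0.1):
    """Worst-case LED current range for a +/-dv volt shift in supply
    or Vf combined with a +/-r_tol resistor tolerance."""
    currents = [(v_supply - v_f + d) / (r_nom * (1 + t))
                for d in (-dv, dv) for t in (-r_tol, r_tol)]
    return min(currents), max(currents)

for v_supply, r in ((5.0, 100.0), (12.0, 450.0)):
    lo, hi = current_band(v_supply, 3.0, r)
    print(f"{v_supply:4.1f} V, {r:5.1f} ohms: "
          f"{lo * 1000:.1f} to {hi * 1000:.1f} mA")
```

The 12V case gives the tighter current band, at the cost of more heat in the resistor.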

For higher-powered LEDs, it is common to use a switching regulator designed to output a specified amount of current so that small variations in the LED Vf don't matter. It is obviously more expensive and complicated than a resistor, but much more efficient. A linear current regulator could be used, but then you are back to wasting a lot of power as heat.

ericgibbs: I kinda see your point, a controlled current is what we need. A constant current is controlled or it wouldn't be constant. Sorry, let's not argue semantics, OK?

Sorry - got a bit long-winded here.
Rich
 

Audioguru again

Joined Oct 21, 2019
6,826
LEDs are an odd animal because of how we are accustomed to thinking. Most devices are rated to operate from a specified voltage and due to their design, draw an appropriate amount of current when fed the proper voltage. You must think of LEDs differently; feed it the appropriate amount of current and it will drop its specified voltage.
I disagree.
An LED's forward voltage covers a wide range; it does not have one specified operating voltage the way a light bulb does.
 

Papabravo

Joined Feb 24, 2006
22,058
...

LEDs are an odd animal because of how we are accustomed to thinking. Most devices are rated to operate from a specified voltage and due to their design, draw an appropriate amount of current when fed the proper voltage. You must think of LEDs differently; feed it the appropriate amount of current and it will drop its specified voltage.

...
You may be accustomed to thinking that way, but that does not necessarily apply to the rest of us. Considering the theorems of Thévenin & Norton, we can formulate the problem with either type of source and get the same answer. This is one of the interesting dualities of basic circuits.
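
For anyone curious, the duality is literally a two-line conversion; a minimal sketch (function names are mine):

```python
def thevenin_to_norton(v_th, r_th):
    """V in series with R  ->  current source V/R in parallel with R."""
    return v_th / r_th, r_th

def norton_to_thevenin(i_n, r_n):
    """I in parallel with R  ->  voltage source I*R in series with R."""
    return i_n * r_n, r_n

# 5 V behind 100 ohms is indistinguishable, at the load terminals,
# from 50 mA in parallel with 100 ohms.
i_n, r_n = thevenin_to_norton(5.0, 100.0)
print(f"Norton equivalent: {i_n * 1000:.0f} mA in parallel with {r_n:.0f} ohms")
```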
 

Thread Starter

mageshbilingual

Joined Mar 18, 2022
3
It is rarely a good idea to connect a voltage source of any magnitude to a current device like an LED. The purpose of the resistor is to turn a voltage source into a limited current source. It may not be the only way to do it, but it is simple, effective, and cheap. Why do you have a problem with this? Do you know something the rest of us have missed?
Papabravo, thanks for your time in responding to my query.
For example, if a fixed-voltage battery is used in a circuit, then instead of using a 12V battery with resistors to limit the current flow, can't we use a lower-voltage battery without resistors, to make more effective use of the voltage source?
It may sound bad, but I wanted to clear my doubt, hence I am posting even if it's below par.
 

Thread Starter

mageshbilingual

Joined Mar 18, 2022
3
I would also like to welcome you to AAC.

...

Again, welcome to AAC.
Thank you, Tonyr1084, for your detailed explanation. I agree that asking questions builds better knowledge.
 

Tonyr1084

Joined Sep 24, 2015
9,744
Batteries have internal resistance. A car battery (nominally 12V) would grossly overpower an LED. A coin cell, however, can only deliver a low current; I've used LEDs and coin cells for experiments without a resistor. But if a battery can deliver a fatal current to an LED, then a resistor IS necessary, or some other means of current limiting.

Using a 6V battery on a 3Vf LED (Vf = forward voltage, the amount of voltage the LED will drop) MUST have a resistor, provided the battery is capable of delivering more than 30mA (0.03A). If the battery is strong enough then yes, a resistor MUST be incorporated into the design: (6V - 3Vf) ÷ 30mA = 100Ω. You would need a 100Ω resistor to protect the LED, if you wanted to limit it to 30mA of current. If you wanted 15mA, a more typical current, you'd use a 200Ω resistor; a 300Ω resistor would limit the current to 10mA. That is all based on a 6-volt battery or power supply. Failure to use a resistor means the death of the LED, UNLESS the battery cannot deliver a fatal current.

Most common LEDs have a max current rating of about 30mA. It's not recommended to run such an LED at that level; 15mA is more typical, and depending on the need you could get away with as little as 2mA. I've never seen an LED run at 1mA, but in theory it is doable. The larger the resistor, the lower the current.

I believe this has been mentioned before: wattage. You have to choose a resistor that can handle the heat produced. In the 6V-supply, 3V-LED example, the resistor drops 3V at 30mA, so it must handle (3 x 0.03 =) 90mW of heat. A 1/8W (125mW) resistor would survive that, but with little margin once you derate it; a 1/4W resistor gives comfortable headroom. So don't forget to consider wattage when calculating your circuitry.
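
Putting both halves together, here's a sketch that sizes the resistor and then picks a standard power rating with a 50% derating margin. The helper and the rule of thumb are my own illustration, not from any standard:

```python
# Common through-hole resistor power ratings, in watts.
STANDARD_RATINGS = (0.125, 0.25, 0.5, 1.0, 2.0)

def size_led_resistor(v_supply, v_f, i_led, derate=0.5):
    """Return (ohms, dissipation, suggested rating), keeping the
    resistor at or below half its rating (a common rule of thumb)."""
    r = (v_supply - v_f) / i_led
    p = (v_supply - v_f) * i_led
    rating = next(w for w in STANDARD_RATINGS if p <= w * derate)
    return r, p, rating

for i_led in (0.030, 0.015, 0.010):      # the currents from the post
    r, p, w = size_led_resistor(6.0, 3.0, i_led)
    print(f"{i_led * 1000:2.0f} mA -> {r:3.0f} ohms, "
          f"{p * 1000:3.0f} mW, use a {w} W part")
```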
 

Papabravo

Joined Feb 24, 2006
22,058
Papabravo Thanks for your time in responding to my query.
for example if any fixed voltage battery is used in any circuit, instead of using 12V battery and using resistors in the circuit to limit the current flow, cant we use the low voltage battery without resistors to make use of the effective utilization of voltage sources?
it may sounds bad but wanted to clear my doubt. hence i am posting even if its below par.
You cannot do this safely and reliably. I'm not sure what is in your mind when you say: "effective utilization of voltage sources". There is no difference in "effective utilization" when using a current limiting resistor. If you think there is then you are trapped in a fever dream in a fantasy world.
 

eetech00

Joined Jun 8, 2013
4,704
for example if any fixed voltage battery is used in any circuit, instead of using 12V battery and using resistors in the circuit to limit the current flow, cant we use the low voltage battery without resistors to make use of the effective utilization of voltage sources?
it may sounds bad but wanted to clear my doubt. hence i am posting even if its below par.
There is a reason the resistor is called a "current limiting" resistor. LEDs operate on current from the battery, and without a resistor the LED will draw all the current the battery can supply. This will usually burn out the LED.
 

Tonyr1084

Joined Sep 24, 2015
9,744
I'm not sure what is in your mind when you say: "effective utilization of voltage sources".
Not to put words in the TS's mouth, I think what (s)he means is that the resistor is consuming some "voltage" (actually power, but for the sake of argument). To the TS, using a 3V power source on a 3V LED would seem to utilize the "power" most efficiently, i.e. no wasted power (voltage). Again, I'm mixing up terms because I think this is the way the TS is thinking about electronics.

To address this notion: if you have a 3V power source capable of 100 amps and you connect a 3V LED to it (LEDs are NOT voltage devices), it would blow out because there is no limit to the current. LEDs have a forward voltage ranging anywhere from 1.8Vf to 3.5Vf (the range can be even wider; this is all I'm willing to call out for Vf). That just means they will drop that forward voltage. Whether you're powering it from a 3V battery or a 30V battery, the issue is the same.

The LED operates on quantum physics. Electrons orbit their nuclei at a given energy state. When you raise an electron to a higher level, it wants to fall back into its normal state, and when it falls back it gives off a single photon. Hence, the Light Emitting Diode. Current is what drives the electrons into the higher state, and they immediately fall back, emitting light.

Think of current like a river: it's flowing. A small creek's water level is like voltage. Compare that to a major river: the larger river has more potential even though the water may be flowing at the same rate. Voltage is akin to "electric pressure" and current is the flow of electrons driven by that pressure. If you have a balloon full of water and you poke a pinhole in it, the water squirts out; the more pressure inside the balloon, the harder the water squirts.

In order to control the flow of electrons you need a resistor. The LED isn't a current-limiting device; it is a consumer of power (power is voltage times current, and in a DC circuit it is rated in watts). Let's not get off on a tangent. If you fail to limit the current you will burst your LED, because the electrons cannot handle that much charging and discharging. Yes, I said something that could be misconstrued: I'm talking at a quantum level. Electrons can be charged to a higher state, and when they fall back to their normal state they give off the excess energy as light, and heat. And no, you can't use an LED as a battery, so let's not get confused.

1) You need enough electric pressure to light an LED
2) You need to limit the current so as to not burn out the LED

If you don't have enough voltage you will not fully light the LED. If you have too much current the LED will burn out. Those are the facts.
 

drjohsmith

Joined Dec 13, 2021
1,553
Just to double-check: there are LEDs that have a current limiter built in.

https://uk.rs-online.com/web/p/leds/2285562

As others have said, LEDs generally appear as a forward-biased diode, so unless you have one of the above types of LED or your own limiter, the cheap option is a resistor. A forward-biased diode will take all the current it can until it burns out.
 

Papabravo

Joined Feb 24, 2006
22,058
Let us be clear. The problem with "matching" an LED to a voltage source is that forward voltage is not necessarily a well-controlled quantity. It is best to think of it, like all semiconductor parameters, as a random variable with a certain mean and standard deviation. Given those two numbers you can "forecast" a range for the random variable that will include 99+% of the units picked at random. If there is a mismatch between the forward voltage and the voltage source, the current will either be substantially in excess of what you calculate for a matched case, or substantially less, in which case the LED will fail to illuminate. In particular, batteries do not maintain a constant voltage -- they discharge. The solution is to implement a method of lighting the LED that does not depend on any particular device parameter. This theme is present in almost all of the electronic designs you will ever see, besides all the ones that you won't.
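
To make the "random variable" view concrete, here's a small Monte Carlo sketch; the mean and standard deviation of Vf are illustrative assumptions, not datasheet figures:

```python
import random

VF_MEAN, VF_SD = 3.2, 0.15     # assumed Vf distribution (illustrative)
V_SUPPLY, R = 5.0, 90.0        # supply and series resistor

random.seed(1)                 # repeatable run
samples = sorted((V_SUPPLY - random.gauss(VF_MEAN, VF_SD)) / R
                 for _ in range(100_000))
lo, hi = samples[500], samples[-501]   # middle ~99% of simulated units
print(f"99% of units draw between {lo * 1000:.1f} and {hi * 1000:.1f} mA")
```

With the resistor, the whole simulated population stays within a usable current range; connect those same LEDs straight across a "matched" voltage source and the spread in Vf turns into a wild spread in current instead.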

You can continue to pursue this nonsense if you choose, but you will eventually crash on the rocks of disappointment. You should think seriously about getting your head out of the dark place it is in.
 