Calculating the size of a resistor in a simple LED circuit

Thread Starter

RAMBO999

Joined Feb 26, 2018
259
Basically you can't measure the resistance of an LED.

Suppose you have a 3.8V LED in a simple circuit with a 9V battery supply.

You need to insert a resistor in series with the LED that will give you a voltage drop of 5.2V.

But you don't know the current that will flow through the circuit until you know the total resistance
of the circuit.

You can use a pot and adjust it from zero to determine the point at which the voltage drop across the LED is 3.8V
and then measure the current in the circuit. But you wouldn't need to calculate the size of the resistor then because
you would just measure the resistance across the pot.

But is there a way of actually calculating the size of the resistor required that will give you
the 5.2 V drop that you require?

I can't think of one. What's the normal practice?

Cheers
 

WBahn

Joined Mar 31, 2012
29,978
You decide how much current you want flowing in the circuit, knowing that the voltage across the LED will be about 3.8 V regardless of how much current is flowing (within reason).

So you know how much current you want flowing in a resistor that has to drop the remaining voltage at that current. Ohm's Law to the rescue.

You don't operate diodes at a target voltage drop (unless you are doing something out of the ordinary, which is sometimes the case). The amount of current that flows is highly sensitive to the voltage drop, which is another way of saying that the voltage drop is very insensitive to the current.
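The recipe above can be sketched in a few lines of Python. The numbers are just this thread's running example (9 V supply, 3.8 V LED), and the 20 mA target is an assumed choice, not a datasheet value:

```python
# Series resistor for an LED: choose the current first, then apply Ohm's law.
V_SUPPLY = 9.0    # battery voltage (V)
V_LED = 3.8       # approximate LED forward drop (V)
I_TARGET = 0.020  # chosen operating current (A): 20 mA, an assumed target

# The resistor must drop whatever the LED does not.
R = (V_SUPPLY - V_LED) / I_TARGET
print(f"R = {R:.0f} ohms")  # → R = 260 ohms
```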
 

Dodgydave

Joined Jun 22, 2012
11,284
You don't calculate the voltage drop; you use the current through the LED, as they are current-driven devices...

Example: the LED needs 20mA and the voltage to be dropped across the resistor is 5.2V, so the resistor is (5.2V / 0.02A) = 260 ohms, dissipating about 104mW.
 

BobTPH

Joined Jun 5, 2013
8,812
An LED is characterized by a voltage and a current. Using both of these, and the supply voltage gives you a simple equation you can solve for the resistor. See if you can come up with that equation.

Bob
 

Bernard

Joined Aug 7, 2008
5,784
You get to determine the operating current. A common value is 20 mA, but 10 mA may be enough. R = V/I: 5.2 / 0.01 = 520 Ω, or the closest standard value, 510 Ω.
If I'm going to have several parallel strings of series LEDs, I sort the LEDs by Vf into separate stacks 0.1 V apart, then make up strings to have about equal total Vf.
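Bernard's binning step can be sketched like this; the Vf readings below are invented for illustration, and the 0.1 V bin width follows his description:

```python
# Group LEDs by measured forward voltage into 0.1 V wide bins so that
# parallel strings can be built from LEDs with nearly equal total Vf.
from collections import defaultdict

# Hypothetical bench measurements of Vf at the chosen operating current.
measured_vf = [3.21, 3.34, 3.28, 3.41, 3.22, 3.36, 3.29, 3.43]

bins = defaultdict(list)
for vf in measured_vf:
    bins[round(vf, 1)].append(vf)  # 3.21 and 3.22 share the 3.2 V bin, etc.

for bin_voltage, leds in sorted(bins.items()):
    print(f"{bin_voltage:.1f} V bin: {leds}")
```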
 

Thread Starter

RAMBO999

Joined Feb 26, 2018
259
Thanks for the responses. It does make sense. A practical solution. So no formula required. I just apply the 3.8V to the LED, make note of the current and use that as the target current in the LED circuit and to determine the size of the resistor.
 

Deleted member 115935

Joined Dec 31, 1969
0
If you apply a voltage across an LED without a current limit, there are two possibilities:

a) Not enough voltage to light the LED: no current flows.
b) Enough voltage to light the LED, no current limit: a huge current flows for a few ns, then no more LED!!

If you have the datasheet, look up Vf at a given current, then (your voltage - Vf) / current = the resistance you need to put in series.

If you don't have a datasheet, guess a resistance on the high side, try it, and see what voltage is dropped across your known resistor; you can then calculate the current. Ohm's law.
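The measure-then-calculate step above, sketched with hypothetical meter readings (the 1 kΩ starting value and the 5.1 V reading are made up for illustration):

```python
# Unknown LED: put a deliberately large known resistor in series, measure
# the voltage across the resistor, and Ohm's law gives the loop current.
R_KNOWN = 1000.0   # known series resistor, chosen on the high side (ohms)
V_SUPPLY = 9.0     # supply voltage (V)
V_ACROSS_R = 5.1   # hypothetical meter reading across the resistor (V)

i = V_ACROSS_R / R_KNOWN      # same current flows through the LED
vf = V_SUPPLY - V_ACROSS_R    # what the LED itself must be dropping
print(f"I = {i * 1000:.1f} mA, Vf = {vf:.1f} V")  # → I = 5.1 mA, Vf = 3.9 V
```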
 

Martin_R

Joined Aug 28, 2019
137
Thanks for the responses. It does make sense. A practical solution. So no formula required. I just apply the 3.8V to the LED, make note of the current and use that as the target current in the LED circuit and to determine the size of the resistor.
Never, ever, just connect an LED to a voltage source; as stated above, it may not light at all, or may light very brightly and burn out. Look at the datasheet for the LED that you have. Many LEDs like to work with 20mA or so, but may be bright enough even at 5mA.

To make life (really) simple, just calculate a resistor value ignoring the voltage drop of the LED, and calculate it for 20mA current flow, i.e. 9 volts / 20mA = 450 ohm, so call it 470 ohm. This is the MAX current that can flow. Now connect the LED in series with it; at least your LED won't blow, and all will be safe. It'll give approx 11mA if the LED drops 3.8V.
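This worst-case sizing can be sketched as follows; the figures are the post's own 9 V / 20 mA example:

```python
# "Can't blow up" sizing: pretend the LED drops nothing, so the computed
# current is a guaranteed ceiling no matter what the real Vf turns out to be.
V_SUPPLY = 9.0
I_MAX = 0.020  # 20 mA ceiling

r_ideal = V_SUPPLY / I_MAX           # 450 ohms
r_std = 470.0                        # round up to the nearest standard value
i_ceiling = V_SUPPLY / r_std         # worst case, LED dropping 0 V
i_actual = (V_SUPPLY - 3.8) / r_std  # if the LED really drops 3.8 V
print(f"ceiling {i_ceiling * 1000:.1f} mA, actual about {i_actual * 1000:.1f} mA")
```

With these numbers it prints a ceiling of about 19 mA and an actual current of about 11 mA, matching the ~11 mA quoted above.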
 

BobTPH

Joined Jun 5, 2013
8,812
Thanks for the responses. It does make sense. A practical solution. So no formula required. I just apply the 3.8V to the LED, make note of the current and use that as the target current in the LED circuit and to determine the size of the resistor.
Have you ever actually looked at a datasheet? They do NOT state an operating voltage for the LED. They give an operating current and a range of voltages at which any given LED might pass that current.

If the datasheet lists this range as 3.4 to 3.8 V, and you apply 3.8V, it may well destroy half of your LEDs.

Bob
 

WBahn

Joined Mar 31, 2012
29,978
Thanks for the responses. It does make sense. A practical solution. So no formula required. I just apply the 3.8V to the LED, make note of the current and use that as the target current in the LED circuit and to determine the size of the resistor.
If you would read the responses that you got, you would see that you do NOT do this. If you take this approach, you stand a good chance of destroying the LED. You need to look at the datasheet for that model LED and use that to determine your target current. If you have some random LED that you got from somewhere, then you simply do not have a very good way of determining what the proper current is. About the best you can do is increase the current (and NOT by applying a naked voltage to the device) until you get to a point where you think it is what you want, keeping in mind that that might be too much and your LED might not last very long. So play it safe and keep the LED on the dim side.

For a long time it was pretty safe to assume that 10 mA to 20 mA was a good operating current for an LED. But today that assumption is not that safe unless you are working with a simple LED used as an indicator (and even that can be iffy). Today there are LEDs that are intended to work on less than a milliamp while others are designed to draw amps of current.
 

MrChips

Joined Oct 2, 2009
30,708
My solution is to experiment starting with a 10kΩ resistor in series with the LED.
Gradually reduce the value of the resistor towards 1kΩ and choose the resistance that gives you adequate brightness and not too bright.

Calculate or measure the current if that makes you happy.

The other way: a 5V drop across a 1kΩ resistor implies a current of 5mA. This should be ample in most situations.
 

Thread Starter

RAMBO999

Joined Feb 26, 2018
259
Have you ever actually looked at a datasheet? They do NOT state an operating voltage for the LED. They give an operating current and a range of voltages at which any given LED might pass that current.

If the datasheet lists this range as 3.4 to 3.8 V, and you apply 3.8V, it may well destroy half of your LEDs.

Bob
I am aware of that. It is written on the packaging. I used the 3.8V merely as an example. I have white, blue and green LEDs that are 3.2V - 3.8V, although they exhibit slightly different currents at each voltage. I also have some red LEDs that are 1.8V - 2.2V which, as you might expect, are considerably lower when it comes to current. I am aware of datasheets. My query was not about the limiting characteristics of LEDs. I guess you could say it was about best practice when determining the current to use in the calculation of resistors in LED circuits.
 

MrChips

Joined Oct 2, 2009
30,708
My best practice is to look at the brightness of the LED for a given voltage, series resistance, and application, and stick with the one I like.

In low-power applications, I want the LED to consume the lowest power. Hence I pulse the LED for 1-10ms, depending on the brightness I want to achieve.
 

crutschow

Joined Mar 14, 2008
34,281
I also have some red LEDs that are 1.8V - 2.2V which, as you might expect, are considerably lower when it comes to current.
You still seem confused.
The color of the LED has nothing to do with the desired current through it.
The current you want is determined by the current (not voltage) rating of the LED and how bright you want the LED to be.
The LED voltage is only used when you calculate the value of the resistor for the current you want.
 

WBahn

Joined Mar 31, 2012
29,978
I am aware of that. It is written on the packaging. I used the 3.8V merely as an example. I have white, blue and green LEDs that are 3.2V - 3.8V, although they exhibit slightly different currents at each voltage. I also have some red LEDs that are 1.8V - 2.2V which, as you might expect, are considerably lower when it comes to current. I am aware of datasheets. My query was not about the limiting characteristics of LEDs. I guess you could say it was about best practice when determining the current to use in the calculation of resistors in LED circuits.
If you have the packaging, then you should also have the information that it gives about the current. THAT is what you need to center your design around. THAT is the best practice for determining the current to use.
 
Last edited:

BobTPH

Joined Jun 5, 2013
8,812
I have white, blue and green LEDs that are 3.2V - 3.8V although they exhibit slightly different currents at each voltage. I
You still appear not to understand the meaning of that range of voltages. Again, it does not mean they can be operated anywhere in that range. It means that, when operated at the stated current, the voltage drop will be somewhere in that range. In other words, if you have three LEDs of the same type and operate them at the given current, one might be 3.2V another 3.5V and the third one might be 3.8V.

Bob
 

MisterBill2

Joined Jan 23, 2018
18,168
If you have information such as the rated power (watts) or the rated current (mA) and a nominal value for the voltage drop, then you can calculate the voltage drop required across the series resistor. And with the rated power and the nominal voltage-drop data you can calculate the intended current fairly closely. I have a string of LEDs rated at 200mA, and at that current they are quite bright indeed. But since they are for general illumination, that is reasonable.
 

Tonyr1084

Joined Sep 24, 2015
7,852
(+V -Vf)÷(desired current) = needed resistance.

Whatever voltage you start with (+V) you subtract the LED's Vf (forward Voltage). Then you divide that by whatever current you want to run the LED at. Keep in mind the data sheet. It will tell you a max current recommendation, which MIGHT be 30mA (0.03A). You choose what current you want to run through the LED. Your choice will depend on the desired brightness and whatever power source you're using.

Example:
(+24V - 3.2Vf) ÷ 30mA (0.03A) = 693.3Ω. That will conduct 30mA of current through the entire circuit, the resistor and the LED. But you also need to calculate for wattage.
The resistor drops 20.8V at 30mA: 20.8V x 0.03A = 0.624W (624mW). You can't use a quarter-watt or half-watt resistor, it will burn up. In this example you would need a 1 watt resistor.
And since you're not going to find a 693.3Ω resistor you'll probably end up using a standard value resistor. Closest to that is 680Ω. So at 680Ω the total current through the circuit will be (+24 - 3.2) ÷ 680 = 30.6mA. Likely too much for the LED. You would choose the next higher common resistance, say 1kΩ. That will give you 20.8mA.

Now, if you want to put two resistors in series you could use a 470Ω and a 270Ω for a total of 740Ω. There you would have 28mA.
Keep in mind these are all examples. But now you should know how to calculate the proper resistor value and wattage. BOTH are important.
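The whole recipe in this post (ideal value, standard value, current check, wattage check) can be sketched in one short script. One assumption: it steps up to the next E12 standard value (820 Ω) rather than the 1 kΩ used above, so the numbers differ slightly:

```python
# Full sizing pass: ideal resistance, next standard value at or above it,
# then the resulting current and the power the resistor itself dissipates.
E12 = [100, 120, 150, 180, 220, 270, 330, 390, 470, 560, 680, 820, 1000]

V_SUPPLY, VF, I_TARGET = 24.0, 3.2, 0.030  # the 24 V example above

r_ideal = (V_SUPPLY - VF) / I_TARGET              # ≈ 693 ohms
r_std = next(r for r in E12 if r >= r_ideal)      # 820 ohms
i = (V_SUPPLY - VF) / r_std                       # resulting LED current
p_resistor = i ** 2 * r_std                       # power in the resistor only
print(f"{r_std} ohms -> {i * 1000:.1f} mA, resistor dissipates {p_resistor:.2f} W")
```

Note the power figure uses only the resistor's own drop (I²R), which is the dissipation that matters when choosing the resistor's wattage rating.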
 
Last edited: