How do you know what resistor to use in a simple LED circuit?

Thread Starter

Berticus81

Joined Aug 4, 2022
2
Greetings,
I am playing around with LEDs and I would like to try to build my own LED flashlight. If I am using Ohm's law to find the resistance and I end up with something like 6.17 ohms after the calculation, what resistor should I use? I don't think I'll find a 6.17 ohm resistor to buy. Do you round down to 6?
Thanks.
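For reference, the usual series-resistor calculation looks like this. A quick Python sketch with hypothetical numbers (the supply voltage, forward voltage, and target current below are placeholders, not the poster's actual circuit):

```python
# Hypothetical example values -- substitute your own circuit's numbers.
V_SUPPLY = 3.3   # supply voltage in volts (assumed)
V_FORWARD = 2.0  # LED forward voltage in volts (assumed)
I_LED = 0.020    # desired LED current in amps (20 mA, assumed)

# Ohm's law: the resistor must drop (V_SUPPLY - V_FORWARD) at current I_LED.
r = (V_SUPPLY - V_FORWARD) / I_LED
print(f"Calculated resistance: {r:.2f} ohms")
```

With these assumed values the formula gives 65 ohms; a real circuit will give some other non-standard number, which is exactly the rounding question asked above.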
 

bertus

Joined Apr 5, 2008
22,120
Hello,

Could you show us your circuit with the calculations?
Resistors come in "standard" values.
There are several E-series:
[attachment: Vishay resistor colour-code chart showing the E-series values]
Take the next higher resistor value.
Taking the next higher value will lower the current a bit, but it is safer for the LED.
6.2 Ω will be close to the value you found.

Bertus
 

Irving

Joined Jan 30, 2016
3,190
Yes, down or up to the next standard value. Resistor values come in ranges known as the 'E' series, the most common being E24, as shown. For your requirement, 6.2 Ω would be the nearest. Don't forget to calculate the wattage of the resistor, as this will determine its physical size.
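The "round up to the next standard value" rule can be sketched in Python. The E24 base values below are the standard one-decade list; the decade handling and the 20 mA figure in the wattage comment are illustrative assumptions:

```python
import math

# Standard E24 base values for one decade; the real series
# repeats these in every decade (0.62, 6.2, 62, 620, ...).
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def next_higher_e24(r):
    """Smallest E24 value >= r (the safer choice for an LED resistor)."""
    decade = 10 ** math.floor(math.log10(r))
    for base in E24:
        if base * decade >= r:
            return base * decade
    return 10 * decade  # roll over into the next decade

print(next_higher_e24(6.17))  # the thread's 6.17 ohms rounds up to 6.2

# Wattage check at an assumed 20 mA LED current:
p = 0.020 ** 2 * next_higher_e24(6.17)  # P = I^2 * R, a few milliwatts here
```

Rounding up rather than down slightly reduces the current, which errs on the safe side for the LED, as noted above.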

 

Thread Starter

Berticus81

Joined Aug 4, 2022
2
Thank you all for the replies.
I don't have the parts in front of me, but I think the LED I was planning to use was a 5 V, 0.5 W one.

Here is another question: If I am using a constant current or constant voltage device/component, do I still need a resistor?

Thanks
 

Dodgydave

Joined Jun 22, 2012
10,591
If you're using a constant-current supply, you won't need a series resistor; just make sure the current doesn't go over the LED's maximum rating.
 

MrChips

Joined Oct 2, 2009
27,702
Let’s say your LED needs at least 2V to operate.
Use a supply voltage that is greater than twice 2V, i.e. greater than 4V.
Start with a resistor value of about 10kΩ.
Gradually reduce the resistor value until you get a comfortable brightness from the LED.
Done.
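The dial-it-down procedure above can be tabulated. A small Python sketch, assuming the 5 V supply and 2 V forward voltage mentioned in this thread, shows why starting at 10 kΩ is safe (well under 1 mA) and how the current grows as the resistance drops:

```python
V_SUPPLY = 5.0  # assumed supply voltage from the thread
V_F = 2.0       # assumed LED forward voltage

# Step down through a few standard values, as MrChips suggests.
for r in (10_000, 4_700, 2_200, 1_000, 470, 220):
    i_ma = (V_SUPPLY - V_F) / r * 1000  # LED current in mA
    print(f"{r:>6} ohm -> {i_ma:6.2f} mA")
```

At 10 kΩ the LED sees only about 0.3 mA, so no value in the sequence can damage it before you reach a comfortable brightness.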
 

Papabravo

Joined Feb 24, 2006
19,614
If you are going to use a resistor, you choose a value to set the approximate current. To convince yourself, compute the current for one value up or down from the chosen value and look at the change in current. Once you have chosen a resistor to set the current, be sure to compute the power dissipation in the resistor to make sure it can handle it. Choose a resistor rated for about twice the expected dissipation.
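Both checks described above, the sensitivity to one E24 step and the power dissipation, fit in a few lines. The 5 V supply, 2 V forward voltage, and 150 Ω chosen value below are hypothetical:

```python
V_SUPPLY = 5.0  # assumed supply voltage
V_F = 2.0       # assumed LED forward voltage

# One E24 step below, the chosen value, and one step above.
for r in (130.0, 150.0, 180.0):
    i = (V_SUPPLY - V_F) / r  # LED current set by the resistor
    p = i ** 2 * r            # power dissipated in the resistor
    print(f"{r:5.0f} ohm: {i * 1000:5.1f} mA, {p * 1000:5.1f} mW")
```

Here 150 Ω gives 20 mA and 60 mW in the resistor; applying the two-times margin suggested above, a 1/8 W (125 mW) part would just do and a 1/4 W part is comfortable. The adjacent values shift the current by only a few mA, which shows how forgiving the choice is.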
 

Tonyr1084

Joined Sep 24, 2015
7,204
I'm going to question your conclusion that you need a 6Ω resistor.

I'm assuming a supply of 2.125V with a 2V forward voltage, leaving 0.125V. Dividing that by 6Ω, I come up with about 21mA. I'm imagining a regular LED, not a COB or a high-wattage LED.
In post #7 you say 5V source and 500mW (actually you said 0.5W; same thing). 500mW ÷ 5V = 100mA. We can't be discussing a simple LED.

We need more information before we can tell if you've got your numbers right. To push 100mA through a 2V LED from a 5V supply you'd need roughly a 30Ω resistor ((5V − 2V) ÷ 0.1A). At least if my numbers aren't wrong, which does happen from time to time. I still don't think you've done the math correctly.

What voltage?
What is the forward voltage of the LED?
What's the desired current through the LED?
 