What resistor value

Thread Starter

Heinz57

Joined Dec 16, 2009
24
Hi all,

I have 3 LEDs rated at 1.85 - 2.5V (max) and a current of 20mA. They are powered from a 12V DC supply.

I just want to check which resistor values to use. I worked it out at 600Ω. Is this correct?

Cheers,

Heinz
 

SgtWookie

Joined Jul 17, 2007
22,230
You start out using their typical Vf @ current rating.

Then calculate:
Rlimit >= (Vsupply - (Vf_LED_total)) / Desired_Current (this is for three LEDs wired in series, not parallel)

You are better off using too low a Vf in your calculation than too high.

Is your DC supply regulated, or a "wall wart"? Wall warts are not typically regulated. If you are in doubt, measure the output voltage using a multimeter or DMM before you do anything.
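
If it helps to see the same arithmetic spelled out, here's a minimal Python sketch of that series-string calculation. The 12V supply, 1.85V typical Vf, three LEDs and 20mA target are just the figures from this thread; substitute your own datasheet numbers.

Code:
# Series LED resistor sizing (values from this thread; substitute your own)
V_SUPPLY = 12.0      # supply voltage, volts
VF_LED = 1.85        # typical forward voltage per LED, volts
NUM_LEDS = 3         # LEDs wired in series
I_TARGET = 0.020     # desired current, amps (20 mA)

# Rlimit >= (Vsupply - total LED forward voltage) / desired current
v_drop = V_SUPPLY - NUM_LEDS * VF_LED
r_limit = v_drop / I_TARGET
print(f"Voltage across resistor: {v_drop:.2f} V")
print(f"Minimum limiting resistance: {r_limit:.1f} ohms")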
 

Thread Starter

Heinz57

Joined Dec 16, 2009
24
I've just done:

(12 - ( 1.85 )) / 20

That's given me 0.5075. It doesn't seem right?

The supply is a 12v DC model railway controller.
 

SgtWookie

Joined Jul 17, 2007
22,230
Actually, it would be:
Rlimit >= (12 - (3*1.85)) / 20mA
Rlimit >= (12 - 5.55) / 0.02
Rlimit >= 6.45 / 0.02
Rlimit >= 322.5 Ohms
322.5 is not a standard value of resistance. A table of standard resistor values is here:
http://www.logwell.com/tech/components/resistor_values.html
Bookmark that page.
330 Ohms is the closest standard value.
Re-calculating to determine current:
6.45v / 330 Ohms = 19.545...mA

Calculate the wattage requirement:
P = EI (Power in Watts = Voltage x Current)
6.45v * 19.545mA = 126mW; we double this for reliability: 252mW. That is less than 1% over 1/4 Watt, so you can use a 1/4W resistor.

Now if you were running single LEDs with single resistors, you would calculate using just one LED's Vf as Beenthere did.
Note that the wattage requirement will be higher.
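
For a rough double-check of the numbers above, here's a short Python sketch that picks the next standard resistor value at or above the calculated minimum (so the current stays at or below the target) and re-derives the current and resistor power. The E24 series is assumed here for "standard" values; with this thread's figures it lands on 330 Ohms.

Code:
# Re-check current and power once a standard resistor is chosen
# (thread values; E24 series assumed for "standard" values)

V_ACROSS_R = 6.45    # volts left over after the three LED drops
R_LIMIT = 322.5      # minimum resistance from the formula above

# E24 base values for one decade; scale by powers of ten to cover real parts
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def next_standard_value(r_min):
    """Smallest E24 resistor at or above r_min."""
    candidates = [base * 10**exp for exp in range(0, 7) for base in E24]
    return min(r for r in candidates if r >= r_min)

r_std = next_standard_value(R_LIMIT)          # 330 ohms here
i_actual = V_ACROSS_R / r_std                 # ~19.5 mA
p_resistor = V_ACROSS_R * i_actual            # ~126 mW dissipated in the resistor
print(f"Standard resistor: {r_std:.0f} ohms")
print(f"Actual current: {i_actual * 1000:.2f} mA")
print(f"Resistor dissipation: {p_resistor * 1000:.0f} mW "
      f"(double it for margin: {2 * p_resistor * 1000:.0f} mW)")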
 