Power LED with 12v Power Adapter

Thread Starter

xxkinetikxx

Joined May 2, 2011
4
First off hello forum as this is my first post!

I'm in need of a bit of help. I have a 12V AC-DC power adapter that I'm looking to power up 6 LEDs with.

The adapter specs read:
input: 120v AC 60Hz
output: 12v DC 300mA

Can I just calculate the voltage drop from the LEDs and put a resistor in the mix to get it where I need it to be, or am I missing something here?

In the past I've only ever powered LED circuits with batteries, never an adapter, and would just like to be sure before I pop the LEDs :eek:

Thanks in advance!
 

ErnieM

Joined Apr 24, 2011
8,377
Hello and welcome!

The simple answer is yes, you can do exactly what you ask: sum the drops of the LEDs, compute the difference from the supply voltage, use Ohm's law to find the value and power rating of your resistor, wire everything up in series, and see light and no smoke.

As far as I know a 12V adapter will come very close to the nominal 12V.

The complication is the total voltage: if, say, you have white LEDs that need 3V each, then six of them need 18 volts, so you would have to make two series strings of 3 LEDs each.

The other thing that may be an issue is that if the total of the LED drops gets close to 12V, then the current becomes very sensitive to the actual drop of the LEDs.
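ErnieM's recipe (sum the drops, take the difference, apply Ohm's law) can be sketched as a few lines of Python. The numbers here are assumptions for illustration: a 12V supply, white LEDs at roughly 3V forward drop, and a 20mA target current.

```python
# Sketch of the series-string calculation: supply voltage, per-LED
# forward drop, number of LEDs in the string, and target current are
# all assumed example values.

def series_resistor(v_supply, v_forward, n_leds, i_led):
    """Resistor (ohms) for n_leds in series at i_led amps."""
    v_string = v_forward * n_leds        # total LED voltage drop
    if v_string >= v_supply:
        raise ValueError("LED string drops more than the supply provides")
    return (v_supply - v_string) / i_led # Ohm's law on the leftover voltage

# One of two strings of three 3V white LEDs from a 12V adapter:
print(round(series_resistor(12.0, 3.0, 3, 0.020)))  # 150 ohms per string
```

Note that six 3V LEDs in one string would raise the ValueError, which is exactly why the advice is two strings of three.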
 

Adjuster

Joined Dec 26, 2010
2,148
First you need to decide how to arrange the LEDs. Unless their forward voltage is well below 2V, you will not be able to put all six in series with one resistor. You might need to connect them as two strings of three with separate resistors.

Do you know the LED current rating and forward voltage?

Also, unless you are sure that the mains adaptor is regulated, you may want to check its output voltage at the current expected to be taken by the LEDs, assuming this is less than 300mA.
 

Jaguarjoe

Joined Apr 7, 2010
767
A lot of times a regulated wallwart will have the word "regulated" right on it.
An unregulated 12V/300mA unit will put out ~15V unloaded, 12V @ 300mA, and ~9V when overloaded.
 

#12

Joined Nov 30, 2010
18,224
News Flash: LEDs are not resistors. They are more like "avalanche" devices: once they get enough voltage, there is no inherent current limit. Without a current-limiting resistor, they promptly smoke.

Make 2 strings of LEDs, like you were told, or all you will have left is spots in front of your eyes.
 

#12

Joined Nov 30, 2010
18,224
The math is right, the devices are wrong for that math.

You can put several LEDs in series, but they always need a current limiter.

It's like falling down the stairs. If you lean forward far enough, you will fall down. If there is more than one step (LED) below you (the voltage), you will just fall down farther.
 

Thread Starter

xxkinetikxx

Joined May 2, 2011
4
Gotcha. Well I'm in a bind then. I ran the math at ledcalculator.net and it said to wire it up in series with a 1 Ohm resistor so I picked one of those up.

Is that going to be enough?
 

#12

Joined Nov 30, 2010
18,224
You're still trying to use up (almost) all the voltage with LEDs. It won't be reliable, because the breakover voltage changes with temperature, color, and batch. (That calculator isn't as good as a real human.)

You need to waste at least a quarter of the voltage in the resistor so changes in current per changes in temperature are rather small. That's why everybody keeps telling you to make 2 strings of LEDs.

You can probably get away with wasting only 10% or 15% of the voltage in a resistor, if you hand pick the value, but I think the 1 ohm answer is going to smoke.
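The sensitivity argument above can be made concrete with a quick comparison. The forward-drop numbers below are assumptions chosen for illustration: a string dropping 7.2V behind a 240-ohm resistor, versus a string dropping 11.98V behind the 1-ohm resistor from the calculator.

```python
# Current through a resistor-limited LED string: the leftover supply
# voltage divided by the resistor. All voltages here are assumed
# example values.

def led_current(v_supply, v_string, r_ohms):
    return (v_supply - v_string) / r_ohms

# 240-ohm case: the same 0.3V shift in total forward drop barely matters.
print(round(led_current(12.0, 7.2, 240), 3))   # 0.02 A
print(round(led_current(12.0, 6.9, 240), 3))   # 0.021 A, about a 6% rise

# 1-ohm case: that 0.3V shift multiplies the current sixteen-fold.
print(round(led_current(12.0, 11.98, 1), 3))   # 0.02 A
print(round(led_current(12.0, 11.68, 1), 3))   # 0.32 A -- smoke
```

Same 0.3V of forward-drop variation, wildly different outcomes: that is why a decent fraction of the supply has to be dropped across the resistor.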
 

Adjuster

Joined Dec 26, 2010
2,148
@ xxkinetikxx: Think about it. The voltage drop on a 1ohm resistor at 20mA is V = I*R = 20mA*1Ω = 20mV.

This is only 0.17% of 12Volts, which really is nothing like enough compared with likely circuit variations. If the supply voltage is not exactly 12V, or the Vf for 20mA is not exactly 2.4V, you will have trouble.

Note also that Vf tends to get smaller as the LED gets warmer. This can lead to a dangerous condition called "thermal runaway", which might not become apparent until the circuit has been operating for some time.

A string of three 2.4V LEDs will require 7.2V, and so could be driven at 20mA from 12V through a resistor of (12V-7.2V)/0.02A = 240Ω.

If your supply went up to 15V on light load, the resistor would need to increase to (15V-7.2V)/0.02A = 390Ω.

If this sounds too inefficient, consider a string of four LEDs (4*2.4V = 9.6V), but five is too many unless your supply is over 12V.
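Adjuster's two figures follow from the same dropper-resistor formula; this snippet just reproduces them (the 2.4V forward drop and 20mA current are the values assumed in the post, not a property of any particular LED).

```python
# Dropper resistor for a series LED string: leftover voltage over
# the target current. Values match the worked example in the post.

def dropper(v_supply, v_string, i_led):
    return (v_supply - v_string) / i_led

print(round(dropper(12.0, 3 * 2.4, 0.020)))  # 240 ohms at nominal 12V
print(round(dropper(15.0, 3 * 2.4, 0.020)))  # 390 ohms if it floats to 15V
```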
 

KJ6EAD

Joined Apr 30, 2011
1,581
Since you can't rely on the regulation of your supply, use an LM317L regulator in constant current configuration with a 68Ω 5% resistor for control on series strings of 3 or 4 each.
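For reference, the LM317 in its constant-current configuration holds a nominal 1.25V between its OUT and ADJ pins, so the programming resistor sets the current as I = 1.25V / R. With KJ6EAD's suggested 68Ω:

```python
# LM317 constant-current source: the regulator maintains ~1.25V
# across the programming resistor, so the string current is fixed
# regardless of supply variation (within dropout limits).
V_REF = 1.25   # nominal LM317 reference voltage, volts
R_PROG = 68    # suggested programming resistor, ohms

i_led = V_REF / R_PROG
print(round(i_led * 1000, 1))  # 18.4 mA per string
```

That lands close to a typical 20mA LED rating with a little safety margin, which is presumably why 68Ω was suggested.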
 

Audioguru

Joined Dec 20, 2007
11,248
You need to know the actual voltage of the wall adapter, which will be much higher than its rating if it is not fully loaded.

Nobody makes "2.4V" LEDs. They have a range of voltages maybe from 2.0V to 2.8V.
Calculate a current-limiting resistor with 15% to 20% of the total voltage across it when all the LEDs are the minimum voltage and when all the LEDs are the maximum voltage.

LED-calculator programs have no common sense.
 

iONic

Joined Nov 16, 2007
1,662
Audioguru said:
You need to know the actual voltage of the wall adapter, which will be much higher than its rating if it is not fully loaded.

Nobody makes "2.4V" LEDs. They have a range of voltages maybe from 2.0V to 2.8V.
Calculate a current-limiting resistor with 15% to 20% of the total voltage across it when all the LEDs are the minimum voltage and when all the LEDs are the maximum voltage.

LED-calculator programs have no common sense.
I can only go by the OP's statements of the LED's ratings. I do use Linear's LED calculator, but usually input an LED current 2-3mA lower than its rated value.
 