LED question

Thread Starter

wallaby

Joined Jul 26, 2011
34
I hope this is a simple question:
I have several LEDs I want to use from a single voltage supply. I can do the math and find the required resistor values, but here comes the question: my voltage supply is set at a max of 3 volts. One of my LEDs has a Vf of 3 volts, so the calculation gives a resistor of zero ohms. Is it OK to run this LED without a resistor... or should the supply voltage always be higher, with the current then choked down by resistors?

The next question deals with the resistors themselves: does it matter if I choose 1/8 watt, 1/4 watt, or 1/2 watt resistors? They may all be labeled as say 1k ohm, but seem to behave differently in the actual circuit.
 
My voltage supply is set at a max of 3 volts. One of my LEDs has a Vf of 3 volts... does it matter if I choose 1/8 watt, 1/4 watt, or 1/2 watt resistors? ...
The bottom line is that you can't use a 3V LED on a 3V supply; in fact, even a 2V LED will be a problem. If you don't have a higher voltage available, you will need a boost circuit.

Yes, the power rating matters; it is calculated from the current through the resistor and the voltage across it.
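A quick worked example (supply and LED numbers assumed purely for illustration): with a 5V supply and a 2V LED run at 20mA, the resistor must drop 3V, so R = 3V / 0.02A = 150 ohms and P = 0.02A x 3V = 60mW, comfortably within a 1/8W (125mW) part.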
 

wayneh

Joined Sep 9, 2010
17,496
Pretty much as noted. Any LED has a narrow range of voltage where it goes from, on the low end, no current or light, to the high end, too much current and rapid destruction. This range may be only 1/2V or less, and will change with temperature. That's why LEDs are usually used under current control.

That said, there are many commercial devices that rely on staying in that narrow voltage range. If your LED lights at all at 3V, you might be fine to direct connect it, especially if your supply is a battery. Small batteries are current-limited by nature.
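A rough illustration of that point (battery figures assumed, not taken from a datasheet): a fresh CR2032 coin cell sits near 3.2V and has an internal resistance on the order of 15 to 30 ohms, so into a 3.0V LED it can only push roughly (3.2V - 3.0V) / 20 ohms = 10mA, which is why the no-resistor coin-cell trick usually gets away with it.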
 

Ogu Reginald

Joined Oct 13, 2011
6
It is OK to connect the LED directly to the power supply. Please note that the power ratings of resistors matter a lot and should be calculated using P = (V*V)/R, P = I*V, or whichever power formula fits the parameters you have at hand.
 

w2aew

Joined Jan 3, 2012
219
I hope this is a simple question:
I have several LEDs I want to use from a single voltage supply. I can do the math and find the required resistor values, but here comes the question: my voltage supply is set at a max of 3 volts. One of my LEDs has a Vf of 3 volts, so the calculation gives a resistor of zero ohms. Is it OK to run this LED without a resistor... or should the supply voltage always be higher, with the current then choked down by resistors?

The next question deals with the resistors themselves: does it matter if I choose 1/8 watt, 1/4 watt, or 1/2 watt resistors? They may all be labeled as say 1k ohm, but seem to behave differently in the actual circuit.
No, it's generally NOT OK to connect the LED directly across the supply. The LED is a diode. Like all diodes, once the forward voltage reaches the point where reasonable current starts to flow, any small increase in the voltage will cause a LARGE increase in the current. Plus, this characteristic is temperature dependent, and will also vary from device to device. Bottom line, without a current limiting resistor, you run the very real risk of putting too much current through the LED and destroying it.

The power rating of the resistor doesn't change how it will behave in your circuit, but it does determine whether it will survive in your circuit. What you need to do is calculate how much power is going to be dissipated by each resistor, and then choose a resistor whose power rating is greater than the actual dissipated power. For example, let's say you are using a 220 ohm resistor to limit the current in an LED to 15mA. This resistor will dissipate about 50mW of power, so a 1/8W resistor would do just fine. It's a good idea to select parts with a 2x safety margin.
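A minimal sketch of that calculation in Python (the 5V supply and 1.7V forward voltage below are assumed only so the numbers land on the 220 ohm / 15mA example above):

Code:
# Series-resistor sizing for an LED: illustrative numbers, not anyone's actual hardware.
def led_resistor(v_supply, v_forward, i_led):
    """Return (series resistance in ohms, power dissipated by the resistor in watts)."""
    v_drop = v_supply - v_forward   # voltage the resistor has to drop
    r = v_drop / i_led              # Ohm's law
    p = v_drop * i_led              # power burned in the resistor
    return r, p

r, p = led_resistor(v_supply=5.0, v_forward=1.7, i_led=0.015)
print(f"R = {r:.0f} ohm, P = {p * 1000:.0f} mW")  # about 220 ohm and 50 mW, so 1/8W is plenty

Pick the next standard value at or above R, and a power rating of at least roughly twice P.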
 

wayneh

Joined Sep 9, 2010
17,496
...any small increase in the voltage will cause a LARGE increase in the current
I don't disagree with the point but I do quibble with the language; "small" and "large" are not objective terms.

And in fact, within a range of about 500mV, current through the LED rises from 0 to 20mA or so, in a fairly linear way. You could just as well say 500mV is a "large" voltage change while 20mA is a "small" change in current. (Not so small to the LED ;))

The slope of the current-vs-voltage plot is not all that sensitive to temperature, either. If you can control voltage with ±10mV precision, you can control an LED.
 

w2aew

Joined Jan 3, 2012
219
I don't disagree with the point but I do quibble with the language; "small" and "large" are not objective terms.

And in fact, within a range of about 500mV, current through the LED rises from 0 to 20mA or so, in a fairly linear way. You could just as well say 500mV is a "large" voltage change while 20mA is a "small" change in current. (Not so small to the LED ;))

The slope of the current-vs-voltage plot is not all that sensitive to temperature, either. If you can control voltage with ±10mV precision, you can control an LED.
Agreed - but it takes more device knowledge to understand the specific V-I characteristics of the LEDs in question - more knowledge than a typical novice is likely to know - so I was just staying off of the potentially thin ice!
 

Thread Starter

wallaby

Joined Jul 26, 2011
34
Alright, not so stupid a question after all.
My project is a wind-driven generator and all it has to do is light up some LEDs. It has the potential to make more power, but of course it would also require more wind speed to do it. I have an adjustable voltage regulator in the circuit and figured my target voltage should be equal to the greatest Vf of my LEDs, with the other, lower-Vf LEDs just needing a resistor to keep them from drawing too much current.

It's a fine edge, because I want the lights to come on as soon as possible with minimal generator rpm. Right now my greatest LED Vf is 3.2V... should I ramp up my supply voltage to 4 volts? More?
Again, the higher I set the voltage, the faster the generator has to turn to get the lights to come on. Less is more in this case. I want a reliable circuit, but with minimal voltage requirements.
Thoughts?
 

Wendy

Joined Mar 24, 2008
23,415
The circuit I showed is pretty efficient, has been verified, and can be extremely bright. How stable is the voltage?

Large chunks of these circuits can be carved off to simplify them a bit.

Many people make the basic mistake of assuming that voltage is what matters to LEDs; it isn't. It is current. Before you can push current through the LED you must meet the minimum voltage spec, but once that is accomplished the voltage doesn't really matter; the current becomes the key parameter.
 

Wendy

Joined Mar 24, 2008
23,415
Yes, and this is a distraction. The batteries in question have an internal resistance that is used in lieu of an external resistance for the LEDs. If I fed those LEDs from a 4.5V source that could deliver several amps, they would get very bright for a little while, then go dark.
 

Thread Starter

wallaby

Joined Jul 26, 2011
34
Ok, so the way I visualize this is that the resistors act as a buffer to stabilize the current?
I can dial the output voltage to match the LED, but because the generator output isn't very stable it needs the resistor to help smooth it?

Really, right now my resistor values are in the 10 to 100 ohm range with the 3V supply.
Is there a minimum resistor value I should target, or a minimum supply voltage? I mean is there a rule of thumb for powering LEDs?
I thought I was making things simple by using a 3V supply: less energy wasted in resistors turning excess current into heat, and minimal rpm required of the generator to get the show going.
 

wayneh

Joined Sep 9, 2010
17,496
The general problem you're facing is how to turn a VERY variable supply into a stable output. The buck-boost approach Bill offered is "state of the art" for exactly that application. The only other reasonable approach would be a current-controlled circuit, but that will take more wind to light up (I think?) and burn off more power.

One thing you might like to add is a bar-graph approach, where an increasing number of LEDs light up as the generator gets going and has plenty of power for them.
 

Wendy

Joined Mar 24, 2008
23,415
The resistor is not a buffer. It could be called a ballast, though personally I don't like the term. It limits the current. While LEDs drop a voltage, the actual voltage is not important except for design purposes. The number that matters to an LED is the current it is fed. You can burn out an LED with too much current, and the rating that matters for full brightness is a current rating. The resistor helps set the current.

There is a type of regulator that is called a constant current source. While most regulators regulate voltage, these regulate current. This makes them very useful for LEDs in general.
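For a concrete (illustrative) example using a part mentioned later in this thread: an LM317 wired as a current regulator holds about 1.25V across its current-set resistor, so the output current is roughly I = 1.25V / R. A 62 ohm resistor gives about 20mA, and that resistor dissipates 1.25V x I (only about 25mW at 20mA). Keep in mind the regulator needs a few volts of headroom above the LED's Vf to stay in regulation.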
 

DickCappels

Joined Aug 21, 2008
10,152
THE ANSWER IS: It depends.

If you know the LED's characteristics, in other words, if you have the manufacturer's data sheet, and

if you see from the LED's I vs. V curves that the maximum current would not be exceeded (including temperature extremes, device tolerances, and battery tolerance),

then it should be OK.

Otherwise, you won't know if it will keep the LED dark, work fine, or burn it out, and you should use Bill's circuit, in my opinion (seems everybody has one).
 

Wendy

Joined Mar 24, 2008
23,415
Heh, I had forgotten I posted those here. They are from my library, which I am building up over time. They are a 555 version of a joule thief. Thinking about it, any good joule thief circuit would work for you (google it).

If you want to try these circuits allow me to redraw them to something you might find more useful. What is the max current you could get from your power arrangement?
 

Audioguru

Joined Dec 20, 2007
11,248
Does anybody make a "3V" LED?
LEDs have a RANGE of forward voltage, not a fixed number like an incandescent light bulb.
A blue or white LED is typically 3.3V but could be as high as 3.8V, even for parts with the same part number from the same manufacturer.
A "typical" blue or white LED with a voltage of 3.3V at 20mA barely lights at 3.0V. One with a higher forward voltage will not light at 3.0V.
A yellow LED has a range of 2.2V to about 3.0V.

Does anybody make a "3.0V" battery?
A new alkaline battery is 3.2V and a new lithium disposable cell is 3.15V.
 

DickCappels

Joined Aug 21, 2008
10,152
Yes, many if not most LED manufacturers supply lower voltage LEDs, because for LEDs used for illumination, forward voltage drop is an important specification. The LEDs are binned during manufacturing into narrow voltage ranges. For example, the Refond RF-INRA30DS-EF white LED's lowest bin is 2.8 volts. Everlight GT3528/X2C-BXXXXXXXXX/2T series white LEDs are in 100 mV bins ranging from 2.7 to 3.6 volts at full current.

I don't like operating LEDs without resistors, except when being driven by a current source, but if one is careful and has good control of the components, one can make a reliable light that does not use resistors.
 

Thread Starter

wallaby

Joined Jul 26, 2011
34
Here is a diagram of what I have.
Right now I have a voltage regulator installed and set at 3 volts, but by the time the RPMs give me an output of 3 volts I'm already pushing 1.5 amps.
As the RPMs go up the amperage climbs... a practical ceiling is in the neighborhood of 5 amps, but it's impossible to tell how fast this thing can actually rotate when powered by the wind outside. My test bench data may give much higher numbers than it will ever see in real use.
If I go the route of using a constant current source, can I ditch the voltage regulator? I'm using an LM317, so it's just a matter of changing the wiring to set it up for current regulation.
Would I want to shoot for 120mA? Or maybe two banks set at 60mA?

Really, I'm open to suggestions. The basic wiring I have may be a bad arrangement(?)

[Attachment: windmill wiring.jpg]
 