LED Brightness vs. Power/Current

Thread Starter

Austin Clark

Joined Dec 28, 2011
412
I've noticed during my work with LEDs that you seem to get diminishing gains in brightness with a rise in current/power. That is, I can put in less current than I can easily measure and get a slight glow, and if I increase the current even slightly, the brightness increases very quickly; but by the time the LED is almost fully lit, I can add 5 more mA and see only a marginal change, even though I'm adding 33% more current (and probably closer to 40% more power, because of the increased voltage as well).

Do our eyes see light logarithmically, like how we hear (a sound must roughly double in amplitude in order to rise by one step in perceived "loudness"), or do you really get a diminishing return? Perhaps it's partially both. I'd like to see a graph of the efficiency of an LED over current/power if possible, and/or total light output over current/power. This could be relevant if you're trying to use LEDs as signals and power usage must be kept as low as possible.

Another interesting question might be: which colors are the most efficient? I'd assume red, because it seems to want to draw the same current to achieve full brightness, but at a lower voltage.

Lastly, are super-bright LEDs a TON more efficient, or is it just me? I haven't done any measurements, but it seems that ultra-brights will output a ton more light at the same power.
 

Wendy

Joined Mar 24, 2008
23,421
Yes, newer LEDs are more efficient.

LEDs do not use voltage, but current. You can not force the voltage across a LED to change, except with things like temperature. It is current, and current only, that a LED responds to. And as you have noticed, they are notoriously nonlinear.

LEDs within the same batch vary their voltage drop somewhat. With various colors, it is quite dramatic.

LEDs, 555s, Flashers, and Light Chasers
 

Thread Starter

Austin Clark

Joined Dec 28, 2011
412
Yes, newer LEDs are more efficient.

LEDs do not use voltage, but current. You can not force the voltage across a LED to change, except with things like temperature. It is current, and current only, that a LED responds to. And as you have noticed, they are notoriously nonlinear.

LEDs within the same batch vary their voltage drop somewhat. With various colors, it is quite dramatic.

LEDs, 555s, Flashers, and Light Chasers
Technically, LEDs respond to both voltage and current simultaneously, as they're directly related. LEDs respond VERY dramatically to any change in voltage, though, which is why we consider them to be current-controlled.

Anyways, my question was, how does brightness relate to current? Obviously there's a positive relationship, but does a linear increase in current cause a linear increase in brightness/total light output? And are different colors more/less efficient in total (in terms of power in vs light power out)?

Another thing I didn't touch on is the fact that, at LARGE over-currents, the frequency of the light changes and the brightness seems to decrease, so there must be a point at which you get the MOST actual light out, but I'm thinking that at that point the total efficiency is relatively low.

In summary, it seems that the less overall power you give it, the larger the fraction of that power that comes out as light.
 

bountyhunter

Joined Sep 7, 2009
2,512
I've noticed during my work with LEDs that you seem to get diminishing gains in brightness with a rise in current/power.
Correct. Two reasons:

1) LEDs are most efficient at the lowest current levels; as current increases, carriers are banging into each other and interfering with light emission.

2) The human eye is not linear. In other words, for a light source to LOOK twice as bright to you, it has to more than double its true light output.
 

radiohead

Joined May 28, 2009
514
LEDs must have a current-limiting resistor. Use the correct value for optimal life of the LED.

Supply voltage minus LED voltage = voltage that must be dropped by the resistor.
Divide that voltage by the LED current to get the resistor value.
Multiply that voltage by the LED current to get the power dissipated in the resistor (choose a wattage rating above that).

If connecting LEDs in series, add all the LED working voltages, then subtract the total from the supply voltage to get the resistor drop. The total LED voltage should not exceed about 80% of the supply voltage for stable operation.
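If you want to automate that arithmetic, here is a minimal Python sketch of the same calculation (the supply voltage, LED forward voltage, and current in the example are placeholder numbers, not from any particular datasheet):

```python
def led_resistor(supply_v, led_vf, led_current_a):
    """Series resistor value and dissipation for a single LED.

    supply_v      -- supply voltage in volts
    led_vf        -- LED forward voltage drop in volts
    led_current_a -- desired LED current in amperes
    """
    drop = supply_v - led_vf           # voltage the resistor must drop
    resistance = drop / led_current_a  # Ohm's law: R = V / I
    power = drop * led_current_a       # power dissipated in the resistor: P = V * I
    return resistance, power

# Example with placeholder numbers: 5 V supply, 2.0 V red LED, 15 mA
r, p = led_resistor(5.0, 2.0, 0.015)
print(f"R = {r:.0f} ohm, resistor dissipation = {p * 1000:.0f} mW")
# -> R = 200 ohm, resistor dissipation = 45 mW
```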
 

Wendy

Joined Mar 24, 2008
23,421
Technically, LEDs respond to both voltage and current simultaneously, as they're directly related. LEDs respond VERY dramatically to any change in voltage, though, which is why we consider them to be current-controlled.

Anyways, my question was, how does brightness relate to current? Obviously there's a positive relationship, but does a linear increase in current cause a linear increase in brightness/total light output? And are different colors more/less efficient in total (in terms of power in vs light power out)?

Another thing I didn't touch on is the fact that, at LARGE over-currents, the frequency of the light changes and the brightness seems to decrease, so there must be a point at which you get the MOST actual light out, but I'm thinking that at that point the total efficiency is relatively low.

In summary, it seems that the less overall power you give it, the larger the fraction of that power that comes out as light.
This is incorrect. A specific current at a specific temperature will cause a LED to drop a specific voltage; this is not a variable. It is current, not voltage, that is the key parameter. In this sense LEDs are pretty stable; they can be used as voltage references, much like a zener. They are also spec'ed for a specific current, and the voltage drop is given so you can calculate the resistance needed to create the needed current.

You can believe me or not, but you have this one wrong, completely. I am constantly seeing people try to regulate current on LEDs by controlling voltage. A lot of good parts die this way, because they are operating from an incorrect base theory.

Reading is your friend.
 

Thread Starter

Austin Clark

Joined Dec 28, 2011
412
This is incorrect. A specific current at a specific temperature will cause a LED to drop a specific voltage; this is not a variable. It is current, not voltage, that is the key parameter. In this sense LEDs are pretty stable; they can be used as voltage references, much like a zener. They are also spec'ed for a specific current, and the voltage drop is given so you can calculate the resistance needed to create the needed current.

You can believe me or not, but you have this one wrong, completely. I am constantly seeing people try to regulate current on LEDs by controlling voltage. A lot of good parts die this way, because they are operating from an incorrect base theory.

Reading is your friend.
Given a specific voltage, LEDs will conduct a specific amount of current, but the relationship is very non-linear. You can't change the current going through an LED without changing the voltage going in (So, again, LEDs are actually controlled via both simultaneously. It's just semantics.), which is what the series resistors actually do. They act as voltage dividers with the series LED, so that the correct voltage is dropped across it even as the supply voltage changes. Because LEDs don't have a constant resistance, the ratio of their resistances shifts in favor of the resistor as the supply voltage increases, meaning more and more of the voltage is dropped across the resistor, and so overall the actual voltage across the LED (or any other diode) changes very little with the change in supply voltage! It's much more complicated to think of it that way, but it's not incorrect. LEDs don't always drop a certain voltage regardless of current, but it's an OK approximation to make in practice. The reason so many people blow up LEDs trying to regulate them with a voltage supply is that an extremely small rise in voltage will cause a large increase in current, often destroying the component, so it's extremely difficult (and usually dumb) to do in practice.

I learned all this from one of my old posts here originally:
http://forum.allaboutcircuits.com/showthread.php?t=70009
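If you want to see the numbers behind my point, here is a rough Python sketch that just solves the supply + series resistor + LED loop numerically, assuming the textbook Shockley model with made-up parameters (the saturation current and n·Vt below are picked to look vaguely like a red LED, not taken from any datasheet, and the temperature is held constant):

```python
import math

def operating_point(vs, r, i_sat=1e-18, n_vt=0.05):
    """Solve a supply + series resistor + LED loop by bisection.

    vs    -- supply voltage (V)
    r     -- series resistance (ohms)
    i_sat -- diode saturation current (A), a made-up placeholder
    n_vt  -- emission coefficient times thermal voltage (V), also a placeholder
    Returns (LED voltage, loop current).
    """
    lo, hi = 0.0, vs
    for _ in range(100):
        vd = (lo + hi) / 2.0
        i_led = i_sat * (math.exp(vd / n_vt) - 1.0)  # Shockley diode equation
        i_res = (vs - vd) / r                        # same current must flow in R
        if i_led < i_res:
            lo = vd   # LED not conducting enough yet -- raise its voltage
        else:
            hi = vd
    return vd, (vs - vd) / r

# Sweep the supply: the current moves a lot, the LED voltage hardly at all
# (only a few tens of millivolts across this sweep).
for vs in (4.0, 5.0, 6.0):
    vd, i = operating_point(vs, 200.0)
    print(f"Vs = {vs:.1f} V  ->  V_LED = {vd:.3f} V, I = {i * 1000:.1f} mA")
```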
 

Audioguru

Joined Dec 20, 2007
11,248
Your vision and hearing responses are logarithmic so you can see with moonlight or sunlight and hear a whisper or an extremely loud sound.
I am ignoring the iris in your eye that also adjusts its sensitivity to light.

9 or 10 times the light looks only twice as bright.
9 or 10 times the sound is twice as loud.
Attenuation works the same way.

Two LEDs produce almost the same brightness as one but the total power is doubled.
Two speakers playing at the same volume produce almost the same loudness as one but the total power is doubled.

Cheap LEDs look bright because they have a dim old LED chip in a package that focuses the beam, concentrating it into a narrow angle. Then you can't see it unless it points directly at you.
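For what it's worth, the "9 or 10 times the light looks only twice as bright" rule is roughly what you get from a Stevens-type power law with an exponent around 1/3. The exact exponent depends on viewing conditions, so treat this little Python sketch as illustrative only:

```python
# Rough sketch of the "9 or 10 times the light looks twice as bright" rule,
# assuming a Stevens-type power law with exponent ~1/3 (an assumption; the
# real exponent varies with viewing conditions).
EXPONENT = 1 / 3

def perceived_brightness(light_ratio):
    """Relative perceived brightness for a given ratio of physical light output."""
    return light_ratio ** EXPONENT

for ratio in (1, 2, 5, 10):
    print(f"{ratio:>2}x the light looks about {perceived_brightness(ratio):.2f}x as bright")
# 10x the light comes out around 2.15x as bright -- close to the rule of thumb above.
```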
 

bountyhunter

Joined Sep 7, 2009
2,512
Given a specific voltage, LEDs will conduct a specific amount of current, but the relationship is very non-linear. You can't change the current going through an LED without changing the voltage going in (So, again, LEDs are actually controlled via both simultaneously. It's just semantics.),
No, it's not semantics, it is electronics. The point is, the device powering the LED can only FORCE (control) either current or voltage, not both, and current is the one you regulate to hold brightness constant. If you feed an LED from a fixed voltage source it will blow up since its VF drops as it gets hotter... i.e., thermal runaway destruction.
 

bountyhunter

Joined Sep 7, 2009
2,512
(So, again, LEDs are actually controlled via both simultaneously. It's just semantics.), which is what the series resistors actually do. They act as voltage dividers with the series LED, so that the correct voltage is dropped across it even as the supply voltage changes, because LEDs don't have a constant resistance,
NO NO NO NO NO NO NO NO NO NO NO NO NO NO NO NO

The series resistor is not a "voltage divider", it is a "quasi" current source, since the voltage drop across the resistor is RELATIVELY constant compared to the LED... assuming sufficient voltage drop is allocated to the resistor. That achieves a more stable current through the LED as the supply voltage varies.
 

Thread Starter

Austin Clark

Joined Dec 28, 2011
412
No, it's not semantics, it is electronics. The point is, the device powering the LED can only FORCE (control) either current or voltage, not both, and current is the one you regulate to hold brightness constant. If you feed an LED from a fixed voltage source it will blow up since its VF drops as it gets hotter... i.e., thermal runaway destruction.
Yeah, LED brightness is based on current, and the amount of current generated for any given voltage changes with temperature, but the point I'm trying to make is that, in the end, for any given temperature, a specific voltage will generate a specific current in the LED. If you do the math (and the math is crazy) and find what the voltage is across the actual LED when you have a resistor in series with it, and then use that precise voltage on the LED without the resistor, the LED won't know the difference and will glow with the exact same brightness.
 

Thread Starter

Austin Clark

Joined Dec 28, 2011
412
NO NO NO NO NO NO NO NO NO NO NO NO NO NO NO NO

The series resistor is not a "voltage divider", it is a "quasi" current source, since the voltage drop across the resistor is RELATIVELY constant compared to the LED... assuming sufficient voltage drop is allocated to the resistor. That achieves a more stable current through the LED as the supply voltage varies.
The voltage drop across the resistor isn't constant at all; in fact, the voltage drop across the LED is the one that's relatively stable, not the drop across the resistor. Looking at diodes and seeing them as having a resistance that depends on voltage is perfectly fine mathematically, and it is perfectly fine as a real-world model as well; it's just less intuitive. I'd suggest looking at the math and looking at some simulations; eventually I think you'll get my point. :)
 

Audioguru

Joined Dec 20, 2007
11,248
Cheap Chinese LED flashlights do not have a current-limiting resistor. They use the internal resistance of the battery to limit the current. Sometimes the LEDs survive.
 

Wendy

Joined Mar 24, 2008
23,421
The voltage drop across the resistor isn't constant at all; in fact, the voltage drop across the LED is the one that's relatively stable, not the drop across the resistor. Looking at diodes and seeing them as having a resistance that depends on voltage is perfectly fine mathematically, and it is perfectly fine as a real-world model as well; it's just less intuitive. I'd suggest looking at the math and looking at some simulations; eventually I think you'll get my point. :)

Somehow I don't think you have designed many LED circuits, if you are using that convoluted logic.

Try designing from LED resistance. Oh wait! It is not a given parameter on the data sheet!

Fact is, you start from an assumed LED voltage drop, then calculate the resistor. That resistor will work for every circuit made thereafter, even though LED voltage drop does vary dramatically among the same lot. Methinks you are arguing just to do so, and aren't really concerned with reality. Electronics is all about reality and physics. Theory is only useful if you need to explain a behavior; it is how you use it that matters.

Cheap Chinese LED flashlights do not have a current-limiting resistor. They use the internal resistance of the battery to limit the current. Sometimes the LEDs survive.
True enough. Most survive because LEDs in general are incredibly rugged. You won't get the full lifespan from them, but the odds are you will lose the cheap flashlight first.

Fact is, a constant current source is a better regulator for LEDs than a constant voltage source, and a constant voltage source is useful because it pins down a major variable if you are using a resistor to control LED current.
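As an aside, one common hobby-grade constant-current source is an adjustable regulator such as an LM317 with a single resistor between its output and adjust pins; the output current is roughly the 1.25 V reference divided by that resistor. A quick Python sketch of that arithmetic (the 20 mA target below is an arbitrary assumption; check the regulator datasheet for the exact reference voltage and minimum load current):

```python
def lm317_current_source_resistor(target_current_a, v_ref=1.25):
    """Set resistor for an LM317 wired as a simple current source.

    I_out ~= V_ref / R, ignoring the small ADJ pin current (tens of microamps).
    """
    r = v_ref / target_current_a
    p = v_ref * target_current_a   # power dissipated in the set resistor
    return r, p

r, p = lm317_current_source_resistor(0.020)   # 20 mA target, an assumed value
print(f"R = {r:.1f} ohm ({p * 1000:.0f} mW in the resistor)")
# -> R = 62.5 ohm (25 mW); use the nearest standard value.
```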
 
Last edited:

Thread Starter

Austin Clark

Joined Dec 28, 2011
412
Somehow I don't think you have designed many LED circuits, if you are using that convoluted logic.

Try designing from LED resistance. Oh wait! It is not a given parameter on the data sheet!

Fact is, you start from an assumed LED voltage drop, then calculate the resistor. That resistor will work for every circuit made thereafter, even though LED voltage drop does vary dramatically among the same lot. Methinks you are arguing just to do so, and aren't really concerned with reality. Electronics is all about reality and physics. Theory is only useful if you need to explain a behavior; it is how you use it that matters.

True enough. Most survive because LEDs in general are incredibly rugged. You won't get the full lifespan from them, but the odds are you will lose the cheap flashlight first.

Fact is, a constant current source is a better regulator for LEDs than a constant voltage source, and a constant voltage source is useful because it pins down a major variable if you are using a resistor to control LED current.
My logic isn't convoluted, and, in fact, is the ONLY WAY you can truly understand WHY LEDs react the way they do, and why it's a pretty good approximation to take the supply voltage minus Vf, divided by the resistance. I don't take things on faith, and you can't abandon Ohm's law when it is convenient to. Ohm's law doesn't understand subtraction, so why does it seem to work so well in this case? That's what I'm trying to answer for you.

LEDs DO have effective resistance, but that effective resistance changes with voltage. If you try and do the math, it gets incredibly difficult to find the exact value of this effective resistance when you have another resistor in series and a variable voltage supply, but it's there. Look at the problem in terms of a load-line, change the resistance and input voltage, and see how the line reacts. Math doesn't lie, and you can NEVER escape voltage when you're talking about current, and vice versa; that's electronics 101.

In practice, of COURSE I use the approximation, and of COURSE I use something to give me consistent current; there's no reason not to. It's a quick, easy, and effective way to get what you want. HOWEVER, when you're trying to understand something on a deeper level, you have to strip away those layers of abstraction. The WHY matters to me just as much as the HOW.

In the end, I think you're just upset because I'm not OK with accepting the rule-of-thumb and moving on. I'm sorry, but I'm a natural-born scientist, and that just doesn't cut it for me.
 

cabraham

Joined Oct 29, 2011
82
Yeah, LED brightness is based on current, and the amount of current generated for any given voltage changes with temperature, but the point I'm trying to make is that, in the end, for any given temperature, a specific voltage will generate a specific current in the LED. If you do the math (and the math is crazy) and find what the voltage is across the actual LED when you have a resistor in series with it, and then use that precise voltage on the LED without the resistor, the LED won't know the difference and will glow with the exact same brightness.
Sorry but that is not how it works. I learned this in the university electronics lab around the mid 70's as an undergraduate EE major. If an LED is driven by a true current source, let's say 10 mA value, then the voltage is measured as 1.80V (red LED) at 25 deg C temp. Can we use a precise 1.80V voltage source to drive the red LED w/o a resistor? You seem to indicate we can, I say no way.

Placing a 1.80V constant voltage source across the LED will likely destroy it, even if said source is very low noise, free from transients. The Shockley diode equation which describes p-n junction devices is well known as follows:

Id = Is*(exp(Vd/Vt) - 1).

What is not well known is that "Is" is strongly dependent on temperature with a non-linear function. A small increase in temp results in a large increase in Is.

With voltage drive, the 1.80V is connected across the LED w/o a series resistor, and the current will be as per the above equation. Current times the 1.80V is the power dissipated by the LED. The temperature rises due to power dissipation, & so does Is. But Vd is constant at 1.80V, so Id must increase since Is has increased. The power will increase as a result, further increasing Is, further increasing power. This is thermal runaway & the LED likely does not survive.

With current drive, say we connected the LED to a 10 mA current source. The voltage settles at 1.80V, & the power is 0.180 W. This results in a temperature increase. With current drive the voltage is given by:

Vd = Vt*ln(Id/Is + 1).

An increase in temp results in an increase in Is as well. But Is is in the denominator of the above equation so that the LED forward voltage drop Vd goes down. Thus the current remains constant but voltage goes down, so that power decreases. No thermal runaway takes place. This is why it is imperative that LED lamps always be current driven, never voltage driven.

A 10 mA current source holding 1.80V across an LED produces results much different from attempting to place a 1.80V voltage source across the same LED. Thermal stability is very good with current drive, atrocious with voltage drive. LED lamps must always be current driven, no debate on this at all. Voltage drive for LED lamps means certain doom.

If only a voltage source is available, a series resistor of sufficient value is nearly as good as true current drive. With a 5.00V voltage source, a forward LED voltage drop of 1.80V, & a 316 ohm resistor, the forward current is around 10 mA. The LED heats up & Is rises. But the current cannot snowball because as soon as current increases, the resistor drops a larger voltage & the LED lamp voltage drops. This is inherently thermally stable.
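To make the runaway argument concrete, here is a toy Python simulation under stated assumptions: a fixed junction-to-ambient thermal resistance and Is doubling every 10 deg C, with placeholder numbers rather than data for any real LED. It just iterates power -> temperature -> Is, and shows the constant-voltage case diverging while the 5.00V + 316 ohm case settles near 10 mA:

```python
import math

# Toy thermal-runaway comparison: constant-voltage drive vs. 5.00 V + 316 ohm.
# Every number here (Is, n*Vt, thermal resistance, "Is doubles every 10 C") is
# a placeholder assumption, not data for any real LED.
N_VT  = 0.05      # emission coefficient times thermal voltage, volts
IS_25 = 2.3e-18   # saturation current at 25 C, amperes (picked so 1.80 V ~ 10 mA)
R_TH  = 400.0     # junction-to-ambient thermal resistance, K/W
T_AMB = 25.0      # ambient temperature, C

def i_sat(temp_c):
    """Saturation current, assumed to double for every 10 C rise."""
    return IS_25 * 2.0 ** ((temp_c - T_AMB) / 10.0)

def led_current(vd, temp_c):
    """Shockley diode equation."""
    return i_sat(temp_c) * (math.exp(vd / N_VT) - 1.0)

def solve_with_resistor(vs, r, temp_c):
    """Bisection for the LED voltage in a supply + series resistor loop."""
    lo, hi = 0.0, vs
    for _ in range(100):
        vd = (lo + hi) / 2.0
        if led_current(vd, temp_c) < (vs - vd) / r:
            lo = vd
        else:
            hi = vd
    return vd, (vs - vd) / r

print("Voltage drive, 1.80 V forced straight across the LED:")
temp, vd = T_AMB, 1.80
for step in range(8):
    i = led_current(vd, temp)
    if i > 1.0:
        print(f"  step {step}: current > 1 A -- thermal runaway, LED destroyed")
        break
    temp = T_AMB + R_TH * vd * i           # new junction temperature
    print(f"  step {step}: I = {i * 1000:6.1f} mA, Tj = {temp:5.1f} C")

print("Current limited by 316 ohm from a 5.00 V supply:")
temp = T_AMB
for step in range(8):
    vd, i = solve_with_resistor(5.0, 316.0, temp)
    temp = T_AMB + R_TH * vd * i
    print(f"  step {step}: I = {i * 1000:6.1f} mA, Vd = {vd:.3f} V, Tj = {temp:5.1f} C")
```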

I had to point this out. I experimented in the 70's using your proposed method. I drove a diode with a voltage source plus resistor, measured the voltage drop across the diode, then tried to place a voltage source of the measured value directly across the diode w/o a series resistor. My result - smoke!

Claude
 
Last edited:

Thread Starter

Austin Clark

Joined Dec 28, 2011
412
If you look at my previous posts, I pointed out that I was assuming constant temperature. I am aware that, in practice, it could be dangerous due to the strong dependence of Vf on temperature, but beyond that everything I have said, I think you'll find, is correct.
 

Wendy

Joined Mar 24, 2008
23,421
And you will find many people, who probably have a heck of a lot more experience than you, disagree.

You are not the only one to come up with this stuff. The reason it doesn't gain traction is you are making fundamental assumptions that are wrong.

In a LED the resistance you are referring to is a useless parameter; it is even further off the map than Vf. It is nonlinear and fundamentally unpredictable, and cannot be used to limit current through a LED circuit. Many people have tried, and many LEDs have died. Calculating from Vf has immediate, real, tangible benefits.

This is a teaching site. We have had folks come in with some off the wall theories and try to push them on beginners who are trying to learn how to use these components. Don't make that mistake.

I'm halfway tempted to close this thread because of this; however, it is your thread, you can take it where you want. It has gone pretty far afield of your original question, to the point that you are arguing points (not asking, but stating) that are pretty far off. What is the point? Is there an agenda being pursued, or do you have any questions left? You have rejected several explanations that are pretty well established, for reasons unknown.
 

bountyhunter

Joined Sep 7, 2009
2,512
What is not well known is that "Is" is strongly dependent on temperature with a non-linear function. A small increase in temp results in a large increase in Is.
Hence the thermal runaway self-destruct mode. That's why nobody EVER drives them from a voltage source.

Using a series resistor provides automatic negative feedback: as the current tries to increase, voltage drop across the resistor increases and reduces drop across diode.
 