LED Destructive Testing

Thread Starter

tracecom

Joined Apr 16, 2010
3,944
I bought a bag of 100 white LEDs from the Far East. They didn't cost very much, but they came with no documentation except a scrap of paper inside the bag with a notation that said "2027" or "2D27."

I checked one for voltage drop and found it to be slightly over 3v. I checked it three times using a 9v battery: 1003Ω, 8.07v in & 3.06 Vf (.003a); 469Ω, 8.02v in & 3.15 Vf (.007a); and 356Ω, 7.97v in & 3.20 Vf (.009a). Documents I had read led me to believe the voltage drop would be 3.5v.
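As a cross-check, the current in a series circuit follows from Ohm's law applied across the resistor, since the resistor is the element that obeys I = V/R. A minimal Python sketch using the readings above (the layout and variable names are mine):

readings = [          # (R in ohms, volts into the circuit, Vf across the LED)
    (1003, 8.07, 3.06),
    (469,  8.02, 3.15),
    (356,  7.97, 3.20),
]
for r, v_in, vf in readings:
    i_ma = (v_in - vf) / r * 1000   # current through the series resistor, in mA
    print(f"{r} ohm: {i_ma:.1f} mA")
# 1003 ohm: 5.0 mA, 469 ohm: 10.4 mA, 356 ohm: 13.4 mA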

I then tried to switch quickly among the three resistors to see any visible difference in brightness, but by the time I got one out and another in, I had lost my visual reference. So I substituted a 1k linear-taper pot, which proved a lot more educational. The change in the LED's brightness was obvious as I adjusted the pot. At 1k it was dim, but it brightened in what seemed to be a linear fashion as I decreased the resistance. (I thought about using a photocell to measure the brightness, but didn't have one on hand.)

From 1k down to about 16Ω the brightness increased, but around 16Ω the output from the LED dimmed and turned blue. When I increased the resistance, the blue changed back to white, but the white wasn't as bright as before. Each time I did this, the LED's output seemed to decrease further, until it burned out completely.
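In hindsight, a rough Ohm's-law estimate shows why the low end of the pot was fatal. A sketch, assuming the loaded supply stayed near 8v and Vf near 3.2v (values taken from the readings above):

v_supply, vf = 8.0, 3.2             # assumed from the earlier measurements
for r in (1000, 500, 100, 50, 16):  # pot settings, ohms
    i_ma = (v_supply - vf) / r * 1000
    print(f"{r} ohm: {i_ma:.0f} mA")
# 16 ohms gives roughly 300 mA, far beyond a typical 20 mA rating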

I had thought that I might reach a point at which further increases in current would not produce increased brightness. Then I would have checked the resistance, calculated the current, and thereby determined the optimum resistance for maximum brightness. That turned out to be wrong.

So now I still don't know what current to run through these LEDs for the best balance of output and life. I suppose I will revert to the 20mA rule of thumb... or does someone have a better suggestion?

By the way, I know this is "old news" to most of you, but I found it to be a good learning experience and wanted to share my findings.
 
Last edited:

SgtWookie

Joined Jul 17, 2007
22,230
Yep, that was valuable experience, and I'm sure that more people will learn from it.

The light intensity of an LED correlates closely with the current through it, up to a point. Beyond that point, it takes a good bit more current to increase the brightness, and at the same time you may notice a subtle change in the color of the light emitted. When you start to see the color change, you know you're putting too much current through the LED.

Most typical "super bright" LEDs nowadays will safely take 20mA. The easy way to find their Vf is to use an LM317 regulator with a 62Ω resistor connected from the OUT terminal to the ADJ terminal.

You can then supply the regulator with +V to the IN terminal, and place the LED between the ADJ terminal and -V. I suggest using 8v to 12v DC for the supply.

Note that an LM317 will drop a minimum of 3v across itself when used as a current regulator.
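The math behind that suggestion: the LM317 regulates so that a nominal 1.25v appears between its OUT and ADJ pins, so the resistor alone sets the current. A quick sketch:

v_ref = 1.25   # nominal LM317 reference voltage, volts
r_set = 62     # ohms, connected from OUT to ADJ as described above
i_ma = v_ref / r_set * 1000
print(f"regulated LED current: {i_ma:.1f} mA")  # about 20.2 mA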
 

Markd77

Joined Sep 7, 2009
2,806
You could set up two side by side at the maximum current you think they can take, disconnect one, and leave the other on for a day. Then reconnect the first one and compare brightness. Make sure nothing can catch fire if it all goes wrong, and add an extra safety margin for when the ambient temperature is higher.
 

Thread Starter

tracecom

Joined Apr 16, 2010
3,944
I checked one for voltage drop and was surprised to find about a 5v drop. I checked it three times using a 9v battery: 1003Ω, 8.07v in & 3.06 Vf (.005a); 469Ω, 8.02v in & 3.15 Vf (.010a); and 356Ω, 7.97v in & 3.20 Vf (.013a). Documents I had read led me to believe the voltage drop would be 3.5v.
As you can tell from the quote above, I measured the voltage into the LED and the voltage out of the LED and subtracted. Was my method of measuring the voltage drop incorrect? If I simply measure the voltage from one LED leg to the other, I get about 3v. Now I'm confused; which is the correct way to find the voltage drop? (Stupid question, I know.)
 

retched

Joined Dec 5, 2009
5,207
You were correct. Voltage before and after.

Using a battery: neg probe on -. With the pos probe, check the anode and then the cathode. Subtract. That is the voltage drop.
Well, with the meter on V. ;)
 

Thread Starter

tracecom

Joined Apr 16, 2010
3,944
You were correct. Voltage before and after.

Using a battery: neg probe on -. With the pos probe, check the anode and then the cathode. Subtract. That is the voltage drop.
Well, with the meter on V. ;)
Thanks.

Attached is my circuit with the voltage measurements annotated. Is the voltage drop 3.03v? That is what is measured across the LED, and it is also the difference between 7.95v and 4.92v.
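The annotated numbers are self-consistent: by Kirchhoff's voltage law, the LED drop and the resistor drop should sum to the supply voltage. A quick check with the values above:

v_led, v_resistor, v_supply = 3.03, 4.92, 7.95   # measured values from the schematic
print(f"{v_led + v_resistor:.2f} v")    # 7.95, equals the supply voltage
print(f"{v_supply - v_resistor:.2f} v") # 3.03, the LED drop, matching the direct measurement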
 

Attachments

Thread Starter

tracecom

Joined Apr 16, 2010
3,944
So, now I want to make sure I am using the correct semantics and terminology. Referring to the schematic in Post 7, which of the following are incorrect (if any)? Please provide a corrected version of any that are wrong.

1. The LED has a 3.03 volt drop.
2. The LED drops 3.03 volts.
3. There is a 3.03 volt drop across the LED.
4. The LED is dropping 3.03 volts.
5. The Vf of the LED is 3.03 volts.
6. The resistor has a 4.92 volt drop.
7. The resistor drops 4.92 volts.
8. There is a 4.92 volt drop across the resistor.
9. The resistor is dropping 4.92 volts.

I know this seems pedantic, but I want to get it right. Thanks.
 


Thread Starter

tracecom

Joined Apr 16, 2010
3,944
Regarding the circuit attached to Post 7 in this thread, I inserted my DVM in series with the circuit to measure the current. The measurement is 4.91ma, but the calculation indicates only 3ma (3.03v / 1003Ω = .003a). What am I doing wrong?
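One arithmetic point worth checking before blaming the meter: Ohm's law gives a resistor's current from the voltage across that resistor (4.92v here), not from the 3.03v across the LED. A quick sketch with the Post 7 values:

r = 1003            # measured resistor value, ohms
v_resistor = 4.92   # drop across the resistor (Post 7)
v_led = 3.03        # drop across the LED (Post 7)
print(f"{v_resistor / r * 1000:.2f} mA")  # 4.91 mA, matching the meter
print(f"{v_led / r * 1000:.2f} mA")       # 3.02 mA: the LED's drop divided by the resistor's ohms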
 

retched

Joined Dec 5, 2009
5,207
Well, some meters are not very accurate at such low currents.

Pull the resistor and meter it. Most resistors have a 5% to 10% tolerance or more, meaning a 100Ω resistor could actually be 110Ω.

So, meter the resistor and do the math again. If it is right, good.
If not, it may be the meter. Check the specs for its tolerances at low currents.
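To put rough numbers on the tolerance idea, a ±5% spread on a nominal 1k resistor only shifts the computed current modestly; a sketch, assuming about 4.92v across the resistor per the Post 7 annotations:

v = 4.92          # assumed drop across the resistor, volts
nominal = 1000    # nominal resistor value, ohms
for r in (nominal * 0.95, nominal, nominal * 1.05):  # 5% tolerance band
    print(f"{r:.0f} ohm: {v / r * 1000:.2f} mA")
# roughly 4.7 to 5.2 mA across the band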
 

Thread Starter

tracecom

Joined Apr 16, 2010
3,944
Well, some meters are not very accurate at such low currents.

Pull the resistor and meter it. Most resistors have a 5% to 10% tolerance or more, meaning a 100Ω resistor could actually be 110Ω.

So, meter the resistor and do the math again. If it is right, good.
If not, it may be the meter. Check the specs for its tolerances at low currents.
I measured the resistor (with the same DVM) and it measures 1003 ohms. The DVM may be the problem; it's not a Fluke. But at any rate, is the logic of my calculations correct?
 

Markd77

Joined Sep 7, 2009
2,806
At low currents, the resistance of the meter itself slightly reduces the voltage available to the circuit, which reduces the current in the circuit. At higher currents, the meter's small resistance has less effect.
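A small model of that effect, treating the LED drop as constant and using made-up burden resistances for the meter (the real value depends on the meter and the range selected):

v_supply, v_led = 7.95, 3.03   # Post 7 values; LED drop treated as constant
r_series = 1003                # series resistor, ohms
for r_meter in (0, 10, 100):   # hypothetical meter (burden) resistances, ohms
    i_ma = (v_supply - v_led) / (r_series + r_meter) * 1000
    print(f"meter adds {r_meter} ohm: {i_ma:.2f} mA")
# 4.91 mA, 4.86 mA, 4.46 mA: more burden resistance, less measured current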
 