I bought a bag of 100 white LEDs from the Far East. They didn't cost very much, but they came with no documentation except a scrap of paper inside the bag marked "2027" or "2D27."
I checked one for voltage drop and found it to be slightly over 3 V. I tested it three times using a 9 V battery: with 1003 Ω, 8.07 V in and 3.06 VF (3 mA); with 469 Ω, 8.02 V in and 3.15 VF (7 mA); and with 356 Ω, 7.97 V in and 3.20 VF (9 mA). The documents I had read led me to believe the voltage drop would be 3.5 V.
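As a sanity check, the current each resistor should pass can be worked out from Ohm's law, I = (Vin − VF) / R. A quick sketch using the readings above (note that the computed values come out somewhat higher than the currents I noted, which could point to meter burden, contact resistance, or battery sag; that explanation is just a guess):

```python
# Compute the current implied by each reading: I = (Vin - Vf) / R.
readings = [
    # (resistor in ohms, supply volts, LED forward volts)
    (1003, 8.07, 3.06),
    (469, 8.02, 3.15),
    (356, 7.97, 3.20),
]

for r, vin, vf in readings:
    i_ma = (vin - vf) / r * 1000  # current through the LED, in mA
    print(f"{r:5d} ohm: {i_ma:4.1f} mA")
```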
I then tried to swap quickly between the three resistors to see any visible difference in brightness, but by the time I got one out and another in, I had lost my visual reference. So I substituted a 1k linear taper pot, which proved a lot more educational. The change in the brightness of the LED was obvious as I adjusted the pot. At 1k it was dim, but it brightened in what seemed a roughly linear fashion as I decreased the resistance. (I thought about using a photocell to measure the brightness, but didn't have one on hand.)
From 1k down to 16Ω the brightness increased, but at about 16Ω the output from the LED dimmed and turned blue. I increased the resistance and the blue changed back to white, but the white wasn't as bright as before. Each time I did this, the LED seemed to decrease in output until it burned out completely.
I had thought that I might reach a point at which further increases in current would not produce increased brightness. Then I would have checked the resistance, calculated the current, and thereby determined the optimum resistance for maximum brightness. That turned out to be wrong.
So I still don't know what current to run through these LEDs for the best balance of output and life. I suppose I will revert to the 20 mA rule of thumb...or does someone have a better suggestion?
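For what it's worth, sizing a resistor for the 20 mA rule of thumb is a one-line calculation. A minimal sketch, assuming a 9 V supply and the roughly 3.1 V forward drop measured above (the exact drop will vary from LED to LED):

```python
# Size a current-limiting resistor for a target LED current:
# R = (V_supply - V_forward) / I_target
v_supply = 9.0    # volts, fresh 9 V battery (assumed)
v_forward = 3.1   # volts, approximate forward drop measured earlier
i_target = 0.020  # amps, the 20 mA rule of thumb

r_exact = (v_supply - v_forward) / i_target
print(f"Exact resistance: {r_exact:.0f} ohm")  # rounds up to the next standard value
```

This comes out near 295 Ω, so the next standard value up (330 Ω) would keep the current safely under 20 mA.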
By the way, I know this is "old news" to most of you, but I found it to be a good learning experience and wanted to share my findings.