I've noticed in my work with LEDs that you seem to get diminishing returns in brightness as current/power rises. I can put in less current than I can easily measure and get a slight glow, and if I increase the current even slightly, the brightness climbs very quickly; but by the time the LED is almost fully lit, I can add 5 more mA and see only a marginal change, even though I'm adding 33% more current (and probably closer to 40% more power, since the forward voltage rises as well).

Do our eyes perceive light logarithmically, the way we hear (a sound must roughly double in amplitude to produce a fixed step in perceived loudness), or does the LED itself really give diminishing returns? Perhaps it's partly both. I'd like to see a graph of an LED's efficiency versus current/power if possible, and/or total light output versus current/power. This could be relevant if you're using LEDs as signals and power usage must be kept as low as possible.

Another interesting question: which colors are the most efficient? I'd guess red, because it seems to draw about the same current to reach full brightness, but at a lower forward voltage.

Lastly, are super-bright LEDs a TON more efficient, or is it just me? I haven't done any measurements, but it seems that ultra-brights put out far more light at the same power.
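To put rough numbers on my reasoning, here's a quick Python sketch of the two effects I suspect are involved. The forward-voltage and series-resistance figures are just example values I'd assume for a red LED, not measurements, and the ~0.33 brightness exponent (Stevens' power law) is only an approximation of how the eye compresses light levels:

```python
# Rough sketch of two effects, using assumed example numbers:
#  1. electrical power rises faster than current, because Vf climbs with current;
#  2. perceived brightness compresses roughly as a power law of light output
#     (Stevens' law; an exponent of ~0.33 for brightness is an approximation).

VF0 = 1.9   # V   -- assumed red-LED knee voltage (example value, not measured)
RS  = 5.0   # ohm -- assumed effective series resistance (example value)

def power_w(i_amps: float) -> float:
    """Electrical power for a simple linearized diode model: V = VF0 + I*RS."""
    return (VF0 + i_amps * RS) * i_amps

def perceived(relative_flux: float) -> float:
    """Relative perceived brightness under Stevens' power law, exponent ~0.33."""
    return relative_flux ** 0.33

i1, i2 = 0.015, 0.020  # the 15 mA -> 20 mA step from the question

print(f"current:   +{(i2 / i1 - 1) * 100:.0f}%")                      # ~33%
print(f"power:     +{(power_w(i2) / power_w(i1) - 1) * 100:.0f}%")    # ~35%, since Vf rose too

# Even if light output tracked current linearly, perception would still compress:
print(f"perceived: +{(perceived(i2 / i1) - 1) * 100:.0f}%")           # only ~10%
```

So even before any drop in the LED's own efficiency, a 33% current increase would only look like a ~10% brightness increase under this perception model, which would explain most of what I'm seeing.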