Hey all,
I'm in a situation where the LED I have wants to drop more voltage than my supply can provide.
When calculating the current-limiting resistor for an LED, you make an assumption about the voltage drop across the LED. Am I correct in thinking that if I assume a smaller drop and size my resistor for a lower current (picking a point on the forward-current vs. forward-voltage curve where the drop is still high enough for the LED to light), the LED will just be dimmer?
So, in tangible terms: I picked up a few 3.3 V LEDs (http://www.avagotech.com/assets/downloadDocument.do?id=4152), but my system is only running at 3.3 V anyway. If I assume a drop of 3.0 V and pick a resistor to give the LED 8 mA (as per Figure 2 [page 5/8]), am I going to end up with half the intensity of running it at 3.3 V at 20 mA (as per Figure 3)?
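For reference, here's a quick sketch (Python) of the resistor arithmetic I'm working from, using the assumed 3.0 V drop and 8 mA target above; it's just Ohm's law across the resistor, nothing fancy:

    # Current-limiting resistor: R = (V_supply - V_forward) / I_forward
    # (Ohm's law across the resistor). Values are my assumptions from above.
    v_supply = 3.3     # supply rail, volts
    v_forward = 3.0    # assumed LED forward drop, volts
    i_forward = 0.008  # target forward current, amps (8 mA)

    r = (v_supply - v_forward) / i_forward
    print(f"R = {r:.1f} ohms")  # -> R = 37.5 ohms; I'd round to a standard value like 39
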
Thanks a bunch!