I have bought some LEDs with the following specs:
- Color: Red
- Wavelength: 620~625 nm
- Forward voltage: 1.9~2.1 V
- Reverse current: ≤ 10 µA
- Brightness: 800~1000 mcd
- Viewing angle: 30 degrees
Without really thinking much about it, I used them in a project with a 5 V supply and a 220 ohm resistor for each. Assuming a ~2 V forward drop, that gives I = (5 − 2)/220 ≈ 13.6 mA per LED. All was fine...
...then I started a new project and randomly used a 470 ohm resistor - and the brightness was actually fine! So my question is: how can I know which current value is appropriate for an LED?
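For reference, here is a quick sketch of the Ohm's-law arithmetic behind the two resistor choices above, assuming a forward voltage of 2.0 V (the middle of the 1.9~2.1 V range on the datasheet; the actual drop varies per LED and with current):

```python
def led_current_ma(v_supply, v_forward, r_ohms):
    """Current through a series LED + resistor circuit, in mA.

    The resistor sees (v_supply - v_forward); apply Ohm's law to it.
    Assumes v_forward is roughly constant, which is a fair first-order
    approximation for a red LED near its rated current.
    """
    return (v_supply - v_forward) * 1000 / r_ohms

# The two cases from the question:
print(round(led_current_ma(5.0, 2.0, 220), 1))  # ≈ 13.6 mA
print(round(led_current_ma(5.0, 2.0, 470), 1))  # ≈ 6.4 mA
```

Both values are comfortably below the 20 mA continuous rating typical of small indicator LEDs, which is why the LED survived either way; brightness just scales (roughly) with current.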

