Is there any advantage to pulsing IR LEDs at higher current vs. continuous? Will they appear "brighter" to the camera? For example, my LEDs (Vishay TSHG6410) list maximum ratings of:
Forward current: 100 mA max
Peak forward current (tp/T = 0.5, tp = 100 µs): 200 mA (50% duty)
Surge forward current (tp = 100 µs): 1 A (single pulse?)
and radiant intensity is:
IF = 100 mA, tp = 20 ms: Ie = 90 mW/sr
IF = 1 A, tp = 100 µs: Ie = 900 mW/sr
If I pulse at 200 mA with a 50% duty cycle, would this appear any brighter to the camera than a continuous 100 mA?
How about 1 A at a 0.1% duty cycle (100 µs on at 10 Hz), or does that exceed the maximum ratings?
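For reference, here is my back-of-envelope check of the average current and time-averaged intensity for each drive scheme, assuming radiant intensity scales roughly linearly with forward current (the two datasheet points above suggest this: 10x the current gives 10x the intensity). The `scheme` helper and the linearity assumption are mine, not from the datasheet:

```python
# Rough comparison of drive schemes: average forward current and
# time-averaged radiant intensity, assuming Ie is linear in IF
# (consistent with 90 mW/sr at 100 mA and 900 mW/sr at 1 A).

IE_PER_AMP = 0.9  # W/sr per amp, from 900 mW/sr at 1 A

def scheme(i_peak_a, duty):
    """Return (avg current A, peak Ie W/sr, time-averaged Ie W/sr)."""
    i_avg = i_peak_a * duty
    ie_peak = IE_PER_AMP * i_peak_a
    ie_avg = ie_peak * duty
    return i_avg, ie_peak, ie_avg

schemes = {
    "continuous 100 mA":   scheme(0.1, 1.0),
    "200 mA @ 50% duty":   scheme(0.2, 0.5),
    "1 A, 100 us @ 10 Hz": scheme(1.0, 100e-6 * 10),  # 0.1% duty
}

for name, (i_avg, ie_peak, ie_avg) in schemes.items():
    print(f"{name}: Iavg = {i_avg*1e3:.1f} mA, "
          f"peak Ie = {ie_peak*1e3:.0f} mW/sr, "
          f"avg Ie = {ie_avg*1e3:.2f} mW/sr")
```

If that linearity holds, the 200 mA / 50% scheme averages out to roughly the same radiant intensity as continuous 100 mA, which is partly what prompts the question: does the camera respond to the average, or can exposure timing make the pulsed case look brighter?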