Hi all,
When I design an optocoupler circuit, I usually just pick a series resistor to limit the current through the opto LED. If I want to add an external indicator such as a red LED, I put it in series with the opto LED and size the resistor for the whole chain.

My question comes from the picture shown below, which is a circuit I reverse-engineered from a PCB. The circuit is supposed to work with either a 5 V or a 12 V pulse, but this design looks pretty bad to me, unless I'm mistaken. What would the overall voltage drop across the two LEDs be? The two LEDs are different in nature: the red LED drops about 1.8 V and the opto LED about 1.2 V. The PS2501 opto has a maximum input LED current of 80 mA.

If I were the designer, I would put the red LED in series with R1 and the opto LED and design for a current of 20 mA, which is safe for both LEDs if the pulse is 5 V. Of course, if the pulse were 12 V, both LEDs would then see a much higher current than the 5 V design intends.

To sum up: what exactly happens when two LEDs of different nature are connected in parallel?
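For reference, here is the series-design arithmetic I have in mind, as a rough sketch. The forward drops are my estimates from above (red LED ~1.8 V, PS2501 input LED ~1.2 V), not datasheet values:

```python
# Rough series-design arithmetic for the LED-in-series approach described
# above. Forward drops are the estimates from the question, not datasheet
# values: red LED ~1.8 V, PS2501 input LED ~1.2 V.

def series_resistor(v_pulse, i_led, v_drops):
    """Resistor (ohms) that sets i_led through LEDs totalling sum(v_drops)."""
    return (v_pulse - sum(v_drops)) / i_led

def led_current(v_pulse, r, v_drops):
    """Current (amps) that actually flows with a given resistor r."""
    return (v_pulse - sum(v_drops)) / r

drops = [1.8, 1.2]                    # red LED + opto LED in series
r = series_resistor(5.0, 0.020, drops)
print(r)                              # 100 ohms gives 20 mA at 5 V

# Same 100-ohm resistor with a 12 V pulse instead:
print(led_current(12.0, r, drops))    # 0.09 A = 90 mA, above the 80 mA max
```

This is why I say a 12 V pulse into the 5 V design would overdrive both LEDs: the same 100 Ω resistor lets roughly 90 mA flow, past the opto's 80 mA limit.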