I am trying to create a circuit that will limit the current to series LEDs connected to an AC supply. I have come up with something that simulates the way I want in CircuitLab, but I don't have enough experience to know whether it will work in real life.

The goal is to drive two sets of series LEDs, with each set oriented in the opposite direction (this is easier to understand from the attached schematic). That would not be difficult with series resistors and diodes, but I also want to be able to change the number and type of LEDs in each direction without changing the circuit. In the attached example there are four LEDs in one direction and eight in the other. As drawn, it simulates as desired, with about 17 mA flowing in each direction from a 24 VAC supply. The worst case would be three times as many LEDs in one direction as in the other.
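To put rough numbers on this (taking ~2 V per LED as a guess, since the actual drop depends on the LED type), here is the arithmetic I did for the 24 VAC case:

```python
# Back-of-the-envelope check: how much voltage the current limiter has to
# drop in each half-cycle with unequal LED strings.
# Assumptions: ~2.0 V forward drop per LED (a guess), 24 VAC RMS supply,
# 17 mA target current (from my simulation).

import math

V_PEAK = 24.0 * math.sqrt(2)   # ~33.9 V at the AC peak
V_LED = 2.0                    # assumed forward drop per LED
I_SET = 0.017                  # target current, amps

for n_leds in (4, 8):
    v_string = n_leds * V_LED
    v_limiter = V_PEAK - v_string     # worst case, at the AC peak
    p_limiter = v_limiter * I_SET     # instantaneous dissipation at the peak
    print(f"{n_leds} LEDs: string ~{v_string:.0f} V, "
          f"limiter drops up to ~{v_limiter:.1f} V, "
          f"~{p_limiter * 1000:.0f} mW peak in the limiter")
```

So even at 17 mA, the limiter has to burn off most of the supply when the string is short, which is part of why I'm unsure about the transistor choice.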
My question is basically: will this actually work? The two transistors I chose were just the ones that CircuitLab had available. My guess is that these are not ideal, but I'm not sure what characteristics I need to look for. The current is quite low (<20 mA), but I would actually like to scale this up to 120 VAC, with a maximum of about 36 and a minimum of about 12 LEDs in each direction. Any thoughts at all would be greatly appreciated.
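For the 120 VAC version, the same back-of-the-envelope estimate (same assumed ~2 V per LED and 17 mA) is what worries me:

```python
# Same estimate scaled to 120 VAC, with my stated min/max LED counts.
# Assumptions as above: ~2 V per LED (a guess), 17 mA target current.

import math

V_PEAK = 120.0 * math.sqrt(2)   # ~170 V at the AC peak
V_LED = 2.0
I_SET = 0.017

for n_leds in (12, 36):
    v_limiter = V_PEAK - n_leds * V_LED
    print(f"{n_leds} LEDs: limiter sees up to ~{v_limiter:.0f} V, "
          f"~{v_limiter * I_SET:.2f} W peak dissipation")
```

If I'm figuring this right, the transistor would need to stand off something like 170 V and dissipate a couple of watts in the worst case with only 12 LEDs in a string, but I don't know what other characteristics matter.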
Attachment: schematic (28.8 KB)