I have a working circuit that is a 0-99 counter driven by a push button, an Arduino, and a couple of 4026 decade counters. This works just fine. Now I'm trying to make a custom 7-segment display from an LED strip. The strip is one of those you can cut every so often; I've cut mine so that there are 6 LEDs per segment. It looks like there are also two resistors on the strip, labeled 151. See attached picture.
So right now, I'm trying to disconnect one of the 7-segment outputs from the decade counter and run it through a transistor in order to turn on the LED strip. The LED strip runs on 12 V, and when I connected it directly to a 12 V battery I measured the current at 36 mA. I've attached a sketch of the circuit. The LEDs and R1s shown are my best guess at how the LED strip is wired internally. The Arduino side runs on about 5 V, and the LEDs are connected to a separate 12 V battery.
If I need 36 mA to flow from collector to emitter, and from what I've read you can assume a gain of 100-300 for this transistor, then the required base current would be 36 mA / 100 = 0.36 mA. I measured the output signal from the decade counter at 4.83 V. To calculate R2 I used [4.83 V - 0.7 V] / 0.72 mA = 5.7 kΩ. I used 0.72 mA instead of 0.36 mA because I read somewhere that you should increase the base current by a factor of 2 to ensure the transistor is fully saturated when used as a switch.
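For reference, here is my base-resistor calculation as a quick sketch (it assumes an NPN BJT used as a low-side switch, a conservative gain of 100, and a typical 0.7 V base-emitter drop; the 36 mA and 4.83 V figures are my measurements):

```python
# Base-resistor calculation for an NPN switching transistor.
# Assumptions: hFE ~= 100 (conservative), Vbe ~= 0.7 V (typical silicon BJT).
I_C = 0.036        # collector current the LED strip draws, A (measured)
H_FE = 100         # assumed minimum current gain
V_OUT = 4.83       # decade counter output voltage, V (measured)
V_BE = 0.7         # base-emitter drop, V
OVERDRIVE = 2      # factor of 2 to push the transistor into saturation

i_b = (I_C / H_FE) * OVERDRIVE   # required base current, A
r2 = (V_OUT - V_BE) / i_b        # base resistor value, ohms

print(f"I_B = {i_b * 1000:.2f} mA")   # -> I_B = 0.72 mA
print(f"R2  = {r2 / 1000:.2f} kOhm")  # -> R2  = 5.74 kOhm
```

Rounding down to the nearest standard value gives the 5.6 kΩ resistor I tried first.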
When I connect everything, the LED strip does not come on. I've tried a few different values for R2: 5.6 kΩ, 2.2 kΩ, and it's currently connected with 1 kΩ. Nothing has worked.
Any insight on what I'm doing wrong? I checked the transistor with a multimeter on the diode setting: with the negative lead on the emitter and the positive lead on the base I read 0.654 V, and with the negative lead on the collector and the positive lead on the base I read 0.652 V. Between collector and emitter there is no reading.