I'm still struggling to understand basic transistor circuits -- I'm working on it, and I know from experience that one day it will 'click' and everything will fall into place -- hopefully before I grow too old and forget what transistors are. However, I'm not there yet, and yesterday this bit me when I tested a naive transistor switch circuit and it did not work. I think I have figured out why it doesn't work and how I can fix it. I'm posting this here to check whether my understanding is correct, so please critique where necessary.
The circuit is a very basic transistor switch where a 5V logic signal needs to switch a 12V LED strip segment (three RGB LEDs in series). My attempt, the one that does not work, is naive simplicity itself:
Input is a 5V signal, output goes to the anode of the LEDs (they drop around 10V and draw 70mA doing so). I was stumped when I powered this up and nothing happened. After a bit of head scratching, I broke out the multimeter and started measuring some voltages on the transistor:
Vb = 4.2V
Ve = 3.6V
Vc = 12V
Doh! Of course this does not work. The emitter sits one diode drop below the base, which in this case means 3.6V, far from the 12V needed for the LEDs. In fact I'm lucky I did not smoke the transistor, as it was dissipating over half a watt (current to the LEDs is 70mA).
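To sanity-check the diagnosis, here is a quick back-of-the-envelope calculation (the 0.6V base-emitter drop is an assumption that happens to match my measurements):

```python
# Emitter follower (common collector): the emitter can never rise
# above one diode drop below the base, no matter what Vcc is.
V_B = 4.2      # measured base voltage (V)
V_BE = 0.6     # assumed base-emitter drop (V), matching the measurements
V_CC = 12.0    # supply voltage (V)
I_LED = 0.070  # LED string current, if it were actually flowing (A)

V_E = V_B - V_BE                     # 3.6 V -- agrees with the multimeter
P_transistor = (V_CC - V_E) * I_LED  # 0.59 W if 70 mA really flowed

print(f"Ve = {V_E:.1f} V, P = {P_transistor:.2f} W")
```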
This is what I mean by struggling with understanding transistor circuits - anyone with a better (or any) understanding of transistors would have seen immediately that this will not work, without having to actually try it.
Anyway, to get this to work I see two solutions.
- Option 1: take the output from the collector of the transistor and make it a sort of common-emitter amplifier
This has two disadvantages. One, I would have to invert the logic in my software (this is part of a 4x4 matrix controlled by a microcontroller). That is not too much of a problem. The second disadvantage, however, might be more serious. This is where I'm treading on even shakier ground, so please follow along with my calculations and correct me where I'm wrong.
Assuming I still want 11V on Vout when the transistor is off (I tested the LEDs and they still work at that voltage), the resistor would need to be 1V / 0.07A, or about 14 ohm, and will dissipate 70mW. So far so good. Now, when the transistor is on, things get a bit hotter.
When the transistor is on, Vb is 4.2V and Ve is 3.6V (see above). Assuming the worst case where the transistor is fully saturated (and this is the part I understand least - will it even be saturated?), Vc will also be 3.6V (actually a bit more, but we're assuming worst case). Rc will thus drop 12V - 3.6V = 8.4V. Dropping 8.4V across a 14 ohm resistor results in a current of about 600mA and will dissipate nearly 5W. So, in order to use this solution, I would need power resistors - not what I expected.
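Here is the same arithmetic as a sketch, so the numbers are easy to check (the saturated collector voltage of 3.6V is my worst-case assumption from above):

```python
# Option 1 figures, worst case, assuming Vc sits at 3.6 V when saturated
# (my assumption from the measurements above -- corrections welcome).
V_CC = 12.0    # supply (V)
V_OUT = 11.0   # voltage I still want on the LEDs, transistor off (V)
I_LED = 0.070  # LED string current (A)
V_C_SAT = 3.6  # assumed collector voltage when the transistor is on (V)

# Transistor off: Rc must drop 1 V while passing 70 mA.
R_C = (V_CC - V_OUT) / I_LED    # ~14.3 ohm
P_off = (V_CC - V_OUT) * I_LED  # 0.07 W = 70 mW

# Transistor on: Rc drops 12 - 3.6 = 8.4 V.
V_RC = V_CC - V_C_SAT
I_on = V_RC / R_C               # ~0.59 A
P_on = V_RC * I_on              # ~4.9 W -- power resistor territory

print(f"Rc = {R_C:.1f} ohm, off: {P_off*1000:.0f} mW, "
      f"on: {I_on:.2f} A / {P_on:.1f} W")
```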
- Option 2: do what I should have done from the start and use a FET instead of a bipolar junction transistor. At least I'm assuming that a logic-level FET would actually make this work - is that correct?
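If I've understood the usual arrangement correctly, the FET would go on the low side: source to ground, drain to the LED strip's cathode, anode to +12V. A rough feasibility check, with placeholder datasheet values:

```python
# Rough feasibility check for a low-side, logic-level N-channel MOSFET.
# Both FET parameters below are placeholders -- read the real values
# off the datasheet of whatever part is actually used.
V_GS = 5.0     # gate driven straight from the 5 V logic signal
V_GS_TH = 2.0  # placeholder gate threshold voltage (V)
R_DS_ON = 0.1  # placeholder on-resistance at Vgs = 5 V (ohm)
I_LED = 0.070  # LED string current (A)

assert V_GS > V_GS_TH, "gate drive should comfortably exceed the threshold"

V_drop = I_LED * R_DS_ON    # ~7 mV lost across the FET
P_fet = I_LED**2 * R_DS_ON  # ~0.5 mW -- no power resistors needed

print(f"drop = {V_drop*1e3:.1f} mV, dissipation = {P_fet*1e3:.2f} mW")
```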