Hi, I'm currently modeling a 5 V to 12 V DC boost converter, and on a whim I decided to put a transistor between the diode and the output resistor. This transistor is always driven on, so it shouldn't do anything at all; it should just act as a closed switch. What actually happens is that the transistor ends up eating a lot of the voltage, and I don't understand why.
The output voltage across the resistor should be about 12 V, but it ends up being about 2.7 V, and the output current goes from 14 mA to -3 mA. I guess that's where the problem lies: the transistor is doing something to the current. Could it be that the maximum current this transistor can withstand is less than the current I'm feeding the resistor?
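For reference, here is a quick sanity check on the numbers above (assuming the 14 mA figure came from a purely resistive load, which the post doesn't state explicitly):

```python
# Back-of-the-envelope check using the numbers from the question.
# Assumption: 14 mA at 12 V corresponds to a simple resistive load.
V_expected = 12.0    # V, expected output voltage
I_expected = 14e-3   # A, output current without the series transistor
R_load = V_expected / I_expected
print(f"Implied load resistance: {R_load:.0f} ohms")  # ~857 ohms

# With the transistor in series the output collapses to 2.7 V,
# so the transistor must be dropping the difference:
V_measured = 2.7     # V, measured output voltage
V_drop = V_expected - V_measured
print(f"Voltage dropped across the transistor: {V_drop:.1f} V")  # 9.3 V
```

A drop of roughly 9.3 V is far more than any saturated switch would show, which suggests the transistor is not actually being driven into full conduction rather than hitting a current limit.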
Thank you.