If I want to drive a MOSFET with an MCU, should I use resistors? I'm going to be using it with a heat gun, so the real use case is high-frequency, high-power PWM. I know that a MOSFET's gate can draw current spikes much higher than an MCU pin is rated for, but that they only last for micro- or nanoseconds while the gate capacitance charges. Should I use something like a 10k pulldown on the gate and maybe a 100 ohm series resistor to the MCU pin? Or should I use a transistor as a driver stage? Or something else altogether?
PS: Also, yes, I know I can get PWM controllers fairly cheap, but I'm trying to do this with as little out of pocket as possible right now, and I have resistors, MOSFETs, and some MCUs lying around.