Something I never understood (I don't have an electrical engineering degree): resistors have a voltage drop. One resistor from 5 V VCC to ground has a 5 V drop across it, because one end sits at 5 V and the other end sits at ground, 0 V. Two resistors in series have two separate voltage drops that add up to 5 V.
If I instead put a resistor in series with a MOSFET's gate, why doesn't the resistor have a voltage drop? Is the gate such a high resistance that it doesn't matter? The resistor would obviously slow the current charging the gate, but why is there no drop across it? Or does the gate slowly climb to 5 V as the gate capacitance charges, until the current tapers off to essentially nothing and the gate acts like a very high value resistance?
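That last guess can be sketched numerically. The gate behaves like a small capacitor, so the resistor-plus-gate forms an RC circuit: the full 5 V drop appears across the resistor at the instant of switch-on, then decays toward zero as the gate charges. The R and C values below are illustrative assumptions, not anything from the question itself.

```python
import math

V_SUPPLY = 5.0   # volts applied through the gate resistor
R = 1_000.0      # ohms, assumed series gate resistor
C = 2e-9         # farads, assumed ~2 nF effective gate capacitance
TAU = R * C      # RC time constant (2 microseconds with these values)

def gate_voltage(t: float) -> float:
    """Gate voltage t seconds after 5 V is applied: standard RC charging curve."""
    return V_SUPPLY * (1.0 - math.exp(-t / TAU))

def resistor_drop(t: float) -> float:
    """Drop across the resistor is whatever the gate hasn't reached yet."""
    return V_SUPPLY - gate_voltage(t)

# At t = 0 the entire 5 V drops across the resistor; after a few time
# constants the gate is nearly at 5 V and the drop is nearly zero.
for n in [0, 1, 3, 5, 10]:
    t = n * TAU
    print(f"t = {n:2d} tau: gate = {gate_voltage(t):.3f} V, "
          f"drop across R = {resistor_drop(t):.3f} V")
```

So the answer to the puzzle is "both": there *is* a voltage drop across the resistor, but only while gate current flows. Once the gate capacitance is charged, essentially no DC current flows into the gate, so by Ohm's law the steady-state drop across the resistor is effectively zero and the gate sits at 5 V.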