MOSFET gate resistor voltage drop

Thread Starter

adam450

Joined Mar 19, 2019
34
Something I never understood (I don't have an electrical degree): resistors have a voltage drop. A single resistor from 5 V VCC to ground drops the full 5 V, because one end is held at 5 V and the other end is at 0 V. Two resistors in series have two separate voltage drops.

If I put a resistor in series with a MOSFET gate, why doesn't the resistor have a voltage drop? Is the gate such a high resistance that it doesn't matter? The resistor obviously slows the current that charges the gate, but why isn't there a voltage drop? Or does the gate slowly come up to 5 V as the gate capacitance charges, until the current stops entirely and the gate looks like a very high resistance?
 

Papabravo

Joined Feb 24, 2006
21,159
Because the MOSFET gate "looks like" a series RC circuit, you know from the solution to that first-order differential equation that the current starts at a high value and, as the capacitor charges, the voltage drop across the resistor decays exponentially from the full gate-drive voltage to zero. The thing that is hard to wrap your head around is that nothing is constant during the transient, unlike in a steady-state DC circuit.
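A minimal numerical sketch of that first-order response (plain Python; the R and C values are illustrative assumptions, not from the thread):

```python
import math

V_SUPPLY = 5.0   # gate drive voltage (V)
R = 100.0        # series gate resistor (ohms); illustrative value
C = 1e-9         # effective gate capacitance (F); illustrative value
tau = R * C      # RC time constant: 100 ns with these values

for n in range(6):
    t = n * tau
    v_gate = V_SUPPLY * (1 - math.exp(-t / tau))  # capacitor (gate) voltage
    v_res = V_SUPPLY - v_gate                     # drop across the resistor
    i = v_res / R                                 # instantaneous charging current
    print(f"t = {n} tau: Vgate = {v_gate:.3f} V, "
          f"Vres = {v_res:.3f} V, I = {i * 1000:.2f} mA")
```

At t = 0 the resistor drops all 5 V and passes 50 mA; a few time constants later both the drop and the current have decayed to essentially zero, which is why a multimeter reads no drop once the gate is charged.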

Check out this simulation: Green trace is the current in the resistor. Blue trace is the voltage drop across the resistor.

[simulation screenshot: resistor current and resistor voltage vs. time]
 

dl324

Joined Mar 30, 2015
16,846
The resistor obviously slows the current that charges the gate, but why isn't there a voltage drop?
There is a voltage drop across the resistor while the gate capacitance is charging; it's just too brief to see without an oscilloscope.
Or does the gate slowly come up to 5 V as the gate capacitance charges, until the current stops entirely and the gate looks like a very high resistance?
If 5 RC time constants counts as slow, yes; by then the gate is essentially fully charged.
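A quick check on what "slow" means here, a sketch only: the fraction of the final voltage still missing after n time constants is e^(-n), so after 5 tau the gate is within about 0.7% of 5 V. With the illustrative R = 100 ohm and C = 1 nF above, 5 tau is only 500 ns.

```python
import math

# Fraction of the final voltage still missing after n time constants is e^(-n).
for n in range(1, 6):
    remaining = math.exp(-n)
    print(f"after {n} tau: gate at {100 * (1 - remaining):.2f}% of final voltage")
```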
 