How can I make a MOSFET switch more sharply?

Thread Starter

Marus780

Joined Jan 11, 2023
81
I want to make this circuit switch more sharply...
Right now, the MOSFET begins to conduct gradually as the voltage slowly rises on the capacitor.
Can I make the MOSFET gate see the voltage only once it has risen above 6 V?
What I'm trying to do is get a "high" level at the MOSFET drain for 100 ms after the supply voltage is applied, and then a sharp drop to a "low" 0 V.

screenshot.3.png screenshot.4.png
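For reference, the delay you get from an RC on the gate can be estimated from the capacitor's charging curve. A minimal sketch, assuming a 12 V supply, the 10 µF C1 from the schematic, and hypothetical values for the gate threshold and series resistor (neither is given in the thread):

```python
import math

# Assumed values for illustration; the actual schematic values may differ.
vcc = 12.0   # supply voltage, V (assumed)
vth = 4.0    # MOSFET gate threshold voltage, V (hypothetical)
r = 4.7e3    # series resistor, ohms (hypothetical)
c = 10e-6    # C1 from the original schematic, farads

# Capacitor charging: v(t) = vcc * (1 - exp(-t / (r*c)))
# Solving for the time at which v(t) = vth:
t_delay = -r * c * math.log(1 - vth / vcc)
print(f"delay to reach {vth} V: {t_delay * 1e3:.1f} ms")
```

With these numbers the gate crosses the threshold after roughly 19 ms; the point is that the crossing itself is slow, which is why the drain transition is gradual.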
 

MrChips

Joined Oct 2, 2009
30,714
You cannot make this circuit switch fast with C1 connected to the gate.
You need a different circuit to separate the timing function from the fast switching requirement.
 

Papabravo

Joined Feb 24, 2006
21,159
Yes, but first tell me why you have C1 at 10 µF. That is just shooting yourself in the foot if it serves no purpose. Very fast switching of a MOSFET gate requires moving current onto and off of the gate. This is typically done with a push-pull driver like this:

1674080010980.png

A small-value resistor R4 is used to damp the current spike on the rising and falling edges of the gate input signal. Notice that the gate will draw a fairly large current (≈ 300 mA) for a very short period of time to make this happen.
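As a rough check of that current figure: the peak gate current is set by the drive voltage and the damping resistor, and the switching time by the MOSFET's total gate charge. A sketch with assumed numbers (the R4 value and gate charge are illustrative, not taken from the schematic):

```python
# Assumed values for illustration only.
v_drive = 12.0   # driver output swing, V (assumed)
r4 = 39.0        # damping resistor, ohms (hypothetical value)
qg = 20e-9       # total gate charge, coulombs (typical small-MOSFET datasheet figure)

i_peak = v_drive / r4     # peak current into the gate
t_switch = qg / i_peak    # rough time to deliver the full gate charge
print(f"peak gate current: {i_peak * 1e3:.0f} mA")
print(f"approx. switching time: {t_switch * 1e9:.0f} ns")
```

With these numbers the peak is about 308 mA, consistent with the ≈ 300 mA mentioned above, and the gate charge is delivered in well under a microsecond.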
 

Thread Starter

Marus780

Joined Jan 11, 2023
81
Papabravo, because I need a start delay... It can be anything that makes the MOSFET switch after 100 ms...

Ian0, that circuit inverts my signal...
 

Ian0

Joined Aug 7, 2020
9,672
Papabravo, because I need a start delay... It can be anything that makes the MOSFET switch after 100 ms...

Ian0, that circuit inverts my signal...
You can drive another MOSFET from the drain of M2.
(You could even use a dual transistor such as BCM547B instead of the two MOSFETs)
 

Papabravo

Joined Feb 24, 2006
21,159
As @MrChips has pointed out, you need to separate the two functions. There is literally no reason why the MOSFET has to be associated with the timing function. Use a CMOS 555 in monostable mode followed by the MOSFET.
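A 555 in monostable mode gives a pulse width of t = 1.1·R·C, independent of the MOSFET. A quick sketch picking standard values for 100 ms (these component values are my suggestion, not from any schematic in the thread):

```python
# 555 monostable pulse width: t = 1.1 * R * C
r = 91e3   # timing resistor, ohms (chosen standard value)
c = 1e-6   # timing capacitor, farads (chosen standard value)

t_pulse = 1.1 * r * c
print(f"pulse width: {t_pulse * 1e3:.0f} ms")  # → pulse width: 100 ms
```

The monostable output then drives the MOSFET gate hard high for 100 ms and hard low afterwards, which is exactly the sharp edge the RC-on-the-gate circuit cannot give.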
 

dcbingaman

Joined Jun 30, 2021
1,065
To reduce the parts count to practically zero, just use a fixed-period (100 ms) power-supervisor chip to drive the FET. In the schematic below, replace the microcontroller with your FET and select a fixed-period 100 ms supervisor chip whose output can push-pull drive the FET's input. Or you can drop the FET altogether and use the supervisor chip's output to drive whatever it is you are trying to drive. This example part is 6 V max; you would need to select one rated for 12 V, but it shows the basic idea:

1674094249204.png
 

crutschow

Joined Mar 14, 2008
34,285
Ian0, that circuit inverts my signal...
Here's Ian0's Schmitt-trigger circuit with the timing R and C reversed to give the desired output polarity:

Note that, due to the trigger point now being closer to the end of the RC time-constant curve, a smaller capacitor can be used for a given delay.

1674096040183.png
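The point about the trigger position on the RC curve can be quantified: reaching a threshold near the supply rail takes several time constants, while a low threshold is reached in a fraction of one, so the same delay needs a proportionally smaller capacitor. A sketch with assumed trip voltages (the actual trip points of the circuit are not stated):

```python
import math

vcc = 12.0  # supply voltage, V (assumed)
rc = 1.0    # normalized time constant

def t_to_reach(v_trig):
    """Time (in units of RC) for the charging curve to reach v_trig."""
    return -rc * math.log(1 - v_trig / vcc)

t_low = t_to_reach(4.0)    # trip early on the curve (hypothetical)
t_high = t_to_reach(10.0)  # trip near the end of the curve (hypothetical)
print(f"delay ratio: {t_high / t_low:.1f}x")
```

With these trip points the high-threshold delay is about 4.4× longer for the same RC, so for a fixed 100 ms delay the capacitor can be about 4.4× smaller.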
 

Thread Starter

Marus780

Joined Jan 11, 2023
81
This last variant from @crutschow is the only one I can apply in my particular situation.
But I found that it works without that source resistor (R4). Is it OK to build the circuit without it? What role does it play?
Thank you all!
 

crutschow

Joined Mar 14, 2008
34,285
I found that it works without that source resistor (R4). Is it OK to build the circuit without it? What role does it play?
It provides hysteresis (a small amount of positive feedback) around the switch point, which is what makes it a Schmitt trigger, and that speeds up the output switching transition.

The simulation (below) shows a fall time of about 7 µs with the resistor (S1 open, green trace) and 100 µs without it (S1 closed, yellow trace).
If the transition without the resistor is fast enough for your purposes, then you don't need it.

1674162559707.png
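The mechanism behind those two numbers can be sketched roughly: without hysteresis, the output ramps for as long as the slowly moving RC node takes to traverse the MOSFET's linear region, whereas the positive feedback snaps the input through that region. A back-of-envelope estimate with illustrative numbers (none of these come from the schematic):

```python
# Illustrative estimate only; slew rate and linear-region span are assumed.
slew = 2000.0      # RC node slew rate at the trip point, V/s (assumed)
linear_span = 0.2  # gate-voltage span over which the MOSFET transitions, V (assumed)

t_no_hyst = linear_span / slew  # output transition time without feedback
print(f"transition without hysteresis: {t_no_hyst * 1e6:.0f} us")
```

With these assumptions the no-feedback transition works out to 100 µs, the same order as the simulated yellow trace; with feedback the input is driven through the linear region much faster than the RC slew alone allows.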
 