(Context: Class D amplifier)
In most example circuits, I see dead time implemented by adding a resistor/diode combination to the gate drive (after the gate driver), such that the "on" is delayed, while "off" happens quickly.
This works, but slowing the turn-on means the MOSFET spends extra time only partially conducting (in its linear region while the gate voltage rises), which should lead to substantially more power dissipation.
Adding delay to the logic (before the gate driver) is simple. Why isn't it always done this way? Why not put the R/D combination on the logic side to get the same dead time without all that switching loss?
There must be a reason, as almost all of the reference circuits I see show the delay added to the gate drive signal.
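To put a rough number on the concern, here is a minimal sketch (all component values are illustrative assumptions, not from any reference circuit) estimating the V·I overlap energy at turn-on when the gate edge is slowed by a dead-time resistor versus driven hard. A logic-side delay would leave the edge rate, and hence this loss, unchanged.

```python
# Hedged sketch: compare turn-on overlap loss for a fast gate edge vs an
# edge slowed by an R/D dead-time network. Values below are assumptions.

import math

V_bus = 48.0   # supply voltage (V), assumed
I_load = 5.0   # load current at the switching instant (A), assumed
V_drv = 12.0   # gate driver swing (V), assumed
V_th = 3.0     # MOSFET gate threshold (V), assumed
V_pl = 5.0     # Miller plateau voltage (V), assumed
C_iss = 2e-9   # effective input capacitance (F), assumed

def rc_time_to(v_target, R):
    """Time for an RC-charged gate to rise from 0 V to v_target."""
    return -R * C_iss * math.log(1.0 - v_target / V_drv)

def turn_on_loss(R):
    """Crude overlap-energy estimate: drain current ramps while Vgs
    crosses from threshold to the Miller plateau, so take
    E ~ 0.5 * V_bus * I_load * (time spent in that region)."""
    t_overlap = rc_time_to(V_pl, R) - rc_time_to(V_th, R)
    return 0.5 * V_bus * I_load * t_overlap

fast = turn_on_loss(10.0)     # direct gate drive, ~10 ohm
slow = turn_on_loss(1000.0)   # dead-time resistor, ~1 kohm
print(f"fast edge: {fast*1e6:.2f} uJ, slowed edge: {slow*1e6:.2f} uJ")
```

Since the overlap time scales linearly with the gate resistance in this model, a 100x larger resistor gives roughly 100x the per-edge switching loss, which is exactly the dissipation the question is asking about.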
MOD NOTE: Links removed.