Delay of CMOS inverter using LTspice

Thread Starter


Joined Apr 21, 2012
Hello everyone. I am trying to measure the propagation delay of a CMOS inverter, so I built the inverter in LTspice. VDD is 3 V and the input is a square wave, but when I try to measure the delay, I don't get the result I am expecting. I hope somebody can share some knowledge with me, because I am not familiar with LTspice. Thank you!

Here are the plots. The first one is the square-wave input:

And this picture is measured at the output node, the shared drains of the two transistors:

I was expecting some plot like this:


Joined Mar 31, 2012
I agree with Bill.

I've only looked at your schematic and these are my initial impressions:

I don't know LTspice, so I am making some guesses here. But it looks like you are working with half-micron transistors, which are going to have a lot of resistance (maybe measured in kilohms), and a 1uF cap, so your time constants are going to be on the order of milliseconds. Yet your input signal, if I am reading the description correctly, has a period of only 1ns. So I would expect your output voltage to slowly move toward whichever rail is connected to the transistor with the lower effective series resistance.
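To see how badly the time scales are mismatched, here is a quick sanity check. The on-resistance and load values are assumptions for illustration (the original schematic values aren't shown, other than the 1uF cap and 1ns period):

```python
# Illustrating the time-constant mismatch described above.
# R_on is an assumed effective FET on-resistance, not a measured value.
R_on = 10e3      # assumed on-resistance, ~kilohm range for small FETs
C_load = 1e-6    # the 1 uF load capacitor from the schematic
T_in = 1e-9      # the 1 ns input period

tau = R_on * C_load          # RC time constant of the output node
cycles_per_tau = tau / T_in  # input cycles elapsed in one time constant

print(f"tau = {tau * 1e3:.0f} ms")                      # 10 ms
print(f"input cycles per time constant = {cycles_per_tau:.1e}")  # 1.0e+07
```

With roughly ten million input cycles per time constant, the output can only creep slowly toward one rail, which matches the behavior described below.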

I'm assuming that your input signal source starts out at 0V and goes to 3V after a 50ps initial delay. If so, then the simulator is going to solve for the initial conditions with the inverter input LO, meaning that the initial voltage on the capacitor will be Vdd, or 3V.

Yep, sure enough. Your second plot shows about what I would expect if the NFET is slightly stronger than the PFET.

Why are you using a 1GHz square wave with a 1uF load? To fully charge a 1uF capacitor to 3V in 0.5ns requires an average current of (1uF)(3V)/(0.5ns) = 6000A. The peak current, of course, will be much higher. It also requires a correspondingly minuscule effective resistance from the FETs.
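The 6000A figure follows directly from I = C*dV/dt, which can be checked in a couple of lines:

```python
# Verifying the average-current figure quoted above: I = C * dV / dt.
C = 1e-6     # 1 uF load capacitor
dV = 3.0     # full 0 V -> 3 V swing
dt = 0.5e-9  # half of the 1 ns period

I_avg = C * dV / dt
print(f"average charging current = {I_avg:.0f} A")  # 6000 A
```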

So take a step back and try to come up with something resembling rational device and simulation parameters.
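As a starting point for "rational parameters", the same I = C*dV/dt and RC reasoning can be run in reverse: pick a load in the range a real gate actually presents, then choose an input period several time constants long. The specific numbers below are assumptions for illustration, not recommendations from the thread:

```python
# Sketch of picking self-consistent simulation parameters (all values
# are assumptions): keep the input period much longer than the output
# RC time constant so the inverter can settle between edges.
R_on = 5e3        # assumed effective FET on-resistance
C_load = 10e-15   # gate-scale load, ~10 fF, instead of 1 uF

tau = R_on * C_load   # output time constant
T_min = 20 * tau      # period with comfortable settling margin

print(f"tau = {tau * 1e12:.0f} ps")             # 50 ps
print(f"suggested period >= {T_min * 1e12:.0f} ps")  # 1000 ps
```

With a load like this, a 1ns-period input becomes reasonable, and the propagation delay should be visible as a fraction of each edge-to-edge interval.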