Hi,
I'm trying to simulate a common-source amplifier example from Sedra and Smith, 5th edition, page 293.
The model parameters used are: VTO = 1.5 V, KP = 0.25 mA/V², LAMBDA = 0.02 V⁻¹.
When I run the DC bias-point simulation, the numbers match my hand calculations. That is:
ID = 1.06 mA
VDS = 4.4 V = VGS
ro = 47k
Av = -3.3
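For reference, here is the sanity check I did on those numbers. The SPICE parameters (VTO, KP, LAMBDA) are the ones above; VDD, RD, and RL are my reading of the book circuit (drain-to-gate feedback bias, VDD = 15 V, RD = RL = 10 k), so treat those as assumptions:

```python
# Check the DC bias point and small-signal gain from the model parameters.
# VDD, RD, RL are assumed values for the book circuit, not from my netlist dump.
import math

kp, Vt, lam = 0.25e-3, 1.5, 0.02      # A/V^2, V, 1/V (from the post)
VDD, RD, RL = 15.0, 10e3, 10e3        # assumed example values

# Drain-to-gate feedback forces VDS = VGS, so
# (VDD - VGS)/RD = 0.5*kp*(VGS - Vt)^2  ->  quadratic in VGS.
A = 0.5 * kp * RD
qa, qb, qc = A, 1 - 2 * A * Vt, A * Vt**2 - VDD
VGS = (-qb + math.sqrt(qb**2 - 4 * qa * qc)) / (2 * qa)

ID = (VDD - VGS) / RD                 # drain current
gm = kp * (VGS - Vt)                  # transconductance
ro = 1 / (lam * ID)                   # output resistance, 1/(lambda*ID)
Rout = 1 / (1/RD + 1/RL + 1/ro)       # RD || RL || ro
Av = -gm * Rout                       # small-signal voltage gain

print(f"VGS = VDS = {VGS:.2f} V")     # ~4.4 V
print(f"ID  = {ID*1e3:.2f} mA")       # ~1.06 mA
print(f"ro  = {ro/1e3:.1f} k")        # ~47 k
print(f"Av  = {Av:.2f}")              # ~-3.3
```

With those assumed resistor values the script reproduces all four numbers above.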
If I do an AC sweep with the input signal amplitude set to 1 VAC, I get Av = -3.50 or so (close to the hand calculation). However, if I increase the amplitude to, say, 10 V or 100 V, the output voltage comes out at over 346 V. That obviously doesn't make sense; the transistor would have left the active region long before that, once vDS < vGS - Vt.
I'm just trying to understand what results the simulation is giving me. I would expect the output voltage to plateau once the device enters the triode region, not to keep rising the way the simulation shows. What am I doing wrong?
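One thing I noticed: the outputs I'm seeing scale in direct proportion to the source amplitude, as if a fixed gain were applied at every level with no clipping at all. A tiny sketch of that scaling, using |Av| ≈ 3.5 from my 1 VAC run (my number, not a simulator output):

```python
# If the output never clips, Vout = |Av| * Vin at every amplitude.
# |Av| ~ 3.5 is the gain I measured with a 1 VAC source.
Av_mag = 3.5
vins = (1.0, 10.0, 100.0)
vouts = [Av_mag * v for v in vins]
for vin, vout in zip(vins, vouts):
    print(f"Vin = {vin:6.1f} V  ->  Vout = {vout:6.1f} V")
```

The 100 V case lands near the 346 V I'm seeing, which is what makes me suspect the sweep isn't modeling the triode region at all.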
Thanks!