I'm building a discrete op amp for a school project. There are a number of requirements for the circuit, but the immediately relevant ones are:
1. 3 dB frequency response from DC to 15 MHz
2. Output range of ±10 V
3. Open-loop gain of 1000-1500
Our professor has heavily implied that he's not necessarily expecting us to completely meet spec (e.g. lengthy in-class discussions about the "good, fast, cheap, pick two" principle). However, I figure asking forgiveness rather than permission only works if you can eliminate the more egregious errors in the circuit, like the one I'm running into right now.
This is what I have right now, with a gain of ~1050 and meeting the frequency response requirements:
http://i51.tinypic.com/2wn1t1h.jpg
At low input voltages, the circuit seems to behave relatively nicely. These are the waveforms produced with a 0.001 V, 10 Hz input:
http://i56.tinypic.com/23phki.jpg
(blue is the output directly after the differential amplifier stage, while green is the output at the 50k load)
As the input voltage increases, though, the output waveform shifts down, dramatically so. Here's the output at 0.007 V and 10 Hz:
http://i52.tinypic.com/24zaio5.jpg
Comparing the peak-to-peak voltage of the output, the gain is still around 1050; however, the output now seems to be centered around -1.6 V.
Circuit design has never been my strong point, so I'm stumped as to why the output is behaving this way. What am I not accounting for that's causing the output to shift as the input voltage rises?
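For reference, here's a small sketch of the arithmetic behind the numbers above. The waveform is an idealized sine standing in for my simulated output; the gain, DC shift, and input level are the figures quoted in the post, so this just shows how the peak-to-peak gain and the offset separate out:

```python
import math

# Figures from the post (the -1.6 V offset is the observed shift at the 50k load).
GAIN = 1050.0   # measured gain, from peak-to-peak comparison
OFFSET = -1.6   # observed DC shift of the output
V_IN = 0.007    # input amplitude in volts

# One full cycle of an idealized output sine with that gain and offset.
samples = [GAIN * V_IN * math.sin(2 * math.pi * t / 100) + OFFSET
           for t in range(100)]

v_pp = max(samples) - min(samples)   # peak-to-peak output swing
gain = v_pp / (2 * V_IN)             # gain computed from Vpp, as in the post
dc = sum(samples) / len(samples)     # mean of the waveform = its DC offset

print(round(gain), round(dc, 2))     # → 1050 -1.6
```

The point being that the peak-to-peak measurement is blind to the offset, which is why the gain still looks fine while the output centre drifts.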