I built a simple DC amplifier using a dual low-noise AD708 op amp as a 1:1 input buffer stage followed by an inverting amplifier. However, I am experiencing large variations in gain (0-1000) and offset (+/- 10 volts) for millivolt changes in power supply voltage. The DC amplifier has a maximum gain of 1000 (60 dB) with integration times variable from 0.001 second to 1 second. It uses a +/- 12 volt regulated power supply. I have measured the changes in gain and offset as a function of supply voltage. Keeping the negative supply at -12 volts, I found that I must hold the positive supply voltage to within 100 millivolts in order to keep the gain constant. In my application, I can live with a slowly varying offset but not with large gain variations. My questions are as follows. (1) Is it normal to find such large variations in gain and offset in high-gain DC amplifiers? (2) Is there any way to reduce the gain variations through a better amplifier design, e.g., using an instrumentation amplifier or a chopper-stabilized op amp?
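As a sanity check, the supply sensitivity you measured can be compared with what the op amp's power-supply rejection ratio (PSRR) alone would predict at the output. Below is a minimal sketch of that calculation; the 110 dB PSRR figure is an assumed placeholder, not the AD708 spec, so substitute the value from the datasheet.

```python
# Hedged sketch: estimate the output drift caused by a supply change,
# attributing it entirely to the op amp's PSRR.
# The PSRR value used below is an ASSUMED placeholder -- check the
# AD708 datasheet for the actual specification.

def output_drift_from_supply(delta_supply_v, psrr_db, closed_loop_gain):
    """Input-referred error = delta_supply / 10^(PSRR/20),
    then multiplied by the closed-loop gain to refer it to the output."""
    input_referred = delta_supply_v / (10 ** (psrr_db / 20.0))
    return input_referred * closed_loop_gain

# 100 mV supply change, assumed 110 dB PSRR, gain of 1000 (60 dB)
drift = output_drift_from_supply(0.1, 110.0, 1000)
print(f"Predicted output drift: {drift * 1e3:.3f} mV")  # well under 1 mV
```

If the drift you observe is orders of magnitude larger than an estimate like this, the sensitivity is probably not coming from the op amp's PSRR itself, which may help narrow down where to look in the design.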