gain stability over 1.25V-20V output range

Thread Starter

Harley Burton

Joined Jan 1, 2018
12
I'm attempting to convert a 12-bit DAC output to a range of 0-20V (I only really care about 1.25-20V). I have a reference voltage of 4.096V, and I intend to use values 1250 (dec) through 4000 (dec) (1.25V-4V) to get a linear output between 1.25 and 20V from an old LM324. I need a near-perfect gain of 5, but I'm learning that op-amp gain doesn't seem to be stable across a range.
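For reference, the intended code-to-voltage mapping can be sketched as below. This is a minimal illustration assuming an ideal DAC and an ideal gain-of-5 stage; `dac_to_output` is a hypothetical helper, not anything from an actual driver.

```python
# Sketch of the DAC-code-to-output mapping described above.
# Assumes a 4.096 V reference, a 12-bit DAC, and an ideal gain of 5.
VREF = 4.096     # DAC reference voltage (V)
BITS = 12        # DAC resolution
GAIN = 5.0       # target amplifier gain

def dac_to_output(code):
    """Ideal amplifier output voltage for a given DAC code (0..4095)."""
    vin = VREF * code / 2**BITS   # voltage at the DAC output / amp input
    return GAIN * vin             # amplified output

# Codes 1250 and 4000 put 1.25 V and 4.0 V at the amp input:
print(dac_to_output(1250))   # -> about 6.25 V out
print(dac_to_output(4000))   # -> about 20.0 V out
```

Note that with a fixed gain of 5, the low end of the code range lands at 6.25 V out; reaching 1.25 V out would take a code near 250.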

I've tried using a 5k 20-turn trim pot for my feedback loop for the op amp. If I tune it to give me a perfect 10.000V output with a 2.000V non-inverting input, and then set the input to 0.400V, I would expect to see an output of 2.00V. What I'm seeing is 1.94V output. If I set the input to 3.000V, I get 15.12V output. It isn't drift, because I can always go back to the input voltage that I used to calibrate, and the output is perfect.

I know there is a big difference between how we learn an op amp works and how one actually behaves, but I've never read anything about the gain being inconstant across a range, so I feel like I'm doing something wrong.

Is there a circuit that will give me a consistent and accurate gain of 5 across those input and output ranges? I would prefer to use the LM324N because I have a box of them, but I can buy components if I have to. My supply line is also not set in stone. I'm working from a bench supply right now, and I plan to use a very good quality transformer that I have with 19.36V and 36V (6A) secondaries, and a low-current 21.4V tap.

Thanks for any help.
 

crutschow

Joined Mar 14, 2008
34,470
Have you tried different LM324's from your stash to see if there's a difference?
LM324 is not a very good op amp and you may need to go with one that has better specs (especially input offset voltage).

Post your test circuit schematic.
 

ebp

Joined Feb 8, 2018
2,332
Input offset voltage is almost certainly the problem. The spec for the LM324 is ±2 mV typical and ±7 mV maximum. This offset is multiplied by the amplifier gain. The input bias current is fairly low, but bias current through input source resistance causes another voltage error term.
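As a quick sanity check on that claim, here is the worst-case output error from offset alone, using the LM324 specs quoted above and the noninverting gain of 5 from the original post:

```python
# Output error contributed by input offset voltage in a noninverting amp.
# The offset appears at the input and is amplified by the full noise
# gain (1 + Rf/Rg), which equals 5 in this circuit.
GAIN = 5.0
vos_typ = 2e-3   # LM324 typical input offset voltage (V)
vos_max = 7e-3   # LM324 maximum input offset voltage (V)

print(GAIN * vos_typ * 1e3, "mV")   # ~10 mV typical output error
print(GAIN * vos_max * 1e3, "mV")   # ~35 mV worst-case output error
```

That 35 mV worst case is roughly half of the 60 mV error reported at the 2 V point, which is what the next post works through.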
 

ebp

Joined Feb 8, 2018
2,332
I should know by now to say nothing whatever without an accurate schematic. Not only do we need a schematic, but we need to know how the circuit is built physically, which includes how the amplifier is connected to the DAC.

The error at what should be 2 V is only 60 mV. It is reasonably safe to assume the gain is close to the required 5, since it was set at "full scale", but offset voltage alone still only accounts for about half of the error. As long as there is offset, the only way to correctly set the gain is by a tedious iterative approach of tweaking for Δout/Δin.

If the iterative approach to setting gain is acceptable, and offset is the problem, the offset can be nulled by adding a constant to the value set by the DAC.
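The iterative tweak can also be replaced by a direct two-point solve: measure the output at two DAC settings, fit Vout = g·Vin + Voff, and convert the offset into a DAC-count correction. A sketch, using the two measurements quoted earlier in the thread (the variable names are mine, not from any library):

```python
# Two-point calibration: solve Vout = g*Vin + voff from two measurements,
# then compute the DAC-count correction that nulls the offset.
VREF, BITS = 4.096, 12
LSB = VREF / 2**BITS                  # volts per DAC count at the amp input

# (Vin, Vout) pairs reported in the first post:
p1 = (0.400, 1.94)
p2 = (2.000, 10.00)

g = (p2[1] - p1[1]) / (p2[0] - p1[0])     # actual gain, from the slope
voff = p1[1] - g * p1[0]                  # output-referred offset (V)
counts_to_add = round(-voff / (g * LSB))  # DAC counts that cancel voff

print(f"gain = {g:.4f}, offset = {voff*1e3:.1f} mV out, add {counts_to_add} counts")
```

With those numbers the gain comes out near 5.04 and the offset around -75 mV at the output, i.e. about 15 counts to add to every DAC code.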

I would not even consider an LM324 for a precision application. The 324A is less imprecise, but still grossly inferior to a large range of other amplifiers. The amp is not the only issue. 12 bits is one part in 4000 or 250 ppm per count. That corresponds to a temperature change of only 2.5°C for a single typical ±100 ppm/°C 1% thick-film resistor. Use two resistors and the error could double (but typical performance is much better). If you want decent, stable performance you need not only a good op amp but resistors with low temperature coefficient of resistance. Suitable resistors will cost 3-4 times as much as an LM324 (the whole IC, not just one amp).
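Spelling out that resistor-drift arithmetic, under the same assumptions (a ±100 ppm/°C thick-film resistor and the 4000-count usable span):

```python
# Drift of a single gain-setting resistor versus the size of one DAC count.
lsb_ppm = 1e6 / 4000        # one count on the 4000-count span, in ppm
tempco  = 100               # typical thick-film resistor drift, ppm/°C

delta_t = lsb_ppm / tempco  # temperature change that shifts gain by 1 LSB
print(f"1 LSB = {lsb_ppm:.0f} ppm -> one resistor drifts that far in {delta_t:.1f} deg C")
```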
 

Thread Starter

Harley Burton

Joined Jan 1, 2018
12
Thank you both. I hadn't even considered the gain applied to input offset (still learning).

I found a better 324 in my stash, and the difference is stark. Must have had a bad one because the offsets didn't even make sense.

With another 324 that seems to be better, if I tune it at 10V, I get 20.122V from 4V in, and 0.920V with 0.2V in. Dividing the errors by 5 gives suspected input offsets of 24mV and 16mV respectively. Some of this could be breadboard parasitics etc., but even if everything were perfect, I don't want a 35mV offset. I would like to be able to set the values with a resolution of 5mV, and an output accuracy of at least 10mV across the full range.

That being the case, what would be a better op amp (or another solution altogether) that I can get in a through-hole package? (not ready to deal with surface mount components)
 

crutschow

Joined Mar 14, 2008
34,470
After a short search, this is the cheapest I could find with low offset in a DIP package that can operate from a greater-than-20V supply (which you need for a 20V output).
 

ebp

Joined Feb 8, 2018
2,332
Where are your input voltage values coming from - what you believe the DAC to be delivering or voltage actually measured at the amplifier input?

Keep in mind that connecting a meter introduces errors. If your DAC output is low impedance, the resistive loading of a meter is unlikely to cause error, but sometimes meter input capacitance can cause problems, plus you have that long lead that can inject noise. Sometimes using a resistor right at the meter probe tip can give you some clues. For example, a 1k resistor will go a long way to keeping noise and capacitive effects out of a low impedance circuit. It does, of course, introduce an error because of the meter's input resistance (typically 10 megohms for most general purpose DMMs; much higher on some ranges with higher-end meters). If you have two meters, watch the output of the amplifier with one while you connect the other to the DAC output. You should see no change or only very tiny change.

There may be voltage differences in the circuit that you don't realize are there, such as in the common ("ground") connection between the DAC circuit and the amplifier. If they are very close together, this is unlikely, but if they are physically separated with connection wires between them, there can be problems. This is why it is helpful to have not only a schematic but also some photos of how things are put together. You can use a meter to help find DC differences. For example, measure between the common of the amplifier at the end of the gain-setting resistor and the common right at the DAC (probably the IC "ground" pin). Measure both DC and AC. What you want to find is zero. What you may find is millivolts.

Be sure you have good power supply decoupling throughout. Noise on supply lines can manifest in strange ways and look like signal path errors. If you have an oscilloscope, look at everything, even if you are expecting DC. Sometimes you get some nasty surprises. Scopes aren't great for high precision, but they won't tell you that a square wave that swings from 0 V to 10 V is 5 volts, like a typical DC meter will.

Add a small capacitor (say 10 nF to 100 nF ceramic) across the feedback resistor on your amp, using the shortest connections possible (i.e. as close to the IC pins as you can get it). This will reduce the AC gain. It may be unacceptable in the final circuit because it slows response, but it may provide some clues in testing.
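To see how much that cap slows things down, the -3 dB corner of the feedback network is roughly 1/(2πRfC). Taking the 5k pot mentioned earlier as Rf (a rough assumption; substitute your actual feedback resistance):

```python
import math

# Approximate -3 dB corner of the RC formed by a cap across the
# feedback resistor; above fc, the AC gain rolls off toward unity.
rf = 5e3                      # feedback resistance (the 5k pot, assumed)
for c in (10e-9, 100e-9):     # the 10 nF .. 100 nF range suggested above
    fc = 1 / (2 * math.pi * rf * c)
    print(f"C = {c*1e9:.0f} nF -> fc is roughly {fc:.0f} Hz")
```

So even the larger cap only limits bandwidth to a few hundred hertz, which is plenty for a DC-setpoint circuit but worth checking against your required settling time.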
 

Thread Starter

Harley Burton

Joined Jan 1, 2018
12
Thank you. I wouldn't have asked you to look one up if you didn't know of one, but I appreciate it because it looks like what I'm asking for isn't common. That makes me wonder if I'm going about this all wrong to begin with.
 

Thread Starter

Harley Burton

Joined Jan 1, 2018
12
Thanks for the input.
I'm working on a breadboard, so there's definitely parasitic capacitance all over the place. I see readings change just based on where on the breadboard I put the ground connection from the supply, so I'll never get the level of accuracy I want during this phase. But it looks like the components I'm using aren't going to get me there in the end either, which is exactly what breadboarding is meant to determine.

My little lab's gear isn't the greatest either, so I expect some error from that. The only meter I have that shows mV readings is an old Metra 18S; great meter in its day, but it has put on some years since its last calibration.

If I can get things mostly working around the tolerances that I'm after on the breadboard, I can start working on some home-fab PCB designs to get it done.

Edit: I failed to answer your initial question. The measurements are coming from a Metra Hit 18S (mentioned above). It shows 4.095V at my 4.096V reference, so it seems to be pretty close (at least in that voltage range). I have a Tektronix TDS 210 scope, and I've always complained about how far off its voltage measurement is. I didn't realize that scopes tend to be off like that. I also have two bench multimeters that are old but seem to be pretty good up to 2 decimal places: a Fluke 8010A and a D811. Most of my gear is old but quality equipment that has held up well, though it shows its age.
 