ADC Calibration Circuit

Thread Starter

dksoba

Joined Jan 8, 2010
25
I'm building a circuit with a 24-bit ADC (overkill, I know... but 10 bits isn't enough and my instructor had a 24-bit ADC, the Cirrus CS5528, on hand). The CS5528 has two methods of calibration: self calibration and system calibration. With self calibration everything happens internally, but the gain calibration can't be 100% accurate because the reference voltage is only 2.5V, so on a full 5V scale there can be errors (I'm not too sure exactly how this works, but that's what the datasheet says). At 2.5V full scale it can do the internal calibration fine.

To get the most accurate gain calibration for a 5V full-scale range, I need to apply 5V to the input pin of interest. To do this, I'm thinking I'll need a transistor? If I use an NPN transistor, controlled by a PIC, to put 5V on the pin, will the voltage actually reach 5V at the pin (i.e., 0V drop across the transistor), or do I need to tackle this another way? I also need to connect the input to ground for the offset calibration.
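
Roughly the sequence I'm picturing, in case it helps (just a sketch; select_cal_input() and the two spi_system_*_cal() functions are placeholder names for whatever actually switches the input and sends the CS5528 calibration commands, not the real part's command set):

Code:
#include <stdio.h>

/* Calibration sequence sketch.  The function names are made up for
 * illustration; on real hardware they would drive a PIC pin that switches
 * the ADC input (transistor or analog switch) and send the CS5528's
 * calibration commands over SPI. */

enum cal_input { CAL_GND, CAL_5V_REF, CAL_SIGNAL };

static void select_cal_input(enum cal_input in) { printf("switch -> %d\n", (int)in); }
static void spi_system_offset_cal(void)         { printf("offset cal\n"); }
static void spi_system_gain_cal(void)           { printf("gain cal\n"); }

int main(void)
{
    select_cal_input(CAL_GND);      /* input tied to 0 V                */
    spi_system_offset_cal();        /* chip measures its zero offset    */

    select_cal_input(CAL_5V_REF);   /* input tied to the full-scale 5 V */
    spi_system_gain_cal();          /* chip measures its gain error     */

    select_cal_input(CAL_SIGNAL);   /* back to the real measurement     */
    return 0;
}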

Also, I'm building this circuit at home. I can measure the voltage with my little Craftsman voltmeter, but I'm not sure about its accuracy, and the precision isn't very good (xx.xx on the 20V range). I have access to better voltage meters at school, but I don't want to take my whole circuit there because it's on a breadboard and a development board :(.

I appreciate your input and thanks!
Matt
 

beenthere

Joined Apr 20, 2004
15,819
There is so much stray capacitance and such on a breadboard that you may have a hard time getting stable results. You might only be able to get a useful 12 bit output.

Typically, you present the converter with 0 volts and the full-scale voltage, and see if the output is 000000h and FFFFFFh (you have to assume linearity). Even with a precision voltage source and a 6 1/2 digit bench meter, the best you can do is come up with a 5.00000 volt input - and that's still not adequate for the precision of the converter.

24 bits gives 16,777,216 states. For a 5 volt input, the low-order bit is equivalent to 0.0000003 volts. 300 nanovolts is hard to resolve.
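
To put numbers on that, here's a rough sketch of the LSB size and of the usual two-point (zero / full-scale) correction; the raw codes in it are invented, just for illustration:

Code:
#include <stdio.h>

#define FULL_SCALE_V  5.0
#define ADC_BITS      24

int main(void)
{
    double counts = (double)(1UL << ADC_BITS);   /* 16,777,216 states          */
    double lsb    = FULL_SCALE_V / counts;       /* about 0.0000003 V (300 nV) */
    printf("LSB = %.9f V\n", lsb);

    /* Two-point correction: record the raw code with the input grounded and
     * with full scale applied, then map later readings onto the ideal
     * 000000h..FFFFFFh range.  These example codes are made up. */
    double code_zero = 1200.0;        /* raw code with input at 0 V       */
    double code_full = 16770000.0;    /* raw code with 5.00000 V applied  */
    double raw       = 8400000.0;     /* some later measurement           */

    double corrected = (raw - code_zero) * (counts - 1.0) / (code_full - code_zero);
    double volts     = corrected * FULL_SCALE_V / (counts - 1.0);
    printf("corrected code = %.0f  (%.6f V)\n", corrected, volts);
    return 0;
}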
 

rjenkins

Joined Nov 6, 2005
1,013
You will need to use a high-quality analog switch to swap the ADC input from the signal you are measuring to the 5V calibration reference.

You also need a 5V reference that is at least as accurate, tolerance-wise, as the readings you want to take.

The preferred way of getting accurate readings with an ADC (where possible) is to use it in 'ratiometric' mode. This means using the same reference voltage for both the signal source (e.g. a pot or transducer) and the ADC reference.

If you can do that, changes or drift of the reference voltage do not affect the reading accuracy.
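
A quick way to see why (a rough sketch, assuming a simple pot divider running off the same rail that the ADC uses as its reference):

Code:
#include <stdio.h>

/* Ratiometric reading: the code depends only on the divider ratio, not on
 * the absolute rail voltage, because the same rail is the ADC reference. */
static unsigned long adc_code(double v_rail, double divider_ratio)
{
    double v_in  = v_rail * divider_ratio;   /* the signal follows the rail    */
    double v_ref = v_rail;                   /* the reference is the same rail */
    return (unsigned long)(v_in / v_ref * 16777215.0 + 0.5);
}

int main(void)
{
    /* Same 40% divider, rail drifts from 5.00 V down to 4.90 V:
     * the code comes out the same both times. */
    printf("%lu\n", adc_code(5.00, 0.40));
    printf("%lu\n", adc_code(4.90, 0.40));
    return 0;
}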

Here is some useful info on ADCs; figure 8 and the text above it cover the ratiometric setup:
http://www.maxim-ic.com/app-notes/index.mvp/id/748
 

Thread Starter

dksoba

Joined Jan 8, 2010
25
Yes, I'm planning to use ratiometric mode. I think I can use my +5V rail as the reference voltage and then use the "2.5V" mode, even though my reference is 5V.

I'll try that now I guess...

Edit: Tried that, and it works great. Using some reasonably high-precision resistors (0.1%) to make different voltage dividers, I get good results. I can calibrate internally in the 2.5V mode even though 5V is my reference; according to the datasheet, the ADC can calibrate accurately in that mode. Since both my voltage divider and my reference voltage run off the same 5V rail, the setup is ratiometric. Also, I don't need 24 bits of precision; 12 or 13 bits is probably more than enough.
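
Rough worst-case numbers for those 0.1% dividers, in case anyone's curious (a sanity-check sketch only, assuming a divider made from two nominally equal resistors):

Code:
#include <stdio.h>

int main(void)
{
    /* Divider ratio R2/(R1+R2) with nominal 10k/10k resistors.
     * Worst case: one resistor 0.1% low, the other 0.1% high. */
    double r1    = 10000.0 * 0.999;
    double r2    = 10000.0 * 1.001;
    double ratio = r2 / (r1 + r2);     /* nominal would be exactly 0.5 */
    double err   = ratio - 0.5;        /* absolute ratio error         */

    printf("ratio = %.6f, error = %.4f%% of the rail\n", ratio, err * 100.0);
    /* About 0.05% of the rail, roughly 1 part in 2000: the resistor
     * tolerance, not the ADC, limits how closely the readings can be
     * expected to match the calculated divider ratio (around 11 bits). */
    return 0;
}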

Matt
 