Need help understanding the effects of ADC gain and offset calibration

Thread Starter

floxia

Joined Mar 7, 2023
18
I have a 24-bit ADC (CS5530) with a 2.5 V Vref coming from a voltage divider. I have seen various commercial circuits that use this method, and they all operate fine, without showing severe long-term drift due to temperature variations.

So far I have been focusing on getting everything to work correctly, and I have been able to get stable readings with only slight variations due to temperature changes, so I am quite happy with the performance of my circuit. However, after reading the datasheet (p. 21) in detail, I stumbled upon the calibration section. I researched it and found that system calibration could be useful against voltage reference drift. I do not intend to completely eliminate the drift, but it would be interesting to see whether simply calibrating the system periodically results in an overall improvement of the circuit.

What is not entirely clear to me is how to perform the zero-scale and full-scale measurements of the ADC. From what I understand, the zero-scale calibration could be achieved by internally shorting the inputs of the ADC via software, giving me 0 V? I am, however, not sure how to perform the full-scale calibration while the ADC's differential inputs are connected.
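If I understand the datasheet correctly, the flow would be something like the sketch below. The command values and the SPI helper here are placeholders of my own invention, not the actual CS5530 command bytes, so treat it as pseudocode for the sequence rather than a working driver:

```c
#include <stdint.h>

/* Placeholder SPI helper -- assumed to exist in the application. */
extern void spi_write(const uint8_t *buf, uint32_t len);

/* Hypothetical command bytes -- the real values are in the CS5530
   datasheet command table and will differ. */
#define CMD_SELF_OFFSET_CAL  0xA0  /* inputs shorted internally, offset measured */
#define CMD_SYSTEM_GAIN_CAL  0xA1  /* full-scale voltage applied externally */

static void wait_for_cal_done(void)
{
    /* The part signals calibration completion on SDO; stubbed here. */
}

void calibrate(void)
{
    uint8_t cmd;

    /* 1. Zero scale: the chip shorts its own inputs, no external help needed. */
    cmd = CMD_SELF_OFFSET_CAL;
    spi_write(&cmd, 1);
    wait_for_cal_done();

    /* 2. Full scale: a known, accurate full-scale voltage must already be
       present on AIN+/AIN- before this command is issued. */
    cmd = CMD_SYSTEM_GAIN_CAL;
    spi_write(&cmd, 1);
    wait_for_cal_done();
}
```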

Furthermore, will the full-scale input of the ADC be equal to Vref?

I also wanted to mention that I am aware there are solutions to achieve a low-drift Vref, but for the moment I would like to focus on what I currently have.

Thank you everyone, any info in this regard will be greatly appreciated!
 

DickCappels

Joined Aug 21, 2008
10,236
For measuring the offset voltage, measure at the thing you are measuring. This might mean going off to another circuit. Remember that there may be thermal voltages present in your circuit, as well as ground loops.

I am not familiar with that particular chip so I won't go further.
 

crutschow

Joined Mar 14, 2008
34,704
"...how to perform the zero-scale and full-scale measurements of the ADC."
For calibration you need a way to short the input (apply zero volts) to determine the offset, and then apply a known, accurate voltage to determine the gain.
If this needs to be done while everything is connected, an analog mux can be used to switch between the normal signal source and the calibration sources, as in the sketch below.
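To illustrate that mux arrangement, here is a minimal sketch in C. The mux_select() and adc_read_raw() calls and the VCAL value are hypothetical stand-ins for whatever the real hardware layer provides, and it assumes the correction is applied in software rather than through the ADC's own calibration registers:

```c
#include <stdint.h>

/* Hypothetical hardware layer -- replace with the real driver calls. */
typedef enum { MUX_SHORTED, MUX_VREF_CAL, MUX_SIGNAL } mux_input_t;
extern void    mux_select(mux_input_t in);
extern int32_t adc_read_raw(void);      /* signed 24-bit conversion result */

#define VCAL 2.048  /* known, accurate calibration voltage in volts (assumed) */

static int32_t offset_code;   /* code read with the input shorted        */
static double  volts_per_lsb; /* gain derived from the calibration point */

/* Take both calibration points; call periodically to track drift. */
void system_calibrate(void)
{
    mux_select(MUX_SHORTED);
    offset_code = adc_read_raw();           /* zero-scale point */

    mux_select(MUX_VREF_CAL);
    int32_t cal_code = adc_read_raw();      /* known-voltage point */

    volts_per_lsb = VCAL / (double)(cal_code - offset_code);

    mux_select(MUX_SIGNAL);                 /* back to the real signal */
}

/* Convert a raw reading to volts using the stored correction. */
double read_volts(void)
{
    return (adc_read_raw() - offset_code) * volts_per_lsb;
}
```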
"...will the full-scale input of the ADC be equal to Vref?"
Depends upon the ADC you are using.
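The only place the answer matters is in the code-to-volts conversion, which might look like the sketch below for a bipolar, signed 24-bit result. Whether full_scale_volts ends up being Vref, Vref/2, or something scaled by a gain register is exactly the part that depends on the ADC, so check the datasheet:

```c
#include <stdint.h>

/* Convert a signed 24-bit code to volts for an assumed bipolar span.
   full_scale_volts is the positive full-scale input -- whether that is
   Vref, Vref/2, or something scaled by a gain register depends on the
   particular ADC and must come from its datasheet. */
double code_to_volts(int32_t code, double full_scale_volts)
{
    return (double)code / 8388608.0 * full_scale_volts; /* 2^23 = 8388608 */
}
```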
 

MisterBill2

Joined Jan 23, 2018
19,054
For the full-scale calibration, given that this is a very high resolution A/D converter, you will need a very accurate, very stable, and very, very low noise calibration voltage source. Then you will need an adequate means of applying that voltage, by which I mean a very low noise connection arrangement. I never needed resolution better than 16 bits, and most production testing was adequate with 12 bits. Does your application really need that much resolution? Or is it to assure adequate dynamic range for your measurements?
 

MisterBill2

Joined Jan 23, 2018
19,054
Note that the A/D may have 24-bit resolution but may not have 24-bit accuracy, since the LSB may be near the thermal noise of the signal.
THAT is part of why I asked about the needs of the process. Serious research probably needs higher resolution and accuracy that is not required, or even useful, in standard production.
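As a rough sketch of how to put a number on that: effective resolution can be estimated as log2 of the input span over the measured RMS noise. The figures below are made up for illustration:

```c
#include <math.h>
#include <stdio.h>

/* Effective resolution in bits from the full-scale span and RMS noise:
   bits = log2(span / rms_noise). Example numbers are illustrative only. */
int main(void)
{
    double span_v  = 2.5;     /* assumed full-scale input span in volts */
    double noise_v = 2.0e-6;  /* assumed measured RMS noise in volts    */

    double eff_bits = log2(span_v / noise_v);
    printf("Effective resolution: %.1f bits\n", eff_bits);
    /* ~20.3 bits here -- well short of the 24-bit code width. */
    return 0;
}
```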
 