I'm building a circuit with a 24-bit ADC (overkill, I know... but 10 bits isn't enough and my instructor had a Cirrus Logic CS5528 on hand). The CS5528 has two methods of calibration: self calibration and system calibration. For self calibration, everything happens internally, but the gain calibration can't be 100% accurate because the reference voltage is only 2.5V, so on a full 5V scale there can be errors (not too sure exactly how this works, but that's what the datasheet says). At 2.5V full scale, the internal calibration works fine.

To get the most accurate gain calibration on a 5V full-scale range, I need to apply 5V to the input pin of interest. To do this, I'm thinking I'll need a transistor? If I use an NPN transistor, controlled by a PIC, to put 5V on the pin, will the voltage actually reach 5V at the pin (i.e., 0V drop across the transistor)? Or do I need to tackle this another way? I also need to connect ground for the offset calibration.
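Here's my back-of-the-envelope worry about the transistor drop, assuming typical datasheet values for a generic small-signal NPN (the numbers are rough, not from any specific part I've picked):

```latex
% Saturated NPN switch: the output still sits below the 5V rail by V_{CE(sat)}
V_{out} = V_{CC} - V_{CE(sat)} \approx 5\,\mathrm{V} - 0.2\,\mathrm{V} = 4.8\,\mathrm{V}

% NPN emitter follower driven from a 5V PIC pin: output sits a diode drop below the base
V_{out} = V_{base} - V_{BE} \approx 5\,\mathrm{V} - 0.7\,\mathrm{V} \approx 4.3\,\mathrm{V}
```

Either way it looks like I lose at least a couple hundred millivolts, and even 0.2V is 4% of a 5V span, which seems way too big an error to do a gain calibration against. That's why I suspect I need a different approach.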
Also, I'm building this circuit at home. I can measure voltage with my little Craftsman voltmeter, but I'm not sure about its accuracy, and the precision isn't very good (xx.xx on the 20V full-scale range). I have access to better voltmeters at school, but I don't want to haul the whole circuit there since it's on a breadboard and a development board.
I appreciate your input and thanks!
Matt