I'm curious what level of accuracy people can generally expect from microcontroller ADC inputs.
I've been working on a current measurement project which gave me inconsistent readings, especially at the bottom end of the input range, so I did some tests to assess my ADC accuracy.
A few notes:
- All of the measurements were done with a simple trim-pot voltage divider as the test signal, not my current-measurement circuit, which could have inconsistencies of its own.
- The Arduino was running 64x oversampling, theoretically delivering the equivalent of 13-bit resolution from the Uno's 10-bit converter. I did other tests and examined raw readings to confirm that the oversampling was working properly.
- The Arduino was powered from a 9V input, and the internal 5V rail (also used as AREF) measured exactly 5.00V every time I checked it during the tests.
- External voltage measurements were made with an Extech EX330 multimeter.
Although the absolute error is only a little worse at the bottom end of the scale than in other regions, the error as a percentage of the measured value gets ridiculous. Being off by >10mV on a 20mV signal is really bad!
For the moment, my lesson here is that I should not take any reading below 100-200mV very seriously! I'm curious what you all think of these error levels. Does this seem normal?