So I am trying to determine the amperage of a battery-powered circuit. I have 3 AA batteries in series, and when I check the current with my new multimeter it shows 0.01-0.02 when set to the 10A range, or 21.9 when set to the 200m range. So in terms of amps, what is that? Is it really 100-200 mA? If it were 1 amp, I would have expected the 10A range to show 0.10 (1/10 of 10) and the 200m range to read off the scale. I am trying to figure this out because I would like to replace the 3 batteries with a 5V USB power supply and need to know how much to step it down. This is my first digital meter and I want to make sure I am reading it correctly.
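To sanity-check how I am interpreting the two range settings, I wrote this little script. It assumes what I think is typical for a DMM: the displayed number is in the units of the selected range (mA on the "200m" range, A on the "10A" range) — please correct me if that assumption is wrong:

```python
def to_amps(display_value, range_label):
    """Convert a raw multimeter display reading to amps, assuming the
    display shows the value in the selected range's units."""
    if range_label == "200m":
        return display_value / 1000.0  # assumed: display is in mA
    if range_label == "10A":
        return display_value           # assumed: display is already in A
    raise ValueError(f"unknown range: {range_label}")

print(to_amps(21.9, "200m"))  # about 0.022 A if the 200m range reads in mA
print(to_amps(0.02, "10A"))   # 0.02 A, which would agree with the 200m reading
```

If that assumption holds, the two readings agree with each other, but I'd like someone to confirm.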
Thanks