MCU analog input sensitivity

Thread Starter

mah

Joined Mar 15, 2010
393
I have an analog signal ranging from 0 to 10 V and I want to feed it into the microcontroller, but I think the maximum input to the MCU is 5 V. How do I solve this?
 

Thread Starter

mah

Joined Mar 15, 2010
393
If I use a voltage divider to convert it to 0-5 V, will it still sense 0.1 V, which corresponds to 0.2 V before the divider?
 

bertus

Joined Apr 5, 2008
22,278
Hello,

Yes, with a voltage divider of 2:1 the 10 volts becomes 5 volts and 0.2 volts becomes 0.1 volts.

Bertus
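
The scaling can be sketched with a couple of lines of Python. The resistor values below are assumptions (any two equal resistors give a 2:1 ratio); pick real values to suit your ADC's input-impedance spec.

```python
# Sketch of a 2:1 resistive divider scaling 0-10 V down to 0-5 V.
# R1/R2 = 10k/10k is an assumed example, not a required value.
R1 = 10_000  # ohms, top resistor from the signal source
R2 = 10_000  # ohms, bottom resistor to ground

def divider_out(vin):
    """Voltage at the R1/R2 junction for a given input voltage."""
    return vin * R2 / (R1 + R2)

print(divider_out(10.0))  # -> 5.0
print(divider_out(0.2))   # -> 0.1
```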
 

Thread Starter

mah

Joined Mar 15, 2010
393
What is the smallest value that could be read? Could it read signals in mV, I mean values just above 0?
 

bertus

Joined Apr 5, 2008
22,278
Hello,

The lowest measurable voltage depends on the number of bits in the ADC.
With a full-scale range of 5 volts, an 8-bit ADC gives you 5/256 = 19.53 mV per step.
A 10-bit ADC gives you 5/1024 = 4.883 mV.
A 12-bit ADC gives you 5/4096 = 1.221 mV.

Bertus
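
The step sizes above follow from one formula, LSB = Vref / 2^bits. A minimal sketch, assuming a 5 V reference:

```python
# Smallest voltage step (1 LSB) for an ADC with an assumed 5 V full scale.
VREF = 5.0  # reference / full-scale voltage (assumption)

def lsb_volts(bits):
    """Size of one ADC step in volts for an ADC of the given bit width."""
    return VREF / (2 ** bits)

for bits in (8, 10, 12):
    print(f"{bits}-bit ADC: 1 LSB = {lsb_volts(bits) * 1000:.3f} mV")
```

Note this is the resolution, not the accuracy; noise and ADC error specs decide whether the last bit is actually meaningful.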
 

MikeML

Joined Oct 2, 2009
5,444
Depends on the number of bits in the ADC: 8 bits = 256 steps, 10 bits = 1024 steps, 12 bits = 4096 steps.
If the ADC reference voltage is 5 V (also the full-scale input), then 5/256, 5/1024, or 5/4096 is the smallest step the ADC can resolve.
 

nsaspook

Joined Aug 27, 2009
13,315
If you use a voltage divider, be sure to meet the input-impedance spec for the ADC. If you're just reading DC or a slowly changing signal, adding a low-value filter cap from the pin to ground will do the trick, but to read a fast-changing signal accurately you should buffer the signal with an op-amp.
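
A quick sanity check of the two points above can be sketched numerically. The 10 kΩ source-impedance limit is an assumed example figure (many MCU datasheets quote something in this range); use the number from your own datasheet, and treat the 100 nF cap as an illustrative choice.

```python
# Check the divider's Thevenin source impedance against an assumed ADC
# limit, and the low-pass corner frequency a filter cap would add.
import math

R1 = 10_000       # ohms, top divider resistor (assumed)
R2 = 10_000       # ohms, bottom divider resistor (assumed)
R_ADC_MAX = 10_000  # ohms, assumed max recommended ADC source impedance

# Impedance the ADC pin sees is R1 parallel with R2.
r_source = (R1 * R2) / (R1 + R2)
print(r_source)  # -> 5000.0, within the assumed 10 k limit

# A cap from the pin to ground forms an RC low-pass with r_source.
C = 100e-9  # 100 nF (assumed)
f_corner = 1 / (2 * math.pi * r_source * C)
print(f"corner frequency ~ {f_corner:.0f} Hz")  # fine for DC, not for fast signals
```

If the corner frequency lands below your signal bandwidth, that is the cue to add a buffer instead of relying on the cap.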
 