Discussion in 'Embedded Systems and Microcontrollers' started by mah, Apr 27, 2015.

1. ### mah Thread Starter Active Member

Mar 15, 2010
276
2
I have an analog signal ranging from 0 to 10 V that I want to input to the microcontroller, but I think the maximum input to the MCU is 5 V. How do I solve this?

2. ### bertus

Apr 5, 2008
15,648
2,346
Hello,

Use a voltage divider.

Bertus

3. ### mah Thread Starter Active Member

If I use a voltage divider to convert it to 0-5 V, will it still sense 0.1 V, which means 0.2 V before the divider?

4. ### bertus
Hello,

Yes, with a voltage divider of 2:1 the 10 volts becomes 5 volts and 0.2 volts becomes 0.1 volts.

Bertus

5. ### mah Thread Starter Active Member

What is the least value that could be read? Could it read a signal in mV, I mean the values just above 0?

6. ### bertus
Hello,

The lowest measurable voltage will depend on the number of bits in the ADC.
With a full range of 5 volts, an 8-bit ADC will give you 5/256 = 19.53 mV.
A 10-bit ADC will give you 5/1024 = 4.883 mV.
A 12-bit ADC will give you 5/4096 = 1.221 mV.

Bertus

7. ### MikeML AAC Fanatic!

Oct 2, 2009
5,450
1,066
Depends on the # of bits in the ADC: 8bits=256 steps, 10bits = 1024 steps, 12 bits = 4096 steps.
If the ADC reference voltage is 5 V (also the full-scale input), then 5/256 V, 5/1024 V, or 5/4096 V is the smallest step the ADC can resolve.

8. ### nsaspook AAC Fanatic!

Aug 27, 2009
2,908
2,168
If you use a voltage divider, be sure to meet the source impedance spec for the ADC. If you're just reading DC or a slowly changing signal, adding a low-value filter cap from the pin to the ground reference will do the trick, but to read a fast-changing signal accurately you should buffer the signal (for example with an op-amp follower).