ADC Question

Thread Starter

RobD

Joined Dec 14, 2013
12
Hi all,

My question is: when I look through ADC datasheets, they seem to account for any varying positive voltage, but say nothing about an AC input. How is an AC input converted for a serial interface?
Happy Holidays

Rob

crutschow

Joined Mar 14, 2008
34,450
Some ADCs will convert a plus-or-minus input, so you can feed in an AC signal directly. If the ADC only handles positive voltages, then you have to DC-offset the AC voltage by half the ADC's maximum input voltage. In that case, half of full-scale on the output becomes the 0 V point of the AC input. The offset can be added with series-capacitor AC coupling or with an op amp that sums an offset into the input.
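A short sketch of how the mid-scale offset described above works out in software, assuming (purely for illustration) a 5 V reference and an 8-bit converter; the names and values are not from the thread:

```python
# Sketch: recovering the original AC voltage from a unipolar ADC reading
# when the input was DC-offset to mid-scale before the converter.
# VREF and the 8-bit width below are illustrative assumptions.

VREF = 5.0                      # assumed ADC reference (full-scale input), volts
N_BITS = 8
FULL_SCALE = 2**N_BITS          # 256 output codes
MID_CODE = FULL_SCALE // 2      # code 128 corresponds to 0 V of the AC signal

def code_to_ac_voltage(code: int) -> float:
    """Map a raw 0..255 ADC code back to the bipolar AC input voltage.

    The AC signal was shifted up by VREF/2 before the ADC, so mid-scale
    (code 128) reads back as 0 V, code 0 as -VREF/2, and code 255 as
    just under +VREF/2.
    """
    return (code - MID_CODE) * VREF / FULL_SCALE

# code 128 -> 0.0 V, code 0 -> -2.5 V, code 255 -> about +2.48 V
```

The subtraction of `MID_CODE` is the software mirror of the hardware offset: the biasing network adds VREF/2 on the way in, and the firmware removes it on the way out.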

Thread Starter

RobD

Joined Dec 14, 2013
12
Ah, thank you for the biasing suggestions! That will probably save me a good chunk of time. Just as a bit of closure on the subject: if I were to use an 8-bit ADC that could handle a + and - input, would each polarity have a resolution of 4 bits? As in 4 bits for the negative voltages and 4 bits for the positive voltages?