ADC Question

Discussion in 'General Electronics Chat' started by RobD, Dec 23, 2013.

  1. RobD

    Thread Starter New Member

    Dec 14, 2013
    12
    0
    Hi all,

     My question is: when I look through ADC datasheets, they seem to account for any form of varying positive voltage, but say nothing about an AC input. How is an AC input converted and sent out over the serial interface?
    Happy Holidays

    Rob
     
  2. GopherT

    AAC Fanatic!

    Nov 23, 2012
    6,050
    3,813
    Do you have a datasheet for a random part that we can use as the basis for our discussion?
     
    RobD likes this.
  3. RobD

    Thread Starter New Member

    Dec 14, 2013
    12
    0
  4. crutschow

    Expert

    Mar 14, 2008
    13,014
    3,234
     Some ADCs will convert a plus-or-minus input, so you can feed them an AC signal directly. If an ADC only handles positive voltages, then you have to DC-offset the AC voltage by half of the ADC's maximum input voltage. In that case, half of full-scale on the output becomes the 0 V point for the AC input. The offset can be applied with series-capacitor AC coupling or with an op amp that adds the offset to the input.
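
     In firmware, undoing that bias is just a subtraction. Here's a minimal sketch, assuming a hypothetical unipolar 8-bit ADC with the AC input biased to mid-scale (code 128):

     Code (C):
     #include <stdint.h>
     #include <stdio.h>

     /* Hypothetical unipolar 8-bit ADC: raw codes run 0..255, and the
        AC input is biased so that 0 V lands at mid-scale (code 128). */
     #define ADC_MIDSCALE 128

     /* Subtract the mid-scale bias to recover a signed sample. */
     static int16_t adc_to_signed(uint8_t raw)
     {
         return (int16_t)raw - ADC_MIDSCALE;   /* -128 .. +127 */
     }

     int main(void)
     {
         const uint8_t raw_codes[] = { 0, 64, 128, 192, 255 };
         for (size_t i = 0; i < sizeof raw_codes; i++)
             printf("raw %3d -> signed %4d\n", raw_codes[i], adc_to_signed(raw_codes[i]));
         return 0;
     }

     Subtracting mid-scale in software just undoes the DC bias you added in front of the ADC.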
     
    RobD likes this.
  5. RobD

    Thread Starter New Member

    Dec 14, 2013
    12
    0
     Ah, thank you for the biasing suggestions! That will probably save me a good chunk of time. Just as a bit of closure on the subject: if I were to use an 8-bit ADC that could handle a + and - input, would the output of each polarity have the resolution of 4 bits? As in, 4 bits for the negative voltages and 4 bits for the positive voltages?
     
  6. eeabe

    Member

    Nov 30, 2013
    59
    9
     You would get about 7 bits of magnitude plus a sign bit, just like an 8-bit signed number that ranges from -128 to 127.
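
     A quick numeric sketch (the +/-2.5 V span below is just an assumed example, not from any particular datasheet) shows why:

     Code (C):
     #include <stdio.h>

     int main(void)
     {
         /* Hypothetical 8-bit bipolar ADC spanning -2.5 V .. +2.5 V. */
         const double v_min = -2.5, v_max = 2.5;
         const int bits = 8;
         const int codes = 1 << bits;                 /* 256 codes: -128 .. +127 */
         const double lsb = (v_max - v_min) / codes;  /* one step, in volts */

         printf("LSB step: %.4f V\n", lsb);           /* ~19.5 mV for both polarities */
         printf("negative codes: %d, non-negative codes: %d\n", codes / 2, codes / 2);
         return 0;
     }

     Each polarity gets half of the 256 codes, but every code is still one full-size step, so you lose one bit of range per polarity, not half the bits.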
     
    RobD likes this.
  7. RobD

    Thread Starter New Member

    Dec 14, 2013
    12
    0
    Ahhhhhhh, this makes much more sense, thank you kind sirs! Happy holidays!

    Rob
     