Impedance and DC on an ADC input?

Discussion in 'General Electronics Chat' started by spinnaker, Dec 29, 2010.

  1. spinnaker

    Thread Starter AAC Fanatic!

    Oct 29, 2009
    4,887
    1,019
    Sorry for the newbie question, but from what I remember, impedance is the "resistance" as it applies to AC.


    I have been working with PICs to measure DC voltages. The ADC wants a low source impedance, and one way to help with that is to connect a cap from the ADC input pin to ground.

    But I am measuring DC. Why is the cap required? Does it have something to do with the noise component on the DC? Or is it the fact that the DC level is actually varying a bit?
     
  2. hgmjr

    Moderator

    Jan 28, 2005
    9,030
    214
    You can use a good-quality opamp configured as a unity-gain buffer between your input and the ADC. That is what I typically do.

    hgmjr
     
  3. spinnaker

    Thread Starter AAC Fanatic!

    Oct 29, 2009
    4,887
    1,019

    Thanks, but my question was not how to do it; it was why I need it and how impedance applies to reading DC voltages.
     
  4. hgmjr

    Moderator

    Jan 28, 2005
    9,030
    214
    Sorry, I missed the question. The reason for imposing a buffer between your input signal and the ADC is that most microcontroller ADC inputs do not present an extremely high input impedance, so connecting your signal directly to the input may attenuate it, depending on the output impedance of your signal source. For example, a 10 kΩ source driving an input path with an effective impedance of 10 kΩ would read roughly half the true voltage.

    hgmjr
     
  5. spinnaker

    Thread Starter AAC Fanatic!

    Oct 29, 2009
    4,887
    1,019
    Sorry again, but how does impedance affect what is supposed to be a DC input? Impedance is supposed to be "resistance" to AC, right?
     
  6. hgmjr

    Moderator

    Jan 28, 2005
    9,030
    214
    Impedance is a term that is often used interchangeably with resistance. Strictly speaking, impedance encompasses both a real (resistive) part and an imaginary (reactive) part; for a steady DC input, the resistive part is what matters.

    Basically, the output impedance of the signal source together with the input impedance of the ADC forms a voltage divider, which attenuates the input signal. To minimize that attenuation, you would ideally decrease the output impedance of the source and increase the input impedance of the ADC.

    Since you usually have no control over either of these, the next best thing is to introduce a unity-gain stage that presents a high impedance to the incoming signal and a low impedance for connection to the ADC. The unity-gain opamp buffer provides exactly this combination of high input impedance and low output impedance.
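    A quick numeric sketch of that divider (in C), assuming illustrative values for the source and ADC input impedances; the real numbers come from your signal source and the PIC datasheet:

    #include <stdio.h>

    /* Voltage-divider attenuation at the ADC pin:
     *   V_adc = V_src * Z_in / (Z_out + Z_in)
     * The impedances below are illustrative assumptions,
     * not datasheet figures. */
    int main(void)
    {
        double v_src = 3.000;   /* true DC level from the source, volts  */
        double z_out = 10e3;    /* assumed source output impedance, ohms */
        double z_in  = 100e3;   /* assumed effective ADC input impedance */

        double v_adc = v_src * z_in / (z_out + z_in);
        double lsb   = 5.0 / 1024.0;  /* 1 LSB, 10-bit ADC, 5 V ref */

        printf("ADC sees %.4f V instead of %.4f V (about %.1f LSB low)\n",
               v_adc, v_src, (v_src - v_adc) / lsb);
        return 0;
    }

    With the buffer in place, Z_out drops to the opamp's output impedance (ohms or less), and the same arithmetic puts the error far below 1 LSB.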

    hgmjr
     
  7. Markd77

    Senior Member

    Sep 7, 2009
    2,803
    594
    In the simplest form, there is a capacitor in the ADC that is charged by the input. If the impedance/resistance of the source is too high, then charging the capacitor changes the voltage that you are trying to measure.

    I think that with PICs the capacitor stays charged, so with a high-impedance source the second reading might be more accurate than the first, and so on.
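    A rough sketch of that charging behavior (in C), assuming ballpark values for a PIC-style sample-and-hold; take the real C_HOLD and internal resistance from your part's datasheet. The sample cap charges exponentially, so settling to within 1/2 LSB of an N-bit result needs roughly t >= R * C * ln(2^(N+1)):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double c_hold = 10e-12;  /* assumed sample/hold cap, ~10 pF    */
        double r_int  = 7e3;     /* assumed internal switch resistance */
        int    nbits  = 10;      /* 10-bit ADC                         */

        double r_src[] = { 100.0, 2.5e3, 100e3 };  /* source impedances */

        for (int i = 0; i < 3; i++) {
            /* time for the cap to settle within 1/2 LSB of the input */
            double t_acq = (r_src[i] + r_int) * c_hold
                           * log(pow(2.0, nbits + 1));
            printf("Rsource = %8.0f ohm -> acquisition >= %.2f us\n",
                   r_src[i], t_acq * 1e6);
        }
        return 0;
    }

    It also shows why a second reading from a high-impedance source can look better than the first: the cap starts out already charged near the previous sample's voltage.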
     