Delta Sigma Basics

Discussion in 'General Electronics Chat' started by kdillinger, Feb 24, 2015.

  1. kdillinger

    Thread Starter Active Member

    Jul 26, 2009
    141
    3
    I need a refresher on the basics of delta sigma converters, and none of the material I've found has made the relationship between resolution, sampling frequency, and data rate clear to me.

    Suppose I have a 16 bit delta sigma ADC, a sampling clock at 100kHz and an oversampling ratio of 64. I want to calculate samples per second.

    A 16 bit converter requires 2^N clocks; 65536 clocks in this case. If my sampling frequency is 100kHz and the OSR is set to 64 then I am sampling at 6.4MHz.

    Samples per second would be 6.4MHz/2^16 or 97.65 samples per second.

    Yes? No?
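    The arithmetic above can be sanity-checked with a quick script. This only reproduces the numbers under the post's own assumption that one output word takes 2^N modulator clocks; whether that assumption holds for a delta-sigma part is exactly what's being asked:

```python
# Check the rate arithmetic under the "2^N clocks per word" assumption
# from the post (that assumption is the open question, not a given).
f_sample = 100e3          # sampling clock, Hz
osr = 64                  # oversampling ratio
n_bits = 16

f_mod = f_sample * osr    # modulator rate: 6.4 MHz
words_per_sec = f_mod / 2**n_bits

print(f_mod)              # 6400000.0
print(words_per_sec)      # 97.65625, i.e. ~97.65 samples/sec
```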
     
  2. WBahn

    Moderator

    Mar 31, 2012
    17,777
    4,804
  3. kdillinger

    Thread Starter Active Member

    Jul 26, 2009
    141
    3
  4. rogs

    Active Member

    Aug 28, 2009
    279
    37
    You probably need to think of a delta sigma ADC as a 1 bit converter, not 16 bit.

    See if THIS explanation of the concept is of any help - although, as you might imagine, it's describing the same principle as the previous suggestion.
     
  5. nsaspook

    AAC Fanatic!

    Aug 27, 2009
    2,912
    2,177
  6. kdillinger

    Thread Starter Active Member

    Jul 26, 2009
    141
    3
    To clarify, with another example:
    An internal oscillator provides a 100kHz clock source. A second order, 14 bit delta-sigma requires 2^14 clocks to produce a single answer (sample). The conversion time for that sample is 2^Nbits/FOSC, in this case 2^14/100kHz = 163.84ms. So that is about 6.1 samples per second.
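    The conversion-time arithmetic in this post, as a short sketch (same "2^N clocks per sample" assumption as stated, which later replies question):

```python
# Conversion time under the "2^N clocks per sample" assumption in the post.
n_bits = 14
f_osc = 100e3                   # internal oscillator, Hz

t_conv = 2**n_bits / f_osc      # 16384 / 100000 = 0.16384 s
print(t_conv * 1e3)             # 163.84 (ms)
print(1 / t_conv)               # ~6.1 samples per second
```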
     
  7. Brownout

    Well-Known Member

    Jan 10, 2012
    2,375
    998
    I don't think that is correct. Here is a discussion about converter latency:

    http://www.ti.com/lit/an/slyt264/slyt264.pdf
     
  8. crutschow

    Expert

    Mar 14, 2008
    13,050
    3,244
    A delta-sigma converter takes a very large number of crude 1-bit samples of the signal and then digitally averages them to get a digital word.
    There is not a direct correspondence between the oversample rate and the number of bits in each averaged output word. The oversample rate is indirectly related to the number of resolution bits as higher resolution tends to require a higher oversample rate, but it's also related to the modulator order and other internal details of the converter.

    The oversample rate in a delta-sigma converter refers to the number of samples taken for each output word, not each output bit. The number of resolution bits is not used in that calculation.
    Thus, for an oversample ratio of 64 samples per output word in a converter that delivers 100k words/sec, the signal sample rate is 6.4MHz (no bits involved).

    Your example in Post #6 is incomplete since it does not state the oversample rate.
    How do you know that it takes one clock pulse per bit to generate the output word?
    You would need to know the clock pulses per bit sample which is likely much more than one.
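    The averaging idea described above can be sketched with a toy first-order modulator. This is hypothetical illustration code, not any real part's architecture: a 1-bit stream whose average recovers the input, with a longer average (higher OSR) giving more effective bits:

```python
def ds_modulate(x, osr):
    """Toy first-order 1-bit delta-sigma modulator (sketch): the
    integrator accumulates the error between the input x (|x| < 1)
    and the fed-back 1-bit output, which toggles to track it."""
    integ, y = 0.0, 0.0
    bits = []
    for _ in range(osr):
        integ += x - y                      # delta (error), sigma (integrate)
        y = 1.0 if integ >= 0 else -1.0     # 1-bit quantizer
        bits.append(y)
    return bits

def decimate(bits):
    """Crude decimation: average the 1-bit stream into one output word."""
    return sum(bits) / len(bits)

word = decimate(ds_modulate(0.25, 4096))
print(word)   # close to 0.25; raising the OSR tightens the result
```

    For a first-order modulator, each doubling of the OSR buys roughly 1.5 bits of resolution; real converters use higher-order modulators and proper decimation filters rather than a plain average, which is why the bits-to-OSR relationship is indirect.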
     