Delta Sigma Basics

Thread Starter

kdillinger

Joined Jul 26, 2009
141
I need a refresher on the basics of delta sigma converters and none of the material has made it clear, to me anyway, about the relationship with resolution, frequency of sampling, and data rate.

Suppose I have a 16 bit delta sigma ADC, a sampling clock at 100kHz and an oversampling ratio of 64. I want to calculate samples per second.

A 16 bit converter requires 2^N clocks; 65536 clocks in this case. If my sampling frequency is 100kHz and the OSR is set to 64, then I am sampling at 6.4MHz.

Samples per second would be 6.4MHz/2^16 or 97.65 samples per second.

Yes? No?
 

rogs

Joined Aug 28, 2009
279
Suppose I have a 16 bit delta sigma ADC, a sampling clock at 100kHz and an oversampling ratio of 64.
You probably need to think of a delta sigma ADC as a 1 bit converter, not 16 bit.

See if THIS explanation of the concept is of any help - although, as you might imagine, it's describing the same principle as the previous suggestion.
 

Thread Starter

kdillinger

Joined Jul 26, 2009
141
To clarify, with another example:
An internal oscillator provides the clock source and runs at 100kHz. Assume a second-order, 14 bit delta-sigma requires 2^14 clocks to produce a single answer (sample). The conversion time for that sample is 2^Nbits/FOSC, in this case 2^14/100kHz = 163.84ms. So that is about 6.1 samples per second.
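The arithmetic above can be checked with a short sketch. Note that the 2^N-clocks-per-sample model is itself the assumption being questioned elsewhere in this thread; the code only verifies the numbers, not the model.

```python
# Check the conversion-time arithmetic, assuming the (questionable)
# model that an N-bit conversion takes 2^N clocks.
N_BITS = 14
F_OSC = 100_000  # Hz, internal oscillator

clocks_per_sample = 2 ** N_BITS               # 16384
conversion_time = clocks_per_sample / F_OSC   # seconds per sample
samples_per_second = 1 / conversion_time

print(f"conversion time: {conversion_time * 1e3:.2f} ms")  # 163.84 ms
print(f"samples/second:  {samples_per_second:.1f}")        # 6.1
```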
 

crutschow

Joined Mar 14, 2008
34,284
A delta-sigma converter takes a very large number of crude 1-bit samples of the signal and then digitally averages them to get a digital word.
There is not a direct correspondence between the oversample rate and the number of bits in each averaged output word. The oversample rate is indirectly related to the number of resolution bits as higher resolution tends to require a higher oversample rate, but it's also related to the modulator order and other internal details of the converter.

The oversample rate in a delta-sigma converter refers to the number of samples taken for each output word, not each output bit. The number of resolution bits is not used in that calculation.
Thus, for an oversample ratio of 64 with a converter that outputs 100k words/sec, the modulator sample rate is 6.4MHz (no bits involved).
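The rate arithmetic in that paragraph is just a multiplication; a minimal sketch, using the thread's numbers:

```python
# OSR relates the modulator (1-bit) sample rate to the output word
# rate; the resolution in bits does not enter the calculation.
OSR = 64
WORD_RATE = 100_000                # output words per second

modulator_rate = OSR * WORD_RATE   # 1-bit samples per second
print(modulator_rate)              # 6400000, i.e. 6.4 MHz
```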

Your example in Post #6 is incomplete since it does not state the oversample rate.
How do you know that it takes one clock pulse per bit to generate the output word?
You would need to know the clock pulses per bit sample which is likely much more than one.
 