1-bit to PCM conversion in Delta-Sigma converters

Thread Starter

amundsen

Joined Aug 27, 2015
30
Hello,

I've read some papers about delta-sigma ADCs but I still don't understand what determines the resolution of the PCM samples at the output of the converter. Does increasing the oversampling factor result in a higher resolution for the same output sampling rate?

Thank you for helping.
 

Delta Prime

Joined Nov 15, 2019
1,311
Hello there :)
Does increasing the oversampling factor result in a higher resolution for the same output sampling rate?
No. I will explain if I may.
The following four limitations exist when sampling data.
Amplitude resolution
Amplitude range
Time quantization
Time interval
Amplitude resolution is the smallest change in input signal that can be distinguished. For example, we might specify the resolution as dX.

Amplitude range is the smallest to largest input value that can be measured. For example, we might specify the range as Xmin to Xmax.

Amplitude precision is the number of distinct values from which the measurement is selected. The units of precision are given in alternatives, or bits. If a system has 12-bit precision, there are 2^12 or 4096 distinct alternatives. For example, if we use a slide pot to measure distance, the range of that pot might be 0 to 1.5 cm. If there is no electrical noise and we use a 12-bit ADC, then the theoretical resolution is 1.5 cm/4095, or about 0.0004 cm. In most systems, the resolution of the measurement is determined by noise and not by the number of bits in the ADC.

Time quantization is the time difference between one sample and the next.

Time interval is the smallest to largest time during which we collect samples. If we use a 10 Hz SysTick interrupt to sample the ADC and calculate distance, the sampling rate fs is 10 Hz, and the time quantization is 1/fs = 0.1 s. If we use a memory buffer with 500 elements, then the time interval is 0 to 50 s.
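
To make those numbers concrete, here is a quick Python sketch using the slide-pot figures from this post (the variable names are mine, purely for illustration):

```python
# A quick numeric sketch of the quantities above, using the slide-pot
# figures from this post (variable names are mine, purely illustrative).
n_bits   = 12            # ADC precision, bits
x_min    = 0.0           # cm, smallest measurable position
x_max    = 1.5           # cm, largest measurable position
fs       = 10.0          # Hz, SysTick sampling rate
n_buffer = 500           # number of samples stored

precision  = 2 ** n_bits                         # 4096 distinct alternatives
resolution = (x_max - x_min) / (precision - 1)   # ~0.0004 cm per step (no noise)
t_quantum  = 1.0 / fs                            # 0.1 s between samples
t_interval = n_buffer / fs                       # 50 s of data in the buffer

print(f"precision     = {precision} alternatives")
print(f"resolution    = {resolution:.6f} cm")
print(f"time quantum  = {t_quantum} s")
print(f"time interval = {t_interval} s")
```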
 

Thread Starter

amundsen

Joined Aug 27, 2015
30
I might be wrong, but I think delta-sigma converters work in a different way: they produce a 1-bit stream first, then convert this stream into a multi-bit (PCM) stream. The question is how that conversion occurs and how the final PCM resolution is determined.
 

bogosort

Joined Sep 24, 2011
696
I've read some papers about delta-sigma ADCs but I still don't understand what determines the resolution of the PCM samples at the output of the converter. Does increasing the oversampling factor result in a higher resolution for the same output sampling rate?
The output sample rate (the target rate) must be lower than the input sample rate. The general idea is this: after the conversion, we trade excess bandwidth (oversampling) for increased resolution. Getting n extra bits of resolution requires an oversampling ratio of 4^n.

So, for example, to get one extra bit of resolution we increase the sample rate by a factor of 4. This increases the sampling bandwidth by a factor of 4: we're taking four samples for every one we actually need at the output rate. But since the signal's bandwidth hasn't changed, we don't need that extra bandwidth, and we can do a sort of average over each group of four samples (called decimation) to produce one output sample. Intuitively, we're using the extra information in the oversampled signal to produce a higher-resolution output at the original rate.
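
Here is a rough Python sketch of that oversample-and-average idea (illustrative only: the 8-bit quantizer, the dither, and the simple block average are my own assumptions, and a real decimator uses a proper low-pass filter):

```python
import numpy as np

# Oversample a test tone, quantize it coarsely (8 bits here, my own choice),
# then average each group of 4 samples down to the output rate.  With a
# little dither, the averaged output has roughly half the quantization
# noise, i.e. about one extra bit -- matching the 4^n rule above.
rng = np.random.default_rng(0)

fs_out = 1_000                      # target output rate, Hz
osr = 4                             # oversampling ratio for ~1 extra bit
fs_in = fs_out * osr                # actual ADC sample rate

t = np.arange(fs_in) / fs_in                      # one second of samples
signal = 0.3 * np.sin(2 * np.pi * 5 * t)          # 5 Hz test tone
lsb = 2.0 / 256                                   # 8-bit quantizer over +/-1
dither = rng.uniform(-0.5, 0.5, signal.shape) * lsb
coarse = np.round((signal + dither) / lsb) * lsb  # coarse oversampled samples

decimated = coarse.reshape(-1, osr).mean(axis=1)  # average groups of 4
target = signal.reshape(-1, osr).mean(axis=1)     # ideal output for comparison

print("rms error before averaging:", np.std(coarse - signal))
print("rms error after averaging: ", np.std(decimated - target))
```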

Note that this doesn't have anything to do with the delta-sigma topology -- you can use oversampling/decimation with a SAR ADC to achieve the same thing. We tend to associate it with delta-sigma ADCs because oversampling is part of the design: run a low-bit modulator at a highly oversampled rate, then decimate its output into a much longer PCM word. The process is even more efficient in delta-sigma converters because of the noise-shaping feedback built into the topology, which simultaneously acts like a low-pass filter on the signal (the sigma part is an integrator) and a high-pass filter on the quantization noise (the delta part is a differentiator). By leveraging the properties of oversampling and noise shaping, a delta-sigma ADC with a 5-bit modulator can produce a 24-bit PCM word.
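
For a feel of the numbers, here is a small Python calculation using the standard textbook approximation for an ideal modulator; the 3rd-order / 5-bit / OSR = 64 figures are just example values I picked, and real parts lose a few bits to thermal noise and circuit imperfections:

```python
import math

# Standard textbook approximation for the peak SQNR of an ideal L-th order
# delta-sigma modulator with an n-bit internal quantizer and oversampling
# ratio OSR (ignores thermal noise and all circuit non-idealities):
#   SQNR ~ 6.02*n + 1.76 + 10*log10((2L+1)/pi^(2L)) + (2L+1)*10*log10(OSR)
def delta_sigma_sqnr_db(n_bits, order, osr):
    base = 6.02 * n_bits + 1.76
    shaping = 10 * math.log10((2 * order + 1) / math.pi ** (2 * order))
    oversampling = (2 * order + 1) * 10 * math.log10(osr)
    return base + shaping + oversampling

def enob(sqnr_db):
    # effective number of bits implied by a given SQNR
    return (sqnr_db - 1.76) / 6.02

# Example values (mine): a 3rd-order modulator, 5-bit quantizer, OSR = 64
sqnr = delta_sigma_sqnr_db(n_bits=5, order=3, osr=64)
print(f"SQNR ~ {sqnr:.1f} dB, ENOB ~ {enob(sqnr):.1f} bits")   # ~137 dB, ~22 bits
```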
 