SAR and Delta Sigma ADC

Thread Starter

Gibson486

Joined Jul 20, 2012
355
I am kind of confused when reading into these in more detail. I get the obvious differences (delta-sigma digitally averages... SAR is zero latency and faster...), but it is the examples I am having a hard time swallowing.

Every time I see a delta-sigma example, it is just a simple RC filter with the claim that you only use the bits you need. In other words, you do not really adjust your signal's range to fit the input of the ADC; you simply ignore the bits that are meaningless.

For a SAR, however, the examples and comparisons always do a lot of signal conditioning, part of which is scaling the input signal to the ADC's range.

My question is: why are these examples at both extremes? The comparisons make SARs seem so laborious, when in reality they are not. If I wanted to, couldn't I just signal-condition the input to a delta-sigma so it covers the whole range and I get the whole 16 bits that I paid for? Most of the examples that compare the two suggest that with a delta-sigma I can just move the MSB and LSB on the read end (i.e., in the micro-controller), but couldn't I do this with a SAR as well? At the end of the day, I paid for a 16-bit ADC, so I plan to use it as a 16-bit ADC. I get why SARs need more signal conditioning and buffering, but the examples make it seem like a delta-sigma needs nothing more than an RC filter and you are done.
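
To make what I mean concrete, here is a rough sketch of "moving the LSB on the read end" (the 24-bit result register and the numbers are made up for illustration, not from any specific converter):

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical example: a 24-bit delta-sigma result, right-justified in a
 * 32-bit word. If only the top 16 bits are meaningful for the application,
 * the noisy LSBs can simply be shifted away in the micro-controller instead
 * of rescaling the signal in the analog domain. */
static uint16_t take_top_16_bits(uint32_t raw24)
{
    return (uint16_t)(raw24 >> 8);   /* keep bits 23..8, drop bits 7..0 */
}

int main(void)
{
    uint32_t raw = 0x8765A3;   /* made-up 24-bit reading */
    printf("Truncated 16-bit value: 0x%04X\n", take_top_16_bits(raw));
    return 0;
}
```

Nothing stops me from doing the same truncation on a SAR result, which is exactly why the lopsided examples confuse me.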
 

RichardO

Joined May 4, 2013
2,270
One reason the SAR type of A/D needs bandwidth limiting and a sample-and-hold circuit -- what you are calling signal conditioning -- is that the input _must not_ change by even one LSB during the conversion, or the result can have a huge error.

A delta-sigma converter has a much higher intrinsic sample rate, so the input bandwidth limiting is easier to do. Also, the sample-and-hold is sort of built into the delta-sigma converter.
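
To put a rough number on that "must not change one LSB" point, here is a back-of-the-envelope sketch; the resolution and conversion time below are assumed values for illustration, not from any particular SAR:

```c
#include <stdio.h>

/* Back-of-the-envelope estimate with assumed numbers: without a
 * sample-and-hold, a full-scale sine must not move more than 1 LSB
 * during the conversion. The maximum slew of a full-scale sine is
 * pi * f * Vfs, and 1 LSB is Vfs / 2^N, so the limit is roughly
 *   f_max = 1 / (pi * 2^N * t_conv). */
int main(void)
{
    const double pi     = 3.141592653589793;
    const int    nbits  = 16;      /* assumed resolution */
    const double t_conv = 1e-6;    /* assumed 1 us conversion time */

    double f_max = 1.0 / (pi * (double)(1UL << nbits) * t_conv);
    printf("Max full-scale sine without a sample-and-hold: ~%.1f Hz\n", f_max);
    return 0;
}
```

With those assumed numbers the limit works out to only a few hertz, which is why the sample-and-hold is effectively mandatory for a SAR.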

I hope these quick comments help you. There are a lot of other differences, but maybe this will get you started and lead you to ask more questions.
 

crutschow

Joined Mar 14, 2008
34,428
Since the delta-sigma intrinsically averages (digitally filters) a high-rate stream of signal samples to get the digital output, the effects of noise are significantly reduced, and only simple anti-aliasing noise filtering at its input is required.
SAR devices have no intrinsic sample averaging, so they are more sensitive to any input noise outside the desired signal bandwidth and generally require higher-order filtering at their input to get the best noise and resolution performance.
But certainly you can amplify the input signal to a delta-sigma converter to get the best dynamic range, resolution, and signal-to-noise ratio at the output (although very high resolution delta-sigma converters of over 20 bits reduce this conditioning requirement if you only need a lower resolution, such as 16 bits, since you can just ignore the LSBs below that).
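
As a rough illustration of that trade-off (the voltage ranges below are assumed values, not tied to any particular part), the resolution you give up by skipping the input gain is about log2(full scale / signal span) bits:

```c
#include <stdio.h>
#include <math.h>

/* Rough illustration with assumed numbers: if the signal only spans part of
 * the converter's full-scale range and no front-end gain is added, the
 * resolution lost is about log2(full_scale / signal_span) bits. */
int main(void)
{
    const double full_scale  = 5.0;   /* assumed converter input range, V   */
    const double signal_span = 0.5;   /* assumed sensor output span, V      */
    const int    nbits       = 24;    /* e.g. a high-resolution delta-sigma */

    double bits_lost = log2(full_scale / signal_span);
    printf("Bits lost without gain: %.1f\n", bits_lost);
    printf("Effective resolution on the signal: ~%.1f bits\n", nbits - bits_lost);
    return 0;
}
```

With a 24-bit delta-sigma you can afford to lose those few bits and still have well over 16 bits on the signal; with a 16-bit SAR the same mismatch leaves you closer to 12 or 13 usable bits, which is why the SAR examples lean so heavily on the front-end gain stage.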
 