I get kind of confused the more detail I read into these two. I understand the obvious differences (a delta-sigma digitally averages, a SAR is zero-latency and faster), but it's the examples that I'm having a hard time swallowing.
Every time I see a delta-sigma example, it is just a simple RC filter with the claim that you only use the bits you need. In other words, you do not really adjust your signal to fit the ADC's input range; you simply ignore the bits that are meaningless.
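To make concrete what I mean by "ignoring the meaningless bits", here is a minimal sketch of the read-end arithmetic, assuming a hypothetical 24-bit delta-sigma converter of which only 16 bits are above the noise floor (both numbers are made up for illustration):

```c
#include <stdint.h>

#define ADC_BITS        24  /* raw conversion width (assumed)      */
#define NOISE_FREE_BITS 16  /* bits actually above the noise floor */

/* Discard the 8 noisy LSBs; the result behaves like a 16-bit ADC
 * whose range is still the converter's full scale. */
uint16_t read_effective(uint32_t raw24)
{
    return (uint16_t)(raw24 >> (ADC_BITS - NOISE_FREE_BITS));
}
```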
For a SAR, however, the examples and comparisons always show a lot of signal conditioning, part of which is to scale the input to the ADC's range.
My question is: why are these examples at both extremes? The comparisons make SARs seem laborious when, in reality, they are not. If I wanted to, couldn't I signal condition into a delta-sigma so the input fills the whole range and I get the whole 16 bits that I paid for? Most of the examples comparing the two suggest that with a delta-sigma I can just pick the MSB and LSB on the read end (i.e. in the microcontroller), but couldn't I do this with a SAR as well? At the end of the day, I paid for a 16-bit ADC, so I plan to use it as a 16-bit ADC. I get why SARs need more signal conditioning and buffering, but the examples make it seem like delta-sigmas need nothing more than an RC filter and you are done.
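Just to check my own understanding of why the SAR examples always scale the input: the read-end bit-picking works the same for a SAR, and what you lose by not scaling is resolution, since any unused portion of the range is wasted codes. A rough sketch with made-up voltages:

```c
#include <math.h>
#include <stdio.h>

/* If the signal only spans part of the converter's full-scale range,
 * the unused range is lost resolution, for either architecture.
 * All voltages here are assumptions for illustration. */
int main(void)
{
    double full_scale  = 2.5;  /* ADC reference (assumed)             */
    double signal_span = 0.5;  /* unconditioned input swing (assumed) */
    int    bits        = 16;

    /* Bits lost = log2(full_scale / signal_span) */
    double lost = log2(full_scale / signal_span);
    printf("Effective resolution: %.1f of %d bits\n", bits - lost, bits);
    /* Prints ~13.7 of 16 bits: gain the signal up 5x to recover them. */
    return 0;
}
```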