Signal-to-noise ratio interpretation in an ADC context

Hi everyone,
I would like to understand how to interpret SNR in the context of an ADC's input range.

I understand that it is important to match the ADC's dynamic range to the maximum signal amplitude for best accuracy.

I also understand that accuracy and dynamic range are the same thing, at least in this context.

We all know that Dynamic Range (dB) = SNR (dB) = 20*log10(RMS full scale / RMS noise).

According to that formula, SNR is obtained from the RMS full-scale value of the signal that drives the ADC.

Let's say we drive a 16-bit ADC with 1 Vpp, the full-scale range is 1.024 V (±512 mV), and the RMS noise is 20 µV.
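To make the arithmetic concrete, here is a minimal Python sketch of that formula with the numbers above (assuming a sine-wave input, so RMS = peak / sqrt(2); the values are just the example figures, not from any particular ADC datasheet):

```
import math

# Example values from above: 16-bit ADC, full scale 1.024 V (+/-512 mV),
# 20 uV RMS noise. These are assumptions for illustration only.
fs_peak = 0.512          # full-scale peak amplitude, volts
noise_rms = 20e-6        # RMS noise, volts

# RMS of a full-scale sine wave is peak / sqrt(2)
fs_rms = fs_peak / math.sqrt(2)

# Dynamic Range (dB) = SNR (dB) = 20*log10(RMS full scale / RMS noise)
snr_fs = 20 * math.log10(fs_rms / noise_rms)
print(f"SNR at full scale: {snr_fs:.1f} dB")        # ~85.2 dB

# If the input is only 0.5x or 0.25x of the full-scale RMS, the noise
# floor is unchanged, so the SNR of that signal drops ~6 dB per halving:
for fraction in (0.5, 0.25):
    snr = 20 * math.log10(fraction * fs_rms / noise_rms)
    print(f"SNR at {fraction}x FS: {snr:.1f} dB")   # ~79.1 dB, ~73.1 dB
```

If that reasoning is right, driving the ADC at a quarter of full scale costs about 12 dB of SNR for the signal, which is roughly 2 bits of effective resolution.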

1. What happens to the accuracy of the ADC output code when the driving signal is, say, 0.5 or 0.25 of the full-scale RMS? Will it be degraded?

2. If the statement "accuracy and dynamic range are the same thing" is true, what is really the accuracy of such an ADC?

3. Is this the reason you should use the voltmeter range that best matches the amplitude of the signal you intend to measure, in order to get maximum accuracy?

Thank you very much for your expert answers.