ADC and sampling process

Thread Starter

Dadu@

Joined Feb 4, 2022
155
I am trying to understand ADC and sampling process in embedded system.

In electronics, an analog-to-digital converter (ADC) is a circuit that converts an analog signal, such as the sound picked up by a microphone, into a digital signal.

How is the time interval at which samples are taken decided?
 

MrChips

Joined Oct 2, 2009
30,824
I am trying to understand ADC and sampling process in embedded system.

In electronics, an analog-to-digital converter (ADC) is a circuit that converts an analog signal, such as the sound picked up by a microphone, into a digital signal.

How is the time interval at which samples are taken decided?
I am not sure that I understand the question.

However, there is something known as the sampling or Nyquist theorem. It says that the sampling rate must be, at minimum, two times the highest input frequency. The corollary (the inverse application) is that there must be no input frequency higher than ½ the sampling frequency.

For example, if you want to digitize a 10kHz sine wave, your sampling frequency must be greater than 20ksps (>20kHz).
Moreover, you must apply a low-pass filter that attenuates everything above 10kHz (better to start lower, e.g. 8kHz, because no filter has an infinitely sharp cut-off). This filter is called the anti-aliasing filter.

This filter is important because, if your sampling frequency is 20kHz, a 12kHz signal will fold back and be received as 8kHz (that is why it is called an alias).
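The fold-back arithmetic can be sketched in a few lines of Python (the function name here is just for illustration):

```python
def alias_frequency(f_signal, f_sample):
    """Frequency at which f_signal appears after being sampled at f_sample.

    Folds the input frequency into the first Nyquist zone [0, f_sample/2].
    """
    f = f_signal % f_sample      # sampling cannot distinguish multiples of f_sample
    if f > f_sample / 2:
        f = f_sample - f         # fold back around the Nyquist frequency
    return f

# The example above: a 12 kHz signal sampled at 20 kHz shows up at 8 kHz.
print(alias_frequency(12_000, 20_000))  # 8000
```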

Note that any wave shape that is not a sine wave contains frequencies higher than the fundamental. For example, 10kHz square, triangular, and sawtooth waves all contain harmonics at frequencies much higher than 10kHz.
 

Thread Starter

Dadu@

Joined Feb 4, 2022
155
I am not sure that I understand the question.
I am trying to understand how analog signals like audio and temperature signals are converted to digital signals.

I am trying to understand these in terms of frequency, timing, sampling rate, and the microcontroller.
 

MrChips

Joined Oct 2, 2009
30,824
I am trying to understand how analog signals like audio and temperature signals are converted to digital signals.

I am trying to understand these in terms of frequency, timing, sampling rate, and the microcontroller.
Let's take one question at a time.
Are you asking how an ADC works?
How does an ADC convert analog to digital?
 

Ya’akov

Joined Jan 27, 2019
9,170
If an analog signal never changes, like a simple DC voltage that sits constantly at 5V within the precision you can measure, then it doesn't matter how often you sample: any sample will be an accurate representation of the signal, because it doesn't vary.

In real applications there is nothing like that. All the interesting signals vary in level. To decide how often you need to sample a signal you have to determine the maximum frequency of the changes to it. This maximum and the minimum sample rate to store an undistorted waveform are directly connected.

As @MrChips pointed out, this minimum sample rate is called the Nyquist rate (re-read his explanation for the details). It is twice the highest frequency present in your waveform. So a 200Hz sine wave would require at least 400 samples per second in order to store an undistorted version.
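You can see what happens when this rate is violated with a short Python sketch: sampled at 1000 sps, a 1200Hz sine produces exactly the same samples as a 200Hz sine (the particular frequencies here are just illustrative):

```python
import math

fs = 1_000                 # samples per second
n = range(8)               # a few sample instants

low  = [math.sin(2 * math.pi *  200 * k / fs) for k in n]
high = [math.sin(2 * math.pi * 1200 * k / fs) for k in n]

# The two sample sets are indistinguishable: 1200 Hz has aliased down to 200 Hz.
assert all(math.isclose(a, b, abs_tol=1e-9) for a, b in zip(low, high))
```

This is exactly why the anti-aliasing filter has to remove the 1200Hz component before the ADC ever sees it: once sampled, the information needed to tell the two apart is gone.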

The ADC itself has a certain number of bits of resolution, like 8, 12, or 16, and this determines the number of discrete levels that can be distinguished when sampling. The more bits, the finer the distinction that can be made. You can build an ADC in a variety of ways, but one of the simplest uses resistors: arrange them in a "ladder", which is just a string of progressively higher voltage dividers, with each tap feeding a comparator. The signal is sampled by reading the comparator outputs at the sample rate and recording the highest one that has tripped (this is the idea behind a flash ADC).
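A rough model of the resolution side, as a sketch (the 3.3V reference and the truncating behavior are assumptions; real parts differ in both):

```python
def quantize(voltage, vref=3.3, bits=12):
    """Map an input voltage to the nearest lower ADC code (ideal model)."""
    levels = 2 ** bits                    # number of discrete codes
    code = int(voltage / vref * levels)   # truncate toward zero
    return min(max(code, 0), levels - 1)  # clamp to the valid code range

# A 12-bit ADC with a 3.3 V reference distinguishes 4096 levels,
# so one step is about 0.8 mV; mid-scale input gives mid-scale code.
print(quantize(1.65))  # 2048
```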

Other schemes are more complex and you should read about them, there is plenty of literature.
 

Ian0

Joined Aug 7, 2020
9,846
Sample at more than twice the highest frequency you are interested in.
That's why audio (20Hz-20kHz) is sampled at 44.1kHz or 48kHz.
But - and this is important - you must filter out all the frequencies above half the sampling frequency.
 

DickCappels

Joined Aug 21, 2008
10,187
Whether the filter step matters depends on whether significant signal content is present above half the sampling frequency. For example, the filter is helpful when capturing voice, but not particularly helpful when measuring the voltage across a battery.
 

mckenney

Joined Nov 10, 2018
125
Just to keep things lively, the term "sample" is thrown around freely, and can refer to (at least) three things:
1) The time that the ADC is actually connected to the signal, filling a sampling capacitor [properly: "sample/hold time"]
2) The sample/hold time plus the time the ADC spends computing the result from the capacitor [properly: "conversion time"]
3) (1)+(2)+the idle time between conversions [properly: "sampling period" or "1/sampling rate"] which is what Nyquist refers to.
Sometimes you have to figure out which one is meant from context.
[These assume a Successive-Approximation (SAR) ADC; they may apply only partially to other ADC types.]
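A back-of-the-envelope sketch of how those three times relate (the ADC clock and cycle counts below are made-up, datasheet-style numbers; check your part's datasheet for real ones):

```python
# Hypothetical SAR ADC timing parameters:
adc_clock_hz = 14_000_000    # ADC clock
sample_hold_cycles = 1.5     # (1) cycles the sampling capacitor is connected
sar_cycles = 12.5            # cycles spent on the successive approximation

# (1) sample/hold time, (2) total conversion time, (3) max sampling rate
t_sh = sample_hold_cycles / adc_clock_hz
t_conv = (sample_hold_cycles + sar_cycles) / adc_clock_hz
max_rate = 1 / t_conv        # back-to-back conversions, zero idle time

print(f"{t_conv * 1e6:.2f} us per conversion, {max_rate:.0f} sps max")
```

With any idle time between conversions, the actual sampling period (3) grows and the sampling rate drops below this maximum, which is what matters for Nyquist.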
 