SmallRedMachine

Joined Feb 25, 2017
48
Hi all,

I'm very new to PIC microcontrollers (PIC18F452) and ADC/DAC. I'm supposed to apply ADC and a moving-average filter to an audio signal and then DAC it. I have managed to do this (without the digital filter) with a simple program and got lower-quality audio out of an R-2R DAC, but apparently we are supposed to use timer interrupts for the ADC, and I just cannot understand why the ADC should be done inside a timer interrupt. I have been researching for a few days and cannot find an answer. What I believe I know so far:

MCU clock: 20 MHz ==> Tosc = 0.05 us; 12 TAD are needed for a 10-bit conversion, and the minimum TAD time is 1.6 us
0.05 us * 32 = 1.6 us ==> 32 Tosc is selected
1.6 us * 12 = 19.2 us ==> Sampling frequency = 1/19.2 us ≈ 52 kHz

So I can have a timer interrupt with a set interval, but I don't understand what is supposed to happen within each interrupt. Does the whole 10-bit conversion happen in one interrupt? Why a timer interrupt; why not just make an ADC function and call it in the main body of the program? How should my timer interrupt interval relate to my sampling frequency and the TAD time? I would also appreciate any book or source suggestions I could get some hints from. I'm really frustrated and anxious because I have wasted so many hours trying to find some info on this and it just doesn't seem to exist anywhere... Thanks.

nsaspook

Joined Aug 27, 2009
7,486
Your sampling period can't be shorter than the total ADC cycle time for one ADC acquisition (plus the software overhead time to handle the data from the ADC on each interrupt) on this chip.

Generally the timer rate of interrupts sets the ADC sample rate. In the timer interrupt you might start the ADC conversion process and then exit the ISR. The ADC handler would have another ISR function that would trigger when the conversion is complete and then exit after its function is complete. This ISR might move the data from the ADC buffer to a main program buffer or process the conversion result before returning.

SmallRedMachine

Joined Feb 25, 2017
48
Your sampling period can't be shorter than the total ADC cycle time for one ADC acquisition (plus the software overhead time to handle the data from the ADC on each interrupt) on this chip.

Generally the timer rate of interrupts sets the ADC sample rate. In the timer interrupt you might start the ADC conversion process and then exit the ISR. The ADC handler would have another ISR function that would trigger when the conversion is complete and then exit after its function is complete. This ISR might move the data from the ADC buffer to a main program buffer or process the conversion result before returning.
If one ADC cycle is 19.2 us according to the calculations I did, then is it wrong to say the sampling rate is 1/19.2 us = 52 kHz? And is one ADC acquisition the same as the time taken for 1 bit to be converted?
The A/D Conversion Clock Select bits in the PIC are really confusing me now; aren't they the setting that chooses the sample rate?

nsaspook

Joined Aug 27, 2009
7,486
If one ADC cycle is 19.2 us according to the calculations I did, then is it wrong to say the sampling rate is 1/19.2 us = 52 kHz? And is one ADC acquisition the same as the time taken for 1 bit to be converted?
The A/D Conversion Clock Select bits in the PIC are really confusing me now; aren't they the setting that chooses the sample rate?
52 kHz is a theoretical sampling rate assuming perfect computing (infinitely fast processor speed), but here we have a lowly PIC18, where the best sampling rate might be half the theoretical maximum or less. One ADC acquisition is the time for a complete x-bit (10 bits here) conversion.

SmallRedMachine

Joined Feb 25, 2017
48
52 kHz is a theoretical sampling rate assuming perfect computing (infinitely fast processor speed), but here we have a lowly PIC18, where the best sampling rate might be half the theoretical maximum or less. One ADC acquisition is the time for a complete x-bit (10 bits here) conversion.

Alright, so we already have this non-ideal, let's say 20 kHz, sampling rate, but we use the timer interrupt rate to set a sample rate of, for example, 60 kHz. So what happens to our 20 kHz sample rate here? Is it overridden by the new one?
So if I am understanding the process properly: the ADC starts in the timer interrupt, the program executes the rest of the code, and by the time the timer overflows the ADC ends and a new ADC cycle starts again on the timer interrupt, at roughly 16 us intervals for a 60 kHz sample rate.
Thanks for responding.

nsaspook

Joined Aug 27, 2009
7,486
Alright, so we already have this non-ideal, let's say 20 kHz, sampling rate, but we use the timer interrupt rate to set a sample rate of, for example, 60 kHz. So what happens to our 20 kHz sample rate here? Is it overridden by the new one?
So if I am understanding the process properly: the ADC starts in the timer interrupt, the program executes the rest of the code, and by the time the timer overflows the ADC ends and a new ADC cycle starts again on the timer interrupt, at roughly 16 us intervals for a 60 kHz sample rate.
Thanks for responding.
You answer that question by thinking logically about how a sequence of timed events must happen.

SmallRedMachine

Joined Feb 25, 2017
48
You answer that question by thinking logically about how a sequence of timed events must happen.
I can't think through what would logically happen with two different sampling rates; I believe the ADC completion flag would be triggered twice.
Was my understanding of the ADC process in the program remotely close to what actually happens, or do I have to rethink all of that?

nsaspook

Joined Aug 27, 2009
7,486
I can't think through what would logically happen with two different sampling rates; I believe the ADC completion flag would be triggered twice.
Was my understanding of the ADC process in the program remotely close to what actually happens, or do I have to rethink all of that?
You only have one sampling rate, plus the conversion time it takes to complete one sample (this is not a rate, but it limits the sample rate), plus usually the time to at least read the ADC sample into memory.

SmallRedMachine

Joined Feb 25, 2017
48
You only have one sampling rate, plus the conversion time it takes to complete one sample (this is not a rate, but it limits the sample rate), plus usually the time to at least read the ADC sample into memory.
OK, so the conversion time in my case is 19.2 us, and my ideal sampling rate of 52 kHz would have a period of 19.2 us, but if in reality it's 20 kHz then we have a problem. The timer interrupt is not really a second sample rate, but since the start-conversion code is inside it (here is the part I'm still thinking about) it sort of adjusts the sample rate. I hope I am going in the right direction with this.

nsaspook

Joined Aug 27, 2009
7,486
The timer interrupt sets the sample rate, but the hardware timer rate must be compatible with the time it takes to complete a conversion, so that there is good ADC sample data for each timer interrupt that starts the next conversion.

Here the timer-overflow ISR starts the conversion and sets the rate at which ADC data is buffered in the ADC ISR and then sent to an SD card by a process running in the main loop.

Joined Mar 10, 2018
4,057

1) Generally speaking, one wants the sample rate to have no jitter, which ISR-driven invocation of the ADC start, unfortunately, creates; interrupts are, generally speaking, non-deterministic. But many applications do not care about deterministic sampling from sample to sample, like reading the voltage of a battery or a pot that controls LED brightness; small variations in "periodic" sampling are of no concern in those examples. There are positive and negative effects to both periodic and aperiodic sampling; the net is full of discussions on this topic.

There are processor families now that can do all of this in hardware, e.g. run the A/D continuously and use DMA to update a variable in memory. There are even ones with a DSP and DAC that can filter a sample stream and output it back to the DAC, all under hardware control, with no code intervention.

2) A good practice with ISRs is NOT to call other functions. The reason for this is the amount of stack pushing that occurs: additional time is spent in the ISR and throughput is reduced. The preferred method is to set a global flag and exit. Also use pointers to any variable used that is not local; that saves time too and makes ISR response faster. You can evaluate this by looking at your ASM listing for both approaches to see what effect they have. Very instructive.

Regards, Dana.


SmallRedMachine

Joined Feb 25, 2017
48

1) Generally speaking, one wants the sample rate to have no jitter, which ISR-driven invocation of the ADC start, unfortunately, creates; interrupts are, generally speaking, non-deterministic. But many applications do not care about deterministic sampling from sample to sample, like reading the voltage of a battery or a pot that controls LED brightness; small variations in "periodic" sampling are of no concern in those examples. There are positive and negative effects to both periodic and aperiodic sampling; the net is full of discussions on this topic.

There are processor families now that can do all of this in hardware, e.g. run the A/D continuously and use DMA to update a variable in memory. There are even ones with a DSP and DAC that can filter a sample stream and output it back to the DAC, all under hardware control, with no code intervention.

2) A good practice with ISRs is NOT to call other functions. The reason for this is the amount of stack pushing that occurs: additional time is spent in the ISR and throughput is reduced. The preferred method is to set a global flag and exit. Also use pointers to any variable used that is not local; that saves time too and makes ISR response faster. You can evaluate this by looking at your ASM listing for both approaches to see what effect they have. Very instructive.

Regards, Dana.
Thanks for the info. I'm still unclear about how the sampling frequency is set by the timer interrupt. The datasheet says the acquisition time is 12.86 us; then, when GO/~DONE is set, it takes 19.2 us (1.6 us minimum TAD with a 32 Tosc conversion clock) for the conversion to complete, so the total time taken is 32 us. If I set GO/~DONE in the ISR and set the timer overflow to happen every 32 us, then 10 bits are converted every 32 us, which is a frequency of about 30 kHz. Now, just as an example, if I want the sampling to happen at 1 kHz, I would need to set the timer trigger to happen every 1 ms, so that the sampling frequency would be 1 kHz...

Joined Mar 10, 2018
4,057
The ADC uses a S/H in the front end to capture the sample to be converted; the converter then operates on that sample. So the ISR that starts the conversion basically triggers the S/H into hold mode. The fact that the converter takes numerous clocks to do the actual conversion is irrelevant to the sampling rate. Note, I have not looked at this specific architecture: if you retrigger the start while the ADC is doing a conversion, does this terminate the current conversion and restart the ADC all over again? Something to investigate. Or does the converter ignore the request and complete the conversion it is doing? Food for thought.

http://www.tcnj.edu/~hernande/ELC343/Chapter_12.pdf

Regards, Dana.

mckenney

Joined Nov 10, 2018
86
Now, just as an example, if I want the sampling to happen at 1 kHz, I would need to set the timer trigger to happen every 1 ms, so that the sampling frequency would be 1 kHz...
Yes. Every 1 ms you would start a new conversion, the ADC would run for (based on your numbers) 32 us, and then sit idle for (1 ms - 32 us) until the next timer event. You get 1 sample every 1 ms, for a 1 kHz sample rate. The ADC conversion speed hasn't gotten in your way.

If you increase the timer rate to 2kHz (0.5ms period), you would start the new conversion, it would run for 32us, then sit idle for (0.5ms-32us). The idle time is smaller, but still pretty big.

Continuing to increase the timer rate, you reach a point (somewhere around 1/32us=~30kHz) where the idle time goes negative -- at the moment of the timer event, the ADC hasn't finished. If you start a new conversion then, one of a few possible things might happen (check The Book), but I predict you won't like any of them.

That's the sense in which the timer sets the rate, but the ADC limits the maximum rate. In Real Life, there's also the time it takes you to (a) get to the ISR (b) capture the previous result (c) start the next conversion, during which the ADC is idle, and so effectively adds to the (32us) conversion time. So you won't even get 30kHz, but something a little lower.

Dana made a correct observation that there are applications where you don't care much about fixed-rate sampling. I would just point out that for audio you care very much, so it's worth allowing some padding (that "idle" time above), even at the cost of a lower sample rate, to make sure you never miss.

SmallRedMachine

Joined Feb 25, 2017
48
The ADC uses a S/H in the front end to capture the sample to be converted; the converter then operates on that sample. So the ISR that starts the conversion basically triggers the S/H into hold mode. The fact that the converter takes numerous clocks to do the actual conversion is irrelevant to the sampling rate. Note, I have not looked at this specific architecture: if you retrigger the start while the ADC is doing a conversion, does this terminate the current conversion and restart the ADC all over again? Something to investigate. Or does the converter ignore the request and complete the conversion it is doing? Food for thought.

http://www.tcnj.edu/~hernande/ELC343/Chapter_12.pdf


Regards, Dana.
No, it will not terminate the conversion: the conversion must end first, and that's when the GO/~DONE bit is cleared by hardware.
I have done more thinking and research, and I think I should take a few steps back and think about what's really happening here on a very basic level, as I must be misunderstanding something.
The microcontroller does a 10-bit ADC conversion, which is 1024 discrete levels, and for each level there is a need for one sample and hold. For the capacitor to be fully charged we need to allow approximately 13 us, which is called the acquisition time; then it takes a minimum of 1.6 us (depending on the conversion clock setting) for conversion. So, in my understanding, sample and hold happens 1024 times for a 10-bit ADC.

I would appreciate brief feedback on what I just explained, as I need to know if I should rethink this again. Thanks.

nsaspook

Joined Aug 27, 2009
7,486
No, it will not terminate the conversion: the conversion must end first, and that's when the GO/~DONE bit is cleared by hardware.
I have done more thinking and research, and I think I should take a few steps back and think about what's really happening here on a very basic level, as I must be misunderstanding something.
The microcontroller does a 10-bit ADC conversion, which is 1024 discrete levels, and for each level there is a need for one sample and hold. For the capacitor to be fully charged we need to allow approximately 13 us, which is called the acquisition time; then it takes a minimum of 1.6 us (depending on the conversion clock setting) for conversion. So, in my understanding, sample and hold happens 1024 times for a 10-bit ADC.

I would appreciate brief feedback on what I just explained, as I need to know if I should rethink this again. Thanks.
NO, your understanding of how the conversion happens is incorrect. The ADC is a successive-approximation A/D converter.
http://www.tcnj.edu/~hernande/ELC343/Chapter_12.pdf

SmallRedMachine

Joined Feb 25, 2017
48
NO, your understanding of how the conversion happens is incorrect. The ADC is a successive-approximation A/D converter.
http://www.tcnj.edu/~hernande/ELC343/Chapter_12.pdf

I didn't mean to imply that the SAR doesn't do the conversion; I forgot that 1.6 us is the conversion time per bit (TAD), not per level, according to the PIC18F datasheet. Now, about the sampling part: if 1024 samples were required for 10 bits, that would mean 1024 * 13 us each time the capacitor needs to be charged, but that seems too slow, so I am most likely wrong again about it all.

Thanks everyone for the help. I definitely know more now, but I really need to find a very basic explanation in some book that I haven't found yet; every book I have looked at so far has explained it too generally.

Edit: 1024 levels affect the resolution and have nothing to do with the number of samples; I was definitely talking nonsense...


SmallRedMachine

Joined Feb 25, 2017
48
Yes. Every 1 ms you would start a new conversion, the ADC would run for (based on your numbers) 32 us, and then sit idle for (1 ms - 32 us) until the next timer event. You get 1 sample every 1 ms, for a 1 kHz sample rate. The ADC conversion speed hasn't gotten in your way.

If you increase the timer rate to 2kHz (0.5ms period), you would start the new conversion, it would run for 32us, then sit idle for (0.5ms-32us). The idle time is smaller, but still pretty big.

Continuing to increase the timer rate, you reach a point (somewhere around 1/32us=~30kHz) where the idle time goes negative -- at the moment of the timer event, the ADC hasn't finished. If you start a new conversion then, one of a few possible things might happen (check The Book), but I predict you won't like any of them.

That's the sense in which the timer sets the rate, but the ADC limits the maximum rate. In Real Life, there's also the time it takes you to (a) get to the ISR (b) capture the previous result (c) start the next conversion, during which the ADC is idle, and so effectively adds to the (32us) conversion time. So you won't even get 30kHz, but something a little lower.

Dana made a correct observation that there are applications where you don't care much about fixed-rate sampling. I would just point out that for audio you care very much, so it's worth allowing some padding (that "idle" time above), even at the cost of a lower sample rate, to make sure you never miss.
Thanks a lot, this has really cleared up the relationship between the timer interrupt, the conversion time, and the limitations involved for me. Can I just ask: is it correct to interpret the figure attached to this post as saying the sampling time is 4 TAD? Because 12 TAD is the conversion time, and that is clearly the case in the figure as well:

Attachments


mckenney

Joined Nov 10, 2018
86
Can I just ask: is it correct to interpret the figure attached to this post as saying the sampling time is 4 TAD? Because 12 TAD is the conversion time, and that is clearly the case in the figure as well:
I rather prefer Figures 19-4 and 19-5 from the data sheet (39626e), which show the (settable) acquisition period, the hold event, and the 12 (not 8) SAR clocks, without getting too bogged down in the terminology.