Hi all,
I'm very new to PIC microcontrollers (PIC18F452) and ADC/DAC. My assignment is to apply an ADC, a moving-average filter, and a DAC to an audio signal. I have managed the ADC-to-DAC path (without the digital filter) with a simple polling loop and got lower-quality audio out of an R-2R DAC, but apparently we are supposed to trigger the ADC from a timer interrupt, and I cannot understand why the ADC should be done inside a timer interrupt. I have been researching for a few days and just cannot find an answer. What I believe I know so far:
MCU clock: 20 MHz ==> Tosc = 0.05 us; 12 TAD are needed for a 10-bit conversion, and the minimum TAD time is 1.6 us
0.05 us * 32 = 1.6 us ==> TAD = 32 Tosc is selected
1.6 us * 12 = 19.2 us ==> maximum sampling rate = 1/19.2 us ≈ 52 kHz
So I can set up a timer interrupt with a fixed interval, but I don't understand what is supposed to happen within each interrupt. Does the whole 10-bit conversion happen in one interrupt? Why a timer interrupt at all; why not just write an ADC function and call it in the main body of the program? And how should my timer interrupt interval relate to my sampling frequency and TAD time? I would also appreciate any book or source suggestions I could get some hints from. I'm really frustrated and anxious because I have spent so many hours trying to find information on this and it just doesn't seem to exist anywhere... Thanks.
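In case it helps to show where I'm headed, this is roughly how I plan to implement the moving-average part once the interrupt side makes sense: keep a circular buffer of the last N samples and a running sum, so each new sample costs one subtract and one add instead of re-summing the window. This is just a sketch in plain, host-testable C; the window size `MA_WINDOW`, the struct, and the function names are my own placeholders, not from any PIC library:

```c
#include <stdint.h>

#define MA_WINDOW 8u  /* placeholder window size; a power of two keeps the divide cheap */

typedef struct {
    uint16_t buf[MA_WINDOW]; /* last MA_WINDOW raw 10-bit ADC samples */
    uint32_t sum;            /* running sum of the samples in buf */
    uint8_t  idx;            /* index of the oldest sample (next slot to overwrite) */
} ma_filter_t;

static void ma_init(ma_filter_t *f) {
    for (uint8_t i = 0; i < MA_WINDOW; i++) f->buf[i] = 0;
    f->sum = 0;
    f->idx = 0;
}

/* Push one new sample and return the filtered (averaged) output.
   The idea is to call this once per sample, e.g. right after reading
   the ADC result registers, then send the return value to the DAC. */
static uint16_t ma_update(ma_filter_t *f, uint16_t sample) {
    f->sum -= f->buf[f->idx];              /* drop the oldest sample from the sum */
    f->sum += sample;                      /* add the newest one */
    f->buf[f->idx] = sample;               /* overwrite the oldest slot */
    f->idx = (uint8_t)((f->idx + 1u) % MA_WINDOW);
    return (uint16_t)(f->sum / MA_WINDOW); /* average over the window */
}
```

Since the buffer starts zero-filled, the output ramps up over the first MA_WINDOW samples before it tracks the input properly, which I assume is acceptable for audio.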