ADC + DAC Software Design

Thread Starter

ActivePower

Joined Mar 15, 2012
155
In an ongoing project, I need to write code that processes incoming audio data at a sampling frequency of 8 kHz, adding a delayed copy of the signal for some simple audio effects using a circular buffer.

As suggested in a previous thread, I was able to configure the ADC on my processor (LPC2148 - Datasheet) to trigger at the sampling interval of 125 us using the timer match start mode built into the ADC. Since I was just testing out the peripheral functions at first, I used a simple polling routine. I also configured and used the Timer interrupts for an LED-toggling program. Now I need to put all the parts together and get them to work. What is the best way to structure my code for this?

My current code design is somewhat like this:

Rich (BB code):
Main:
    Configure ADC, DAC, and Timer interrupt at sampling interval = 125 us
    Configure circular buffer

ISR:
    Read the ADC value
    Write it into the buffer
    Generate the DAC value (ADC_val + previous Buffer_val)

The buffer is 1024 elements long, which gives a maximum delay of 128 ms at 8 kHz. Is there any alternative/better way to write the same code?
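For what it's worth, here is one minimal sketch in C of what that ISR body could look like. The names (`process_sample`, `delay_buf`, `DELAY`) are made up for illustration, and the ADC/DAC register access is omitted; since the buffer length is a power of two, the wrap can be done with a mask instead of a compare:

```c
#include <stdint.h>

#define BUF_SIZE 1024u   /* power of two: 1024 samples = 128 ms at 8 kHz */
#define DELAY    800u    /* delay in samples (100 ms here); tunable */

static uint16_t delay_buf[BUF_SIZE];
static uint32_t wr = 0;  /* write index */

/* Called once per sample (e.g. from the 125 us interrupt).
   Takes the fresh ADC sample, returns the value for the DAC. */
uint16_t process_sample(uint16_t adc_val)
{
    /* Read the sample stored DELAY samples ago; unsigned arithmetic
       plus the power-of-two mask makes the wraparound correct even
       when wr < DELAY. */
    uint16_t delayed = delay_buf[(wr - DELAY) & (BUF_SIZE - 1)];

    /* Store the new sample and advance, wrapping with the mask. */
    delay_buf[wr] = adc_val;
    wr = (wr + 1) & (BUF_SIZE - 1);

    /* Dry + delayed, each halved so the sum stays in the 10-bit range. */
    return (uint16_t)((adc_val >> 1) + (delayed >> 1));
}
```

Halving both terms before summing is one simple way to avoid overflowing the DAC's input range; scaling the delayed term down further gives a quieter echo.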

Thanks.
 

ErnieM

Joined Apr 24, 2011
8,377
Is there any alternative/better way to write the same code?
Looks like you have the basics covered. Generally you do the very least inside the ISR, but that's not a hard-and-fast rule, especially when you only have one task to do. One app I wrote did almost everything inside the ISR; the main loop just waited for the ISR to re-trigger.

I would access the buffer through pointers... so when you "Write it into a buffer" and then "Generate the DAC value (ADC_val + previous Buffer_val)", both would be done via a pointer. By changing the offset between the two pointers, you change the delay.

There should also be a routine that takes care of incrementing the pointers, wrapping from 1023 back to zero, and keeping the offset between them inside the same range.
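A rough sketch of that two-index scheme (indices rather than raw pointers, which is the same idea and a bit easier to wrap safely; all names here are hypothetical). The read index trails the write index by the delay, and both advance together each sample:

```c
#include <stdint.h>

#define BUF_SIZE 1024u

static uint16_t buf[BUF_SIZE];
static uint32_t wr_idx = 0;  /* where the newest sample goes */
static uint32_t rd_idx = 0;  /* trails wr_idx by the delay */

/* Wrap an index into [0, BUF_SIZE). With a power-of-two size
   the compiler reduces this % to a single AND. */
static uint32_t wrap(uint32_t i)
{
    return i % BUF_SIZE;
}

/* Change the delay at run time: place the read index
   'delay' samples behind the write index, modulo the size. */
void set_delay(uint32_t delay)
{
    rd_idx = wrap(wr_idx + BUF_SIZE - (delay % BUF_SIZE));
}

/* Per-sample update: returns the delayed sample, stores the new
   one, and advances both indices so the offset stays constant. */
uint16_t tick(uint16_t adc_val)
{
    uint16_t delayed = buf[rd_idx];
    buf[wr_idx] = adc_val;
    wr_idx = wrap(wr_idx + 1);
    rd_idx = wrap(rd_idx + 1);
    return delayed;
}
```

Keeping the wrap in one helper means there is exactly one place where the 1023-to-0 rollover can go wrong, which makes it easy to verify.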
 