# Basics of a Real-Time Spectrum Analyzer

#### Kittu20

Joined Oct 12, 2022
482
I am looking for help understanding the basics of a real-time spectrum analyzer built around a microcontroller.

A real-time analyzer is a system that displays the frequency spectrum of an audio signal.

This figure gives me a good start, but I still have some doubts that are not clear.

I think the microcontroller takes the audio input and converts it to digital samples. Each sample is stored in a buffer memory. The frequency content of the audio signal can then be calculated using a Fast Fourier Transform, and the system displays the frequency spectrum.

We have three tasks to perform in the system: ADC sampling, the FFT operation, and the LCD update.

I am trying to understand how the microcontroller performs these three tasks in the system.

I am not sure if I am understanding the correct concept.

Does the microcontroller take one sample, calculate its frequency, display it on the screen, and then repeat the whole process continuously from the start?

#### WBahn

Joined Mar 31, 2012
30,234
Let's say that the microcontroller takes a sample and it comes out to be 1482. What's the frequency of that sample?

It's a meaningless question.

The system has to collect enough data points on which to calculate an FFT. How many data points depends on the desired resolution of the spectral plot. It then plots the result. At that point it then takes another batch of samples.

There are a number of ways to do this based on how fast you want the plot to update versus the resolution.

#### Kittu20

Joined Oct 12, 2022
482
The system has to collect enough data points on which to calculate an FFT. How many data points depends on the desired resolution of the spectral plot. It then plots the result. At that point it then takes another batch of samples.

What does "enough data points" mean? How many data points should be collected for the FFT operation?


#### MrChips

Joined Oct 2, 2009
30,925
You are presenting random applications without studying the underlying fundamentals.
You asked about bare metal MCU vs OS vs RTOS without properly defining the meaning of "real time".

For FFT and spectral analysis you need to learn the significance of sampling rate and size of sample record, i.e. number of data points. 10-bit ADC resolution determines the signal amplitude range and resolution, not the frequency spectrum range and resolution. For FFT, the spectral range is one-half the sampling frequency. The spectral resolution is the reciprocal of the duration of the total sample record.

"Real time" can mean many different things. For a visual display on a computer screen, real time might be as short as a 40ms response. It might also be even longer than 5 seconds depending on the nature of the information to be displayed. For example, a traffic web cam might show a real time screen update every 5 seconds.

I have a spectral analysis application that needs to measure signals higher than 10MHz. My sampling frequency is 50MHz. My record length is 16k sample points. This translates to about 328μs. The application has to run in "real time" meaning that no event must be missed. This means that the ADC runs continuously without missing a single data point, one point every 20ns. The data must be fully processed. This means that all data processes must be completed in under 328μs. The application displays the waveform and the spectrum on a computer screen in "real time".

Thus "real time" in this application means that data must be captured once every 20ns. Processing of 16k data points must be completed in under 300μs. These processes are running in the "background". In the "foreground" the MCU has to respond to USB communications.

I am running a "bare bones" MCU with no additional accelerators such as an FPGA.

I am not using RTOS.
Why?
Because RTOS would be too slow for this application.

#### Kittu20

Joined Oct 12, 2022
482
You are presenting random applications without studying the underlying fundamentals.
You asked about bare metal MCU vs OS vs RTOS without properly defining the meaning of "real time".
This thread is not a continuation of the previous thread. In the previous thread I understood that I don't need an RTOS for my application; I just need to learn the fundamentals of an operating system.

I have a spectral analysis application that needs to measure signals higher than 10MHz. My sampling frequency is 50MHz. My record length is 16k sample points. This translates to about 328μs. The application has to run in "real time" meaning that no event must be missed. This means that the ADC runs continuously without missing a single data point, one point every 20ns. The data must be fully processed. This means that all data processes must be completed in under 328μs. The application displays the waveform and the spectrum on a computer screen in "real time".
I will try to understand your design specifications and timings.

#### Kittu20

Joined Oct 12, 2022
482
Thus "real time" in this application means that data must be captured once every 20ns. Processing of 16k data points must be completed in under 300μs. These processes are running in the "background". In the "foreground" the MCU has to respond to USB communications.
The application takes a sample once every 20 ns.

If the processor takes longer than 20 ns to sample the ADC, the resulting display will show the wrong spectrum.

Processing of 16k data points must be completed in under 300μs

The processor takes a sample every 20 ns, so wouldn't that delay the FFT operation?

Will this delay have any impact on system performance?

You didn't say at what interval the screen should be updated in your application.

#### MrChips

Joined Oct 2, 2009
30,925
The application takes a sample once every 20 ns.

If the processor takes longer than 20 ns to sample the ADC, the resulting display will show the wrong spectrum.

Processing of 16k data points must be completed in under 300μs

The processor takes a sample every 20 ns, so wouldn't that delay the FFT operation?

Will this delay have any impact on system performance?
You cannot allow that to happen. Any deviation from the sampling period (called jitter) will be translated into noise.
Jitter has to be in the order of picoseconds if sampling at 50MHz.

You need to learn about concurrent processing, i.e. two or more processes that can occur at the same time.

You didn't say at what interval the screen should be updated in your application.
This is not as critical as the other two time constraints of 20ns and 300μs.
Human eye persistence is longer than 20ms. Thus for a spectral display on a computer screen, even 100ms would be adequate.

#### Kittu20

Joined Oct 12, 2022
482
You cannot allow that to happen. Any deviation from the sampling period (called jitter) will be translated into noise.
Jitter has to be in the order of picoseconds if sampling at 50MHz.
For this we have to set a timer interrupt, e.g. every 20 ns.

You need to learn about concurrent processing, i.e. two or more processes that can occur at the same time.
In a single-core CPU, if two processes occur at the same time, the processor will execute the higher-priority task first.

The data processing task is a lower-priority task, but it cannot be delayed by more than 300 μs.


#### Kittu20

Joined Oct 12, 2022
482
Processing of 16k data points must be completed in under 300μs.

Approximately how long does your processor take to process one data point?

#### BobTPH

Joined Jun 5, 2013
9,120
For this we have to set a timer interrupt, e.g. every 20 ns.
On a processor with a 1GHz instruction rate, that would allow only 20 instruction times for the interrupt processing, including the hardware overhead for activating and returning from the handler.

I don’t think it could be done that way, at least not on any processor I am familiar with.

Most likely, DMA would need to be used, where the ADC writes directly to RAM, requiring no processor time.

#### Kittu20

Joined Oct 12, 2022
482
On a processor with a 1GHz instruction rate, that would allow only 20 instruction times for the interrupt processing, including the hardware overhead for activating and returning from the handler.

I don’t think it could be done that way, at least not on any processor I am familiar with.

Most likely, DMA would need to be used, where the ADC writes directly to RAM, requiring no processor time.
I am trying to understand how these two tasks are executed. Both are time-critical tasks, but they cannot be performed at the same time.

#### BobTPH

Joined Jun 5, 2013
9,120
If interrupts are used, yes, it will. That is how real-time systems must operate.

#### Kittu20

Joined Oct 12, 2022
482
If interrupts are used, yes, it will. That is how real-time systems must operate.
ADC sampling and data processing are time-critical tasks, and the processor is busy completing them. I do not understand when the processor has time for the LCD update task.

The LCD should be updated every 20 ms, but if the processor doesn't have time for this task, can it suspend the LCD task and update the LCD when it does have time? Then the LCD would update every 100 ms instead of every 20 ms, even though that doesn't really matter.

#### BobTPH

Joined Jun 5, 2013
9,120
Why didn't you choose a processor capable of performing all the tasks?

#### nsaspook

Joined Aug 27, 2009
13,414
ADC sampling and data processing are time-critical tasks, and the processor is busy completing them. I do not understand when the processor has time for the LCD update task.

The LCD should be updated every 20 ms, but if the processor doesn't have time for this task, can it suspend the LCD task and update the LCD when it does have time? Then the LCD would update every 100 ms instead of every 20 ms, even though that doesn't really matter.
Many controller applications have fast acquisition and memory-storage tasks that need precise timing for those events but less critical timing for the signal processing. For this the DMA module is used. The DMA module is also useful for reducing CPU load on data transfers to displays like an LCD, where you transfer an image work buffer to the display. So you don't suspend tasks; you assign the transfers to the DMA controller and free the processor for actual compute tasks.

As others have said, you need to pick a processor that can handle high-speed math and data transfers, with things like hardware floating point and multitasking DMA on a high-speed memory bus architecture.

#### Kittu20

Joined Oct 12, 2022
482
Why didn't you choose a processor capable of performing all the tasks?
So far I have not been able to decide which processor will be suitable for this application and which will not.

I found from an internet search that ARM processors are used for such applications. The 8051 is not suitable because its speed is low and it has too little memory. I'm reading about this in more detail.

#### nsaspook

Joined Aug 27, 2009
13,414
Even within the ARM architecture there are significant differences that should be inspected before making a choice. Processors specified for motor control usually have the capabilities needed for high-speed ADC processing, DSP, and the I/O requirements of critical real-time applications.

#### Kittu20

Joined Oct 12, 2022
482
Even within the ARM architecture there are significant differences that should be inspected before making a choice. Processors specified for motor control usually have the capabilities needed for high-speed ADC processing, DSP, and the I/O requirements of critical real-time applications.
The requirement is that an ADC sample must be taken every 20 nanoseconds. I'm trying to figure out how long it would take the 8051 or an 8-bit PIC to collect one sample. Does it take more than 20 nanoseconds, and is that the reason it is not suitable for this application?


#### nsaspook

Joined Aug 27, 2009
13,414
The requirement is that an ADC sample must be taken every 20 nanoseconds. I'm trying to figure out how long it would take the 8051 or an 8-bit PIC to collect one sample. Does it take more than 20 nanoseconds, and is that the reason it is not suitable for this application?
A somewhat advanced 8-bit PIC like the PIC18 Q43 has a DMA memory cycle of about 100 ns and an onboard 140 ksps 12-bit ADC, and its buffer memory is limited to maybe 8192 bytes. So no, it's not suitable for that application.

#### nsaspook

Joined Aug 27, 2009
13,414
Here is what you can do with a capable 32-bit chip with DMA.

LCD driver update time.

C:
#if DISPLAY_TYPE == 240
#define    cbOledDispMax        3840        //max number of bytes in display buffer
#define    cbOledDispMax32        cbOledDispMax/4    //max number of 32-bit words in display buffer
#define    ccolOledMax        240        //number of display columns
#define    crowOledMax        128        //number of display rows
#define    cpagOledMax        16        //number of display memory pages
#define STR_BUF_SIZE        160        //number of chars for display strings
#endif
...
/* This array is the offscreen frame buffer used for rendering.
** It isn't possible to read back from the OLED display device,
** so display data is rendered into this offscreen buffer and then
** copied to the display.
*  must be in uncached memory for pic32 DMA so use __attribute__((coherent))
* DMA0 SPI TX transfers DATA, CMD
* DMA1 GLCD buffer transfers
* DMA2 SPI TX transfers CMD, NOT USED
*/
#ifdef __32MK0512MCJ048__    // NO bank 2 for this CPU so memory is in bank 1
uint8_t __attribute__((address(BANK1), coherent)) rgbOledBmp0[cbOledDispMax]; // two display buffers for page flipping
uint8_t __attribute__((address(BANK1 + cbOledDispMax), coherent)) rgbOledBmp1[cbOledDispMax];
#ifdef USE_DMA
static uint8_t __attribute__((address(BANK1 - 8), coherent)) rgbOledBmp_blank[4] = {0x00, 0x00, 0x00, 0x00}; // 32-bit frame-buffer clearing variable
#endif
volatile uint8_t __attribute__((address(BANK1 - 16), coherent)) rgbOledBmp_page[5];
#endif

PIC32MK sensor board.

That update takes 2.33 ms using 15 MHz SPI in the background while the CPU is doing other things, like computing sensor data (read via SPI using DMA) with an FFT routine for spectrum analysis, vibration detection, and transmission via CAN bus and Ethernet.