ADC Spectrum

Thread Starter: hanifkhan5292
Suppose I want to analyze the spectrum of my ADC output. I choose:
Clock frequency: 100 MHz
Sampling frequency: 10 MHz
Input frequency: fin = (M/N) × 10 MHz, where M is any prime number and N is 1024, 2048, or 4096 (e.g., M/N = 79/1024)
In this case, what should be my minimum simulation time, start time, and end time?
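My understanding of the arithmetic (please correct me if I have it wrong) is that coherent sampling requires fin = (M/N) × fs with M and N coprime, and that capturing one full record of N samples takes T_record = N/fs = N × Ts. So with N = 1024 and fs = 10 MHz, T_record = 1024 × 0.1 μs = 102.4 μs, meaning the simulation would have to run for at least the chosen start time plus 102.4 μs.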
In the attached picture, I set fin = 7/64 × 10 MHz, and for the spectrum analysis I set an arbitrary time range of 0.2 μs to 45 μs (I don't know the logic; I was just experimenting with these numbers). I ran the simulation for 50 μs with a strobe period of 1/fs, and I got the expected result.
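Checking that window after the fact: 45 μs − 0.2 μs = 44.8 μs, which is exactly 448 sampling periods (448 × 0.1 μs), and the input completes fin × 44.8 μs = 1.09375 MHz × 44.8 μs = 49 cycles, an integer. So I suspect my "random" range was accidentally coherent, which would explain the clean result.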
But my question is: if I change the number of sampling points (N) or fin, how do I determine the simulation time, start time, and stop time needed for coherent sampling? Is there a proper way to calculate these by hand, or to work them out from the output signal? I followed the Cadence documentation for finding these numbers, but it didn't work in my case, for example when going from "determine sampling frequency" to "determine stop time" or "determine start time".
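To show what I have tried so far, below is a rough sketch (in Python, just to document the arithmetic) of how I am currently computing the settings. The function name, the settling-time margin t_settle, and the extra strobe period added at the end are my own assumptions, not anything taken from the Cadence documentation:

```python
from math import gcd

def coherent_sampling_times(fs, M, N, t_settle=1e-6):
    """Start/stop times for the spectrum window and the minimum
    simulation time for coherent sampling of N points at fs.
    t_settle is an assumed settling margin before the window opens."""
    assert gcd(M, N) == 1, "M and N must be coprime for coherent sampling"
    Ts = 1.0 / fs                # strobe (sampling) period
    fin = (M / N) * fs           # coherent input frequency
    t_record = N * Ts            # duration of one full N-sample record
    t_start = t_settle           # open the analysis window after settling
    t_stop = t_start + t_record  # window then spans exactly N samples
    t_sim = t_stop + Ts          # run one extra strobe period for margin
    return fin, t_start, t_stop, t_sim

# Example: fs = 10 MHz, M = 79, N = 1024
fin, t_start, t_stop, t_sim = coherent_sampling_times(10e6, 79, 1024)
print(f"fin   = {fin / 1e6:.6f} MHz")      # 0.771484 MHz
print(f"start = {t_start * 1e6:.1f} us")   # 1.0 us
print(f"stop  = {t_stop * 1e6:.1f} us")    # 103.4 us
print(f"sim  >= {t_sim * 1e6:.1f} us")     # 103.5 us
```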
Additionally, if I change the window type (from rectangular to Kaiser), what is the impact on these values: start time, stop time, bins, and sampling points?
 

Attachments: spectrum plot referenced in the post above