Hello,
I'm new to the ADC/DAC world but have been reading up a lot on them lately. One thing is still confusing me: how do you calculate the range of the clock input to an ADC from a given spec such as a sampling rate of 1.25 Mbps? What is the formula for it? It seems to me that another given spec, such as the resolution N bits, is also needed.
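For what it's worth, here is the rough arithmetic I have in mind. This is only a sketch under the assumption of a serial-output (e.g., SPI-style SAR) ADC, where each N-bit sample must be shifted out bit by bit; the bit count and overhead numbers below are illustrative, not from any datasheet:

```python
# Sketch: for a serial-output ADC, each conversion must shift out
# N data bits plus a few overhead clocks for framing/acquisition,
# so the serial clock must run at least (N + overhead) times the
# sampling rate. All numbers below are assumed for illustration.

sample_rate = 1.25e6   # samples per second (assumed reading of "1.25M")
n_bits = 12            # assumed resolution
overhead_clocks = 4    # assumed framing/acquisition clocks per conversion

min_sclk = sample_rate * (n_bits + overhead_clocks)
print(f"minimum serial clock ~ {min_sclk / 1e6:.1f} MHz")
```

Is that the right general idea, or does the clock-to-sample-rate relationship depend on something else in the datasheet?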
Thanks