repah
I am running a behavioural simulation of an ADC sampling at 15 GS/s: a behavioural ADC followed by a behavioural DAC.
I want to see the effect of changing my input frequency on the SNDR/ENOB of my ADC.
So I sweep the input frequency: 100 MHz, then 1.1 GHz, 2.1 GHz, 3.1 GHz, 4.1 GHz, 5.1 GHz, and 6.1 GHz.
In the transient analysis I set the strobe period to 1/(15 GHz) ≈ 66.67 ps, so the waveform at the output of the DAC is sampled at my clock frequency.
To compute the spectrum I use the DFT function in Cadence, do my own coherent-sampling FFT as a sanity check, and use Spectrum Assistant in Cadence Spectre IC6.1.7 to triple-check. In every case I go up to fs/2 for the DFT/FFT, i.e. 7.5 GHz for my 15 GHz sampling clock.
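For reference, here is a minimal sketch of the kind of coherent-sampling SNDR/ENOB check I mean. It is not my actual testbench: it stands in an ideal 8-bit quantizer for the behavioural ADC/DAC chain, and the FFT length and bin count are illustrative assumptions.

```python
import numpy as np

fs = 15e9           # sampling rate (15 GS/s)
N = 4096            # FFT length (assumption: power of two)
bits = 8            # assumed stand-in resolution
# Coherent sampling: pick M coprime to N (any odd M works for N = 2**12)
# so fin = M*fs/N lands exactly on one FFT bin, no window needed.
M = 557
fin = M * fs / N

lsb = 2.0 / (2 ** bits)
n = np.arange(N)
# Amplitude just under full scale to avoid clipping the top code
x = (1 - lsb) * np.sin(2 * np.pi * fin / fs * n)
xq = np.round(x / lsb) * lsb        # ideal mid-tread quantizer

X = np.fft.rfft(xq) / N
p = np.abs(X) ** 2
p[1:-1] *= 2                        # single-sided power spectrum up to fs/2
sig = p[M]                          # signal sits exactly in bin M
noise = p.sum() - p[0] - sig        # everything except DC and the signal bin
sndr = 10 * np.log10(sig / noise)
enob = (sndr - 1.76) / 6.02
print(f"SNDR = {sndr:.2f} dB, ENOB = {enob:.2f} bits")
```

With an ideal quantizer this should land near the textbook 6.02*bits + 1.76 dB; a real behavioural chain would show its own distortion terms in the noise sum.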
What I notice is that SNDR, and thus ENOB, increases as I raise the input frequency from 100 MHz to 6.1 GHz, whereas I would expect the opposite: a slowly decreasing SNDR/ENOB until I reach the effective resolution bandwidth (ERBW) of the ADC.
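To make the expected downward trend concrete: one common mechanism is aperture/clock jitter, which limits SNR to -20*log10(2*pi*fin*tj) and so costs about 6 dB per octave of input frequency. The 100 fs RMS jitter below is purely an illustrative assumption, not a value from my setup.

```python
import numpy as np

tj = 100e-15   # assumed RMS aperture jitter (illustrative only)
fins = [100e6, 1.1e9, 2.1e9, 3.1e9, 4.1e9, 5.1e9, 6.1e9]
# Jitter-limited SNR falls monotonically as fin rises
snrs = [-20 * np.log10(2 * np.pi * f * tj) for f in fins]
for f, s in zip(fins, snrs):
    print(f"fin = {f/1e9:4.1f} GHz -> jitter-limited SNR = {s:5.1f} dB")
```

A behavioural model that omits jitter and front-end bandwidth would not show this roll-off, which may itself be part of the answer.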
I am not sure whether I am doing something wrong in my testbench or simulation setup, in terms of setting up or calculating my DFT/FFT, or whether it is something else.
Can anyone shed some light on what I may be doing wrong?
Thanks.