The Designer's Guide Community
Forum
Weird Results with FFT in Cadence Spectrum Assistant / DFT Function
repah
Community Member
Posts: 68
Apr 30th, 2020, 12:08pm
 
I am doing a behavioral simulation of an ADC sampling at 15 GS/s.

I have a behavioral ADC followed by a behavioral DAC.

I want to see the effect of changing my input frequency on the SNDR/ENOB of my ADC.

So I apply input frequencies of 100 MHz, then 1.1 GHz, 2.1 GHz, 3.1 GHz, 4.1 GHz, 5.1 GHz, and 6.1 GHz.

I use the strobe period setting in the transient analysis to sample the waveform at the output of the DAC every 1/15 GHz (one period of my clock, about 66.7 ps).
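For reference, here is a small sketch of how the strobe period and a coherent input frequency near each test point could be chosen. The FFT length N = 4096 and the `coherent_fin` helper are assumptions for illustration, not values from the original setup:

```python
# Sketch: strobe period and coherent input-frequency selection for an
# fs = 15 GS/s ADC testbench. N = 4096 is an assumed FFT record length.
from math import gcd

fs = 15e9                # sampling clock, 15 GS/s
N = 4096                 # FFT length (assumption; any power of two works)
strobe_period = 1 / fs   # ~66.67 ps, matches the transient strobe setting

def coherent_fin(f_target, fs, N):
    """Snap a target input frequency onto an exact FFT bin, keeping the
    bin count M coprime with N so successive samples walk through all
    phases of the input (coherent sampling)."""
    M = round(f_target * N / fs)
    while gcd(M, N) != 1:    # bump M until it is coprime with N
        M += 1
    return M * fs / N

fin = coherent_fin(1.1e9, fs, N)   # ~1.1 GHz snapped to a coherent bin
```

With a coherent fin like this, no window is needed and the signal energy lands in a single bin, which keeps the SNDR computation unambiguous.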

I use the DFT function in Cadence, do my own coherent-sampling FFT as a sanity check, and use the Spectrum Assistant in Cadence Spectre IC6.1.7 to triple-check. In every case I go up to Fs/2 for the DFT/FFT, i.e. 7.5 GHz for my 15 GHz sampling clock.

What I notice is that SNDR (and thus ENOB) increases as I raise the input frequency from 100 MHz to 6.1 GHz, whereas I would expect the opposite: a slowly decreasing SNDR/ENOB until I reach the effective resolution bandwidth (ERBW) of the ADC.

I am not sure whether I am doing something wrong in my testbench or simulation setup, in setting up or calculating my DFT/FFT, or whether it is something else.

Can anyone shed some light on what I may be doing wrong?

Thanks.