user0088
Hello all,
I am working on a Sigma-Delta DAC whose last stage is a switched-capacitor (SC) filter. Modeling is performed in the Cadence design environment.
Could you please advise me on the correct simulation settings for performing an FFT and measuring SFDR?
Currently I am doing the following:
1. A digitized sine signal at Fin = 125 kHz is applied.
2. A transient analysis is run with Fclk = 32 MHz. The run time is 8 us + 2048 × 31.25 ns = 72 us, where 8 us is the circuit stabilization time and 31.25 ns = 1/(32 MHz) is the sampling period. A strobe period of Tclk/10 = 3.125 ns is used. !!!
3. A DFT is performed (with the Calculator): From = 8 us, To = 72 us, Number of Samples = 2048, Window type = Cosine2.
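As a sanity check on these settings: with Fs = 32 MHz and N = 2048, the bin spacing is 15.625 kHz, so a 125 kHz tone lands exactly on bin 8 and the sampling is coherent. A minimal NumPy sketch of the same measurement (the synthetic tone-plus-spur record below is a stand-in for the waveform exported from the Calculator, and the spur level is an assumed value for illustration):

```python
import numpy as np

Fs  = 32e6    # clock / effective sampling rate from the transient run
Fin = 125e3   # input tone frequency
N   = 2048    # DFT samples over 72 us - 8 us = 64 us

# Coherence check: the tone should land on an integer bin, otherwise
# spectral leakage alone will limit the measured SFDR.
bin_exact = Fin * N / Fs          # = 8.0, an integer, so sampling is coherent

# Synthetic record: tone plus an assumed -80 dB third-harmonic spur.
t = np.arange(N) / Fs
x = np.sin(2*np.pi*Fin*t) + 1e-4*np.sin(2*np.pi*3*Fin*t)

# Cosine^2 (Hann) window, as in the Calculator's "Cosine2" setting.
X = np.abs(np.fft.rfft(x * np.hanning(N)))

k = int(round(bin_exact))
sig = X[k-1:k+2].max()            # signal peak (window spreads it over 3 bins)
mask = np.ones(X.size, bool)
mask[:3] = False                  # exclude DC leakage
mask[k-2:k+3] = False             # exclude signal bin and its window skirts
sfdr = 20*np.log10(sig / X[mask].max())
print(f"SFDR = {sfdr:.1f} dB")    # ~80 dB for this synthetic spur
```

The same bookkeeping (excluding the signal's leakage bins before searching for the largest spur) is what the Calculator's dft/SFDR functions do internally.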
Am I missing something?
Here is the question that arose: the SFDR of the SC filter output shows about 80 dB, whereas when I place an ideal LP filter after the SC filter, the SFDR drops significantly.
I also looked at the spectrum of the SC filter output over the ranges [8u-x...72u-x], where x = (0...9)*strobeperiod. For some values of x the SFDR is "not good". So it seems the LP filter averages these values... is that so?
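That averaging hypothesis can be checked numerically. The sketch below is purely synthetic (the two spur levels are assumed, standing in for a "good" and a "bad" strobe offset x): averaging the magnitude spectra of the offset records before taking SFDR gives a result between the best and worst individual records, pulled toward the worst one, which matches an LP filter smearing the offset-dependent behavior together.

```python
import numpy as np

N = 2048
n = np.arange(N)
tone = np.sin(2*np.pi*8*n/N)          # coherent tone in bin 8

def spectrum(x):
    # Cosine^2 (Hann) windowed magnitude spectrum
    return np.abs(np.fft.rfft(x * np.hanning(N)))

def sfdr_db(mag):
    # SFDR from a magnitude spectrum; signal assumed near bin 8
    sig = mag[6:11].max()
    mask = np.ones(mag.size, bool)
    mask[:3] = False                  # exclude DC leakage
    mask[6:11] = False                # exclude signal + window skirts
    return 20*np.log10(sig / mag[mask].max())

# Hypothetical: the spur level depends on the strobe offset x.
# One record with a -100 dB spur, one with a -60 dB spur.
recs  = [tone + a*np.sin(2*np.pi*200*n/N) for a in (1e-5, 1e-3)]
specs = [spectrum(r) for r in recs]

per_offset = [sfdr_db(s) for s in specs]       # ~100 dB and ~60 dB
averaged   = sfdr_db(np.mean(specs, axis=0))   # between the two, near the worst
```

So if some strobe offsets capture switching transients with large spurs, a filter that averages over the clock period will report an SFDR closer to those bad offsets than to the best-case one.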
Are my simulation settings correct?
Please help. Thanks in advance.