jdac_18
I am currently simulating the effect of clock jitter on a DAC output. I want to see how much noise the jitter contributes and then find the corresponding SNR. I have been reading the paper that gives the jitter-induced noise power as 2*pi^2*fin^2*A^2*sigma^2. When I calculate the noise power from this equation, my results are quite different from my simulated noise power: the theoretical value is about two orders of magnitude smaller.
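For the theory side, the numbers are easy to sanity-check. A quick sketch (assuming sigma is the rms edge error in seconds and the input is a sine of amplitude A; the fin, A, and sigma values below are made-up placeholders, substitute your own):

```python
import math

# Placeholder numbers -- substitute your own fin, A, sigma.
fin = 100e6      # input sine frequency, Hz
A = 1.0          # sine amplitude
sigma = 1e-12    # rms jitter, seconds

# Jitter-induced noise power from the paper's formula:
#   Pn = 2 * pi^2 * fin^2 * A^2 * sigma^2
Pn = 2 * math.pi**2 * fin**2 * A**2 * sigma**2

# Signal power of a sine of amplitude A is A^2/2, so
#   SNR = (A^2/2) / Pn = 1 / (2*pi*fin*sigma)^2
snr_db = -20 * math.log10(2 * math.pi * fin * sigma)

print(Pn, snr_db)   # for these numbers: ~2e-7 and ~64 dB
```

If your simulated SNR is meant to be referenced to full scale, make sure A in the formula matches the amplitude you actually drive, since the noise power scales with A^2.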
My simulation setup is an ideal Verilog DAC clocked by a fixed-frequency oscillator with accumulating jitter. I then take the FFT in MATLAB and compute the noise power by subtracting out the signal, DC, and harmonics.
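For what it's worth, here is roughly how I do that bookkeeping (a sketch in Python/NumPy rather than MATLAB; it assumes coherent sampling so no window is needed, and treats everything outside the DC, fundamental, and harmonic bins as noise — the function name and the choice of 5 harmonics are just my own):

```python
import numpy as np

def snr_from_fft(x, fin, fs, n_harmonics=5):
    """SNR of a coherently sampled sine: power in the fundamental bin
    versus everything except DC and the first few harmonic bins."""
    N = len(x)
    X = np.fft.rfft(x) / N
    p = np.abs(X) ** 2
    p[1:-1] *= 2                       # fold in negative frequencies
    k_sig = int(round(fin / fs * N))   # fundamental bin (coherent sampling)
    p_sig = p[k_sig]
    mask = np.ones(len(p), bool)
    mask[0] = False                    # drop DC
    for h in range(1, n_harmonics + 1):
        k = (h * k_sig) % N
        if k > N // 2:
            k = N - k                  # aliased harmonics fold back
        mask[k] = False                # drop fundamental (h=1) and harmonics
    p_noise = p[mask].sum()
    return 10 * np.log10(p_sig / p_noise)
```

Feeding it a coherent sine plus white noise of known power reproduces the expected SNR to within a fraction of a dB, which is a useful check that the binning is right before trusting it on the jittered data.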
I was wondering whether this is even the right type of jitter, since the paper specifies Gaussian random jitter. Also, I am assuming sigma is the standard deviation of the clock period — is that correct?
The paper title is "Clock Jitter Induced Distortion in High Speed CMOS Switched Current Segmented Digital to Analog Converters."
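One thing I would check first: a formula of the form 2*pi^2*fin^2*A^2*sigma^2 corresponds to each sampling edge having an independent Gaussian error, whereas an oscillator with accumulating jitter random-walks, so the edge error keeps growing over the FFT record. A toy model of the two cases (entirely my own sketch, not the paper's setup — jitter is applied to the sampling instants of an ideal sine, and noise is measured against the jitter-free samples):

```python
import numpy as np

rng = np.random.default_rng(0)

fs, N, A = 1e9, 4096, 1.0
k = 101                       # coherent bin -> fin = k/N * fs
fin = k / N * fs
sigma = 1e-12                 # rms jitter per edge, seconds
t_ideal = np.arange(N) / fs

def jitter_noise_power(t_jittered):
    # Noise = jittered samples minus the ideal (jitter-free) samples.
    x = A * np.sin(2 * np.pi * fin * t_jittered)
    x0 = A * np.sin(2 * np.pi * fin * t_ideal)
    return np.mean((x - x0) ** 2)

# White jitter: each edge gets an independent Gaussian error.
pn_white = jitter_noise_power(t_ideal + sigma * rng.standard_normal(N))

# Accumulating jitter: per-period errors integrate (random walk),
# so the total edge error grows over the record.
pn_acc = jitter_noise_power(t_ideal + np.cumsum(sigma * rng.standard_normal(N)))

pn_theory = 2 * np.pi**2 * fin**2 * A**2 * sigma**2
print(pn_theory, pn_white, pn_acc)
```

In this toy model the white case lands close to the formula, while the accumulating case comes out orders of magnitude higher (the edge variance averages roughly N*sigma^2/2 over an N-point record). If I have the model right, that is the same direction as the discrepancy you are seeing — it may be worth rerunning with independent edge jitter before concluding the formula is off.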