Anish
New Member
Posts: 4
Hello, I am simulating an LC oscillator that drives a CMOS div4 and a div5. I am interested in the power-supply-induced jitter at the output of the LC oscillator. To get this, I do the following:
1) Run PSS in oscillator mode with a beat frequency of LC_freq/20, pointing to the LC oscillator output as the oscillator node.
2) Run Sampled PAC and find the voltage gain from my supply to the output of the LC (vgain).
3) Find the slope of my waveform from pss_td (slope) and use vgain/slope as the jitter induced by a 1 V movement on the supply (a small numeric sketch of this arithmetic is below).
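For reference, here is a minimal numeric sketch of the step-2/step-3 arithmetic, with placeholder values for vgain and slope; the real numbers would come from the Sampled PAC and pss_td results, respectively.

```python
# Minimal sketch of the jitter arithmetic described above.
# vgain and slope are placeholder values, not simulation results.
vgain = 5e-3      # assumed sampled-PAC gain from supply to LC output, V/V
slope = 2e10      # assumed slope of the pss_td output waveform at the
                  # same sampling (threshold-crossing) instant, V/s

# A disturbance of delta_v on the supply shifts the output voltage by
# vgain * delta_v; dividing by the crossing slope converts that voltage
# shift into a timing shift.
delta_v = 1.0                         # 1 V supply movement
jitter = vgain * delta_v / slope      # seconds of induced jitter
print(f"Supply-induced jitter: {jitter:.3e} s per {delta_v:.1f} V on the supply")
```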
I have the following questions:
1) I know that I am supposed to point to the divider output as the oscillating node for PSS and specify its frequency as the beat frequency. But since I have two dividers, which one do I pick, and what should the beat frequency be? With lc_freq/20 as the beat frequency and the LC output as the oscillating node, I see convergence issues because the tstab transient chooses lc_freq/36 (?) as the fundamental. If I pick lc_freq/10 as the beat frequency and the LC output as the oscillating node, some corners converge while others pick lc_freq/4 as the fundamental (see the commensurability check sketched after these questions).
2) Is the method I use for finding power-supply-induced jitter correct?
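For context, a minimal sketch, assuming ideal /4 and /5 dividers, of which of the candidate beat frequencies mentioned above have a period that holds an integer number of both divider output periods:

```python
# Hedged sketch: check which candidate beat periods are commensurate with
# BOTH divider outputs, assuming ideal integer /4 and /5 division of lc_freq.
candidates = {"lc_freq/4": 4, "lc_freq/10": 10, "lc_freq/20": 20}

for name, n_lc_cycles in candidates.items():
    # The beat period spans n_lc_cycles LC periods; the div4 output repeats
    # every 4 LC periods and the div5 output every 5 LC periods.
    fits_div4 = n_lc_cycles % 4 == 0
    fits_div5 = n_lc_cycles % 5 == 0
    print(f"{name}: integer div4 periods={fits_div4}, "
          f"integer div5 periods={fits_div5}, fits both={fits_div4 and fits_div5}")
```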
Thanks, Anish