Andrew:
This is an old post, but I've revisited it recently.
I agree that for a driven circuit we probably need to use jitter mode for phase noise measurement.
However, in oscillator simulation, source mode seems to work pretty well at plotting the correct phase noise. Do you think this is just coincidental, or is there some reason behind it?
Thanks,
Neo
Andrew Beckett wrote on Jan 5th, 2010, 1:36pm: It doesn't. Phase noise is really a misnomer with the result from sources. It's actually plotting the output noise in dBc (i.e. total output noise divided by the magnitude of the carrier). For a switching system, including the amplitude noise in the non-transition regions is of no use to you, so that's why you want to use the jitter mode.
In other cases, seeing the time-averaged output noise, presented in dBc, is what you want, so that's why you have a choice of ways of measuring it.
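To make the point above concrete, here's a minimal sketch of what "output noise in dBc" means numerically. The values below are purely hypothetical illustrations (not from the thread), and the peak-amplitude convention for the carrier power is an assumption for the example:

```python
import math

# Hypothetical example values, chosen only for illustration:
carrier_amplitude = 1.0   # V peak, magnitude of the output carrier
noise_psd = 1e-16         # V^2/Hz, time-averaged total output noise at some offset

# Carrier power, assuming a sinusoidal carrier with the given peak amplitude
carrier_power = carrier_amplitude ** 2 / 2.0

# What the "phase noise" plot from a sources-type pnoise run actually shows,
# per Andrew's explanation: total output noise relative to the carrier, in dBc/Hz
noise_dbc = 10.0 * math.log10(noise_psd / carrier_power)

print(f"Output noise: {noise_dbc:.1f} dBc/Hz")  # prints "Output noise: -157.0 dBc/Hz"
```

Note that this ratio includes amplitude noise as well as phase noise, which is exactly why it can be misleading to label it "phase noise" for a switching circuit.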
Although I do feel that we (Cadence) should stop calling it phase noise when computed from the pnoise sources analysis, because it's potentially misleading if you don't know what you're doing...
Unfortunately it's been this way forever, so we are more likely to upset existing users if we change it!
Regards,
Andrew.