Hello all, I am designing the clock generation circuit for a pipelined ADC, and I have some conceptual questions about how to verify the jitter performance of my circuit.
If I understand correctly, one way to measure jitter in Spectre is to run a PSS analysis, save the operating point of all transistors at the time instant when the final output buffer stage crosses a given threshold, linearize the circuit about that operating point, and compute the noise power spectral density at the output of the linearized circuit. Integrating the PSD over all frequencies then gives the noise variance at the buffer output at the threshold-crossing instant, and the jitter can be estimated as sigma_t = sqrt(noise variance) / (slew rate at the crossing).
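To make that last conversion step concrete, here is a minimal Python sketch of the sigma_v/SR arithmetic; the PSD shape, bandwidth, and slew-rate value are all made-up placeholders, not results from a real simulation:

import numpy as np

# All numbers below are hypothetical placeholders for illustration.
f = np.linspace(1e3, 10e9, 1_000_000)    # frequency grid [Hz]
psd = 1e-17 / (1 + (f / 1e8) ** 2)       # assumed output noise PSD [V^2/Hz]
slew_rate = 5e9                          # assumed output slope at threshold [V/s]

v_var = np.trapz(psd, f)                 # noise variance sigma_v^2 [V^2]
sigma_t = np.sqrt(v_var) / slew_rate     # jitter estimate sigma_t = sigma_v / SR [s]
print(f"sigma_v = {np.sqrt(v_var):.3e} V  ->  jitter = {sigma_t:.3e} s")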
However, doesn't this method assume that, apart from the instant when the output crosses the threshold, the buffer output is completely free of noise? In reality, from the onset of a clock transition, noise continuously perturbs the output voltage trajectory away from its noise-free trajectory, and the longer the transition takes, the further the trajectory can drift off course. So it seems the noise behavior over the entire transition, from its start up to the threshold crossing, should somehow be accounted for in the noise variance, rather than only the variance at the crossing instant (the toy model below tries to illustrate this). And if that is true, it seems the PSS-based approach would underestimate the jitter.
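As a sanity check on this intuition, here is a toy Monte Carlo in Python (not a model of what Spectre does internally): a clock edge is modeled as a clean linear ramp plus noise that accumulates along the trajectory, and the spread of the first threshold-crossing time is compared against the sigma_v/SR estimate taken at the nominal crossing instant. Every parameter value here is invented for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Toy edge model, all values hypothetical: a linear ramp plus noise that
# accumulates (integrates) along the transition.
dt = 1e-12         # time step [s]
n = 500            # samples per edge (0.5 ns transition)
slew = 2e9         # ramp slope [V/s]
vth = 0.5          # threshold [V]
sigma_step = 3e-3  # noise injected per step [V]
trials = 20_000

t = np.arange(n) * dt
ramp = slew * t
# Cumulative sum: noise injected early in the edge keeps perturbing the
# trajectory later on, so the variance grows over the transition.
noise = np.cumsum(rng.normal(0.0, sigma_step, (trials, n)), axis=1)
v = ramp + noise

# Spread of the first threshold crossing across all noisy edges.
t_cross = np.argmax(v >= vth, axis=1) * dt
print("sigma_t, Monte Carlo first crossing: %.3e s" % t_cross.std())

# Linearized estimate: noise variance sampled only AT the nominal
# crossing instant, divided by the slew rate.
k = int(vth / slew / dt)
print("sigma_t, sigma_v(t_cross)/SR:        %.3e s" % (noise[:, k].std() / slew))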
Thanks in advance!