subtr
Community Member
Offline
Analog Enthusiast
Posts: 52
USA

I don't think I've heard it called "phase jitter". Jitter represents the inconsistency in the timing of the noisy output clock's edges with respect to an ideal clock, whose period is dictated by the input or, for an autonomous system, by its own free-running frequency.
Now, this randomness can be represented as a sequence of samples, where the magnitude at each timing edge is the difference between the noisy edge and the ideal edge. Being a waveform in time, it also has a frequency-domain representation. Because the waveform is random, we can only describe it in power spectral density (PSD) form. This frequency-domain representation is what we call phase noise.
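A minimal sketch of that idea in NumPy (the clock frequency, jitter level, and segment sizes are all illustrative, not from the post): take the per-edge timing errors, convert them to phase error, and estimate the phase-noise PSD with an averaged periodogram.

```python
import numpy as np

f0 = 10e6                                   # assumed ideal clock frequency (Hz)
fs = f0                                     # one jitter sample per clock edge
rng = np.random.default_rng(0)
tj = 1e-12 * rng.standard_normal(1 << 16)   # edge time minus ideal edge time (s)
phi = 2 * np.pi * f0 * tj                   # same error expressed as phase (rad)

# Averaged periodogram: split into segments, FFT, average |X|^2.
nseg, nfft = 16, 4096
segs = phi[: nseg * nfft].reshape(nseg, nfft)
X = np.fft.rfft(segs, axis=1)
S = (np.abs(X) ** 2).mean(axis=0) / (fs * nfft)  # phase-noise PSD estimate (rad^2/Hz)
S[1:-1] *= 2                                     # fold to one-sided
fbins = np.fft.rfftfreq(nfft, d=1 / fs)          # offset-frequency axis (Hz)
```

Here white jitter gives a flat phase-noise PSD; a real clock would of course show flicker and spur content as well.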
Now, every random noise is defined by two representations taken together. The first is the shape of its PSD, and the second is the shape of its magnitude density function (MDF). The PSD of white noise is flat, while flicker noise falls at 10 dB/decade; the PSD of ADC quantization noise is assumed to be white. As for the MDF: uncalibrated mismatch, phase noise, voltage noise, etc. are Gaussian, while quantization noise is assumed to be uniform.
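A quick numeric sketch of the two MDF shapes mentioned above (the LSB size, noise level, and sample count are illustrative): uniform quantization error has the textbook variance of Δ²/12, while the Gaussian case is fully described by its standard deviation.

```python
import numpy as np

rng = np.random.default_rng(1)
delta = 1e-3                                     # assumed LSB size (V)

# Quantization error: uniform MDF over [-delta/2, delta/2], variance delta^2/12.
q = rng.uniform(-delta / 2, delta / 2, 100_000)

# Thermal/phase/voltage noise: gaussian MDF, here with a 100 uV sigma.
v = rng.normal(0.0, 1e-4, 100_000)
```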
You can find the variance either by integrating the PSD or directly from the MDF using probability theory. In fact, the PSD only tells you how the same noise power is distributed across frequency. That noise power can be seen as the variance of the voltage/phase/etc., which is why it is called noise power.
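That equivalence is just Parseval's theorem; a minimal sketch (the noise record, its length, and fs = 1 are illustrative) showing that the integrated PSD equals the sample variance:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(1 << 14)    # zero-mean noise record
x -= x.mean()                       # remove residual DC so power = variance

N = x.size
X = np.fft.rfft(x)
Sx = np.abs(X) ** 2 / N**2          # power per bin (fs = 1, so bin width folds in)
Sx[1:-1] *= 2                       # fold negative frequencies to one side

# Sx.sum() is the integrated PSD, and it matches x.var() exactly (Parseval)
```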
Now, depending on whether you care about short-term or long-term jitter, you can apply a sinc-shaped filter represented by (1 − z^(−t/T)). For t = T this reduces to (1 − z^(−1)), a simple first-difference high-pass filter. This means you are looking at one-period jitter when you integrate the phase-noise plot through this filter. Another way to see it: the finite duration over which you observe limits how much the lower-frequency noise can contribute, effectively acting as a high-pass filter. As you increase the time duration over which you look at the accumulated noise, the HPF cutoff moves toward DC.
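A minimal sketch of that calculation (the carrier frequency and the Lorentzian phase-noise PSD are made up for illustration): weight the phase-noise PSD by the one-period difference filter, whose magnitude squared is |1 − e^(−j2πfτ)|² = 4 sin²(πfτ), then integrate to get the period-jitter variance.

```python
import numpy as np

f0 = 100e6                                 # assumed carrier frequency (Hz)
tau = 1 / f0                               # observation interval t = T (one period)
f = np.logspace(3, 7.5, 2000)              # offset frequencies, 1 kHz .. ~32 MHz
Sphi = 1e-9 / (1 + (f / 1e5) ** 2)         # made-up Lorentzian phase PSD (rad^2/Hz)

H2 = 4 * np.sin(np.pi * f * tau) ** 2      # difference filter: ~0 at DC, so slow wander drops out

y = Sphi * H2
var_phi = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(f))  # trapezoidal integral (rad^2)
jitter_rms = np.sqrt(var_phi) / (2 * np.pi * f0)       # rms one-period jitter (s)
```

Repeating this with τ = kT in place of T gives the k-period accumulated jitter; as k grows, the filter's notch at DC narrows and more of the low-frequency noise is included, which is exactly the HPF-cutoff-toward-DC picture above.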
