Visjnoe
Dear all,
Recently, somebody specified a CDR (based on a PLL) with the following specification: the incoming data has a peak-to-peak jitter of 2 ns, and the output clock should have a peak-to-peak jitter of 100 ps.
I think - but maybe I'm wrong - that this specification makes no sense: normally, for a CDR, one specifies jitter transfer, jitter generation, and jitter tolerance. The jitter tolerance mask gives the designer a specification of jitter amplitude versus frequency, so the CDR PLL design can be started from it.
However, given just the 'time-domain' (peak-to-peak) jitter of the incoming data, I think the PLL design cannot be started, since the frequency distribution of the jitter is not known.
That being said, I also think it is impossible to measure the jitter versus frequency for a given data/bit stream (no experience here).
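To illustrate my point about the frequency distribution: a quick sketch (all numerical values are my own assumptions, not from the spec) of a classic second-order PLL jitter transfer function shows that the same input jitter amplitude gives wildly different output jitter depending on whether the jitter energy sits below or above the loop bandwidth.

```python
import numpy as np

def jitter_transfer_mag(f, fn=1e6, zeta=0.707):
    """|H(j*2*pi*f)| of a textbook second-order PLL jitter transfer,
    H(s) = (2*zeta*wn*s + wn^2) / (s^2 + 2*zeta*wn*s + wn^2),
    with natural frequency fn [Hz] and damping zeta (values assumed)."""
    wn = 2 * np.pi * fn
    s = 1j * 2 * np.pi * np.asarray(f, dtype=float)
    H = (2 * zeta * wn * s + wn**2) / (s**2 + 2 * zeta * wn * s + wn**2)
    return np.abs(H)

# Rough rms estimate for 2 ns pk-pk input jitter (crest factor of 8 assumed):
rms_in = 2e-9 / 8

# Same input jitter amplitude, concentrated at two different frequencies:
low_f_out = rms_in * jitter_transfer_mag([1e4])[0]    # well inside loop BW
high_f_out = rms_in * jitter_transfer_mag([1e8])[0]   # far above loop BW

print(f"output rms jitter, 10 kHz jitter tone : {low_f_out*1e12:.1f} ps")
print(f"output rms jitter, 100 MHz jitter tone: {high_f_out*1e12:.1f} ps")
# Low-frequency jitter passes through the PLL almost unattenuated, while
# high-frequency jitter is strongly filtered -- so the pk-pk number alone
# cannot tell you whether the 100 ps output target is achievable.
```

So without a jitter-versus-frequency mask, the 2 ns in / 100 ps out spec seems underdetermined to me.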
Can somebody provide me with a second opinion on this?
Kind regards,
Peter