I just took a look at the Telcordia spec and found the following:
Quote:For a system with a linear jitter transfer function, jitter transfer measurements can
be made (and identical results can be obtained) using sinusoidal jitter applied to the
input signal at any level up to the jitter tolerance level for that interface and that
specific jitter frequency. However, SONET systems typically do not have linear
jitter transfer functions (both by design and due to inherent factors such as the
limited number of stuff opportunity bits available in the asynchronous DSn to VT or
STS SPE mappings), and therefore the results obtained in any jitter transfer tests
are likely to depend on the particular input amplitudes used. In general, the primary
purpose of the jitter transfer requirements is to prevent performance degradations
by limiting the accumulation of jitter through a series of systems such that it does
not exceed the network interface jitter requirements (or the jitter tolerance of any
of the NEs involved). Thus, it is more important that a system meet the jitter
transfer criteria for relatively high input jitter amplitudes (e.g., amplitudes close to
the network interface jitter or jitter tolerance limits) than for very low input
amplitudes. Therefore, for testing the conformance of a system to the jitter transfer
requirements in this document (e.g., to R5-236 [338] or R5-237 [339]), the input
jitter amplitude range is limited to 0.1 to 1.0 times the amplitude given by the
appropriate jitter tolerance mask. (That is, the jitter transferred through the system
must be under the jitter transfer mask for any input jitter amplitude within this
range, but is not required to be under the jitter transfer mask for input amplitudes
outside of the range.)
This suggests that jitter transfer is not a simple linear transfer function as I had assumed. Predicting it by simulation would then involve running a series of transient simulations in which sinusoidal jitter is applied to the input of the CDR, sweeping the jitter frequency and using input amplitudes between 0.1 and 1.0 times the jitter tolerance mask, to determine the DJ; following those with corresponding phase-domain model simulations to determine the RJ; and finally combining the two to find the overall jitter transfer. A sketch of how such a sweep might be organized is given below.
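Just to make the procedure concrete, here is a minimal sketch of how the amplitude/frequency sweep for the DJ portion could be organized. Everything in it is an assumption for illustration: jitter_tolerance_mask is a made-up placeholder (the real mask comes from the spec for the interface being tested), run_transient_jitter_sim is a stub that would have to be hooked up to whatever simulator and testbench you are actually using, and the five amplitude steps are arbitrary. It is not meant to represent any particular tool's interface.

Code:
import numpy as np

def jitter_tolerance_mask(f_hz):
    """Placeholder jitter tolerance mask (UI pk-pk) vs. jitter frequency.
    Substitute the mask from the governing spec for the interface."""
    # Arbitrary shape: large low-frequency tolerance rolling off to a floor.
    return np.maximum(0.15, 1.5 * (1e4 / f_hz))

def run_transient_jitter_sim(freq_hz, amp_ui):
    """Stub for a transient simulation of the CDR driven by sinusoidal
    input jitter of the given frequency and amplitude (UI pk-pk).
    Should return the measured output DJ amplitude (UI pk-pk)."""
    raise NotImplementedError("hook this up to your simulator/testbench")

def jitter_transfer_sweep(freqs_hz, amp_fractions=np.linspace(0.1, 1.0, 5)):
    """Worst-case jitter transfer (dB) at each jitter frequency, taken over
    input amplitudes from 0.1 to 1.0 times the tolerance mask."""
    transfer_db = []
    for f in freqs_hz:
        mask_amp = jitter_tolerance_mask(f)
        worst = -np.inf
        for frac in amp_fractions:
            a_in = frac * mask_amp
            a_out = run_transient_jitter_sim(f, a_in)
            # Gain of the deterministic jitter through the CDR at this point.
            worst = max(worst, 20.0 * np.log10(a_out / a_in))
        transfer_db.append(worst)
    return np.array(transfer_db)

# Example use: log-spaced jitter frequencies from 10 Hz to 10 MHz.
# freqs = np.logspace(1, 7, 25)
# jt_db = jitter_transfer_sweep(freqs)

Since the spec only requires the transfer mask to be met within the 0.1 to 1.0 amplitude range, taking the worst case over that range at each frequency (as the sketch does) seems like the natural way to compare against the mask; the RJ contribution from the phase-domain simulations would then be folded in separately.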
I have never done these measurements, so I have no idea how much the DJ changes as you vary the amplitude of the input jitter. Do you have any experience with this? Is this a significant effect?
-Ken