borisk
New Member
Offline
Posts: 6
All,
When simulating the output phase noise added by a digital gate, I see a strong dependence on the clock duty cycle. Is this expected behavior?
For instance, if the input to the gate has a 50%-50% duty cycle, the output phase noise is low, whereas it goes up by 20 dB or more for a 1/60-59/60 duty cycle (i.e., the pulse is high for only 1/60 of the period).
I can explain this by the fact that the input spectrum changes with the duty cycle: as the pulse width decreases, the amplitude of the fundamental decreases, so in terms of SNR the signal appears worse --> higher phase noise.
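As a quick sanity check on this SNR picture (plain Fourier math, nothing simulator-specific; the amplitude and duty-cycle values are just the two cases above), the fundamental of an ideal rectangular pulse train scales as sin(pi*D), which already accounts for a penalty in the observed ballpark:

import numpy as np

def fundamental_amplitude(duty, a=1.0):
    """Single-sided Fourier amplitude of the fundamental of an ideal
    rectangular pulse train: (2a/pi) * sin(pi * duty)."""
    return (2 * a / np.pi) * np.sin(np.pi * duty)

c1_square = fundamental_amplitude(0.5)     # 50%-50% clock
c1_narrow = fundamental_amplitude(1 / 60)  # 1/60-59/60 clock

# If the additive noise floor stays put, phase noise degrades by the
# drop in carrier amplitude at the fundamental:
print(f"{20 * np.log10(c1_square / c1_narrow):.1f} dB")  # ~25.6 dB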
This, however, doesn't make intuitive sense when one is only concerned with edges (the frequency/jitter between successive edges). Also, if viewed as noise sampling, convolving baseband noise with a lower-amplitude signal should yield the same phase noise, since the noise gets scaled in amplitude along with the carrier.
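The edge-based intuition can be made concrete: for a linear edge, the crossing-time error is roughly sigma_v / slew_rate, which contains no duty-cycle term at all. Below is a minimal Monte-Carlo sketch of that (the rise time, swing, and noise numbers are made-up assumptions, not from my simulation):

import numpy as np

rng = np.random.default_rng(1)

def edge_jitter(tr=50e-12, a=1.0, sigma_v=5e-3, fs=1e12, n_edges=5000):
    """RMS timing jitter of a linear rising edge (rise time tr, swing a)
    with additive voltage noise sigma_v, measured by linearly
    interpolating the a/2 threshold crossing."""
    n = int(tr * fs)                       # samples across the edge
    ramp = np.linspace(0.0, a, n)
    cross = np.empty(n_edges)
    for k in range(n_edges):
        v = ramp + rng.normal(0.0, sigma_v, n)
        i = np.argmax(v >= a / 2)          # first sample at/above threshold
        frac = (a / 2 - v[i - 1]) / (v[i] - v[i - 1])
        cross[k] = (i - 1 + frac) / fs
    return np.std(cross)

# Nothing here depends on duty cycle -- only on slew rate and noise --
# so two clocks with identical edges should show identical jitter:
print(f"sigma_t ~ {edge_jitter() * 1e15:.0f} fs "
      f"(analytic sigma_v*tr/a = {5e-3 * 50e-12 / 1.0 * 1e15:.0f} fs)")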
Any ideas?