aaron_do
|
Hi all,
I want to clarify something about the jitter spec for ADCs used in data communications. First, suppose I only need about 20 dB SNR for good BER for my signal. Now assume the maximum signal frequency is about 100 MHz. So according to a simple hand calculation, the jitter requirement for the PLL used in my ADC is 160ps.
However, suppose that at the same time, my signal is accompanied by a blocker. For simplicity, assume my SQNR is 6·b dB and that I have a 14-bit ADC, so my maximum SQNR is 84 dB. In the worst-case scenario, the blocker is full scale and my signal is only 20 dB above the quantization noise level. If I use this number for the SQNR requirement, then I need the jitter to be less than 0.1 ps.
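Just to make the two hand calculations above explicit, here is a quick sketch using the standard SNR-vs-jitter relation (the function name and the print formatting are mine, not part of any tool):

```python
import math

def jitter_for_snr(snr_db, f_hz):
    """Max RMS aperture jitter for a target SNR at signal frequency f_hz,
    from the standard relation SNR = -20*log10(2*pi*f*t_j)."""
    return 10 ** (-snr_db / 20) / (2 * math.pi * f_hz)

# Case 1: 20 dB SNR needed, 100 MHz max signal frequency
t1 = jitter_for_snr(20, 100e6)
# Case 2: full-scale blocker; signal sits 20 dB above the 14-bit
# quantization floor (SQNR ~ 84 dB), so jitter noise must be ~84 dB down
t2 = jitter_for_snr(84, 100e6)
print(f"{t1 * 1e12:.0f} ps")   # ~159 ps, i.e. the ~160 ps number
print(f"{t2 * 1e12:.2f} ps")   # ~0.10 ps
```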
So which is the correct requirement? I'm thinking the second answer (0.1 ps) is correct, so the RF designer needs to make sure the blocker is properly filtered before the ADC.
Also, assuming the first calculation is right, if my PLL really has a jitter spec of 100 ps (for example), are there going to be any other consequences?
thanks, Aaron
EDIT: I guess I can simply treat the jitter as smearing the signal; therefore, if the blocker is far enough removed (in frequency) from the desired signal, then it won't affect the desired signal.
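One way to sanity-check the smearing picture is a small Monte Carlo: sample a single tone with white Gaussian timing jitter and compare the simulated SNR against -20·log10(2π·f·t_j). The sample rate, record length, and seed below are arbitrary choices of mine, and white jitter is an assumption (a real PLL's jitter spectrum is shaped, which affects where the smeared energy lands):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1e9        # sample rate (arbitrary for this experiment)
f_sig = 100e6   # tone at the 100 MHz max signal frequency
t_j = 160e-12   # RMS jitter from the first hand calculation
N = 1 << 16

n = np.arange(N)
t_ideal = n / fs
t_jit = t_ideal + rng.normal(0.0, t_j, N)   # white Gaussian timing jitter

ideal = np.sin(2 * np.pi * f_sig * t_ideal)
sampled = np.sin(2 * np.pi * f_sig * t_jit)

err = sampled - ideal
snr_sim = 10 * np.log10(np.mean(ideal**2) / np.mean(err**2))
snr_theory = -20 * np.log10(2 * np.pi * f_sig * t_j)
print(f"simulated SNR = {snr_sim:.1f} dB, theory = {snr_theory:.1f} dB")
```

Both numbers come out near 20 dB, matching the hand calculation. Note that with white jitter the error energy is spread across the whole Nyquist band, so some of a jittered blocker's noise can land on top of the desired signal even when the blocker itself is far away in frequency.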
|