Hello,
I have come across a paper that makes a point that confuses me.
In the paper a digital-to-frequency modulator is described. The input to the VCO is the output of a sigma-delta modulator (a stream of +/-1's), so the output switches between only two frequencies. Naturally, the output of the SDM carries quantization error. The paper states (top-right column, 2nd page) that since the VCO attenuates signals outside its bandwidth, the quantization error will appear attenuated if the clock rate of the SDM is high. In essence, the VCO acts like a band-pass filter.
Here is my confusion: if the SDM clock rate is high, then the pulse-repetition frequency of the +/-1 stream is high, and BOTH the signal and the quantization error should get slightly attenuated by the tuning-port transfer function of the VCO, which is Kvco/(1+s/w). So how can only the error get attenuated? On the other hand, I can sort of see the authors' point when considering Leeson's equation, which represents a band-pass transfer function: the noise outside the band would be attenuated. So which should I consider correct here: Kvco/(1+s/w), or Leeson's equation (I mean noise shaped proportional to 1/w^2)?
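To make what I mean concrete, here is a quick numerical sketch (Python; the first-order SDM, the oversampling ratios, and the cutoff are arbitrary choices of mine, not from the paper). A first-order sigma-delta produces the +/-1 stream, and a one-pole low-pass stands in for the Kvco/(1+s/w) tuning response; I then look at how much quantization error survives the filter at two different SDM clock rates relative to the VCO bandwidth:

```python
import numpy as np

def first_order_sdm(x):
    """First-order sigma-delta: integrate the error, quantize to +/-1."""
    acc, y = 0.0, 0.0
    out = np.empty_like(x)
    for i in range(len(x)):
        acc += x[i] - y              # accumulate input minus fed-back output
        y = 1.0 if acc >= 0 else -1.0
        out[i] = y
    return out

def one_pole_lowpass(x, fc):
    """Discrete one-pole low-pass, unity DC gain, cutoff fc (sample rate = 1).
    Stands in for the Kvco/(1 + s/w) tuning-port response."""
    a = np.exp(-2 * np.pi * fc)
    y, acc = np.empty_like(x), 0.0
    for i in range(len(x)):
        acc = a * acc + (1 - a) * x[i]
        y[i] = acc
    return y

def filtered_error_rms(osr, n=1 << 15):
    """RMS difference between the filtered SDM stream and the filtered ideal
    signal, i.e. the quantization error that survives the VCO bandwidth,
    for a given oversampling ratio (SDM clock / VCO cutoff)."""
    fc = 1.0 / osr                   # VCO bandwidth relative to the SDM clock
    fsig = fc / 8                    # signal well inside the bandwidth
    t = np.arange(n)
    x = 0.5 * np.sin(2 * np.pi * fsig * t)
    err = one_pole_lowpass(first_order_sdm(x), fc) - one_pole_lowpass(x, fc)
    return float(np.sqrt(np.mean(err[n // 4:] ** 2)))   # skip the transient

e_low, e_high = filtered_error_rms(32), filtered_error_rms(256)
print(f"residual error RMS: OSR=32 -> {e_low:.4f}, OSR=256 -> {e_high:.4f}")
```

In this toy setup, raising the SDM clock relative to the VCO bandwidth shrinks the residual quantization error while the in-band signal passes essentially untouched, which seems to be what the paper claims; what I don't follow is how to square that with both components going through the same Kvco/(1+s/w) response.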
Cheers,
Sven
The URL for the paper is:
http://www.mit.edu/~ddaly/research/files/iscas_paper.pdf