aaron_do
Hi all,
In a down-conversion mixer, we normally say that the effect of oscillator phase noise is to result in unwanted down-conversion of interfering signals into the desired IF. For example in the direct-conversion receiver (DCR), if the phase noise at 1 MHz offset mixes with an interferer at 1 MHz offset from the desired tone, it will cause in-band distortion at the IF.
Normally, in order to calculate the required phase-noise performance of the oscillator, we assume a linear relationship between the conversion gain and the phase-noise power. For example, if the integrated phase noise in a 1 kHz BW (at 1 MHz offset) is -50 dBc, and the conversion gain due to the desired LO carrier is 0 dB, then the undesired gain at 1 MHz offset is -50 dB over that same 1 kHz BW.
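To make the linear model concrete, here is a minimal sketch of the budget calculation it implies. The function name and the numbers (interferer power, phase-noise density) are my own assumptions for illustration; only the -50 dBc in 1 kHz at 1 MHz offset comes from the example above.

```python
import math

def reciprocal_mixing_product_dbm(interferer_dbm, pn_dbc_hz, bw_hz):
    """Power (dBm) of an interferer folded onto the IF, under the
    linear (small-angle) model: the unwanted conversion gain equals
    the phase noise integrated over the measurement bandwidth (dBc).
    All names and the -40 dBm interferer below are hypothetical."""
    integrated_pn_dbc = pn_dbc_hz + 10 * math.log10(bw_hz)  # dBc/Hz -> dBc
    return interferer_dbm + integrated_pn_dbc

# -80 dBc/Hz at 1 MHz offset over a 1 kHz BW integrates to -50 dBc,
# matching the example; a -40 dBm interferer then lands at -90 dBm.
p = reciprocal_mixing_product_dbm(interferer_dbm=-40,
                                  pn_dbc_hz=-80, bw_hz=1e3)
print(p)  # -90.0
```

The point of the sketch is just that, in this model, the folded interference scales dB-for-dB with both the interferer power and the integrated phase noise, which is the linearity assumption being questioned below.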
So my question is: how accurate is this model? I expect it differs between active and passive mixers, since both show saturation in conversion gain at high LO power. Any thoughts are helpful and welcome.
thanks, Aaron