The Designer's Guide Community Forum
https://designers-guide.org/forum/YaBB.pl?num=1405987131

Message started by rf_noobie on Jul 21st, 2014, 4:58pm

Title: Degradation of SNR due to LO Phase Noise
Post by rf_noobie on Jul 21st, 2014, 4:58pm

Hello,

I was wondering if someone could direct me to an analysis of the impact of LO phase noise on the degradation of mixer NF. This is for a BFSK receiver, so naturally phase noise will directly degrade the BER just because of smearing of the two tones. Assuming no interferers, I think it should also degrade the input sensitivity, but I'm not sure how the phase noise "turns into" signal-path amplitude noise. It would be nice if there were an expression like:

NF_mixer = NF_mixer_perfect_LO + f(rms_jitter)

Preliminary simulations suggest that it has an enormous impact on the NF - nearly a 25 dB difference between a perfect LO and an 'actual' LO - but I cannot think of a good way to quantify it analytically to determine how good the jitter needs to be to meet the receiver specs.

The closest things that I have found so far are reciprocal mixing (which seems kind of obvious - I'm trying to find an expression without an interferer) and SNR degradation in a sample-and-hold circuit. The second case would be perfect, but because the LO and RF are at similar frequencies there will be aliasing, so saying that S/N = (1/(2π·f·σ_jitter))^2 is not correct... I think.
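
For what it's worth, here is how that sample-and-hold formula works out numerically. This is just a sanity-check sketch in Python; the carrier frequency and rms jitter below are made-up placeholders rather than numbers from my design, and as noted it ignores the aliasing issue:

# Textbook aperture-jitter SNR limit (the sample-and-hold formula above).
# The carrier frequency and rms jitter are made-up example values.
import math

f_rf = 2.4e9       # assumed RF carrier frequency, Hz
sigma_j = 1e-12    # assumed rms jitter of the LO/clock, s

snr = 1.0 / (2.0 * math.pi * f_rf * sigma_j) ** 2
print("jitter-limited SNR = %.1f dB" % (10.0 * math.log10(snr)))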

The answer is probably right in front of me but I don't seem to be looking for the right thing. Any help is appreciated, thanks!

Title: Re: Degradation of SNR due to LO Phase Noise
Post by RFICDUDE on Jul 21st, 2014, 6:21pm

Reciprocal mixing is the way I think about it. Any noise on the LO signal prior to the mixer will be multiplied with the mixer input signal.

Sensitivity degradation: the noise level would have to be very large, because you are multiplying a small signal (the input) with another small signal (the LO noise). The resulting noise, from a dBc point of view, will be below the small desired signal by the same amount it is below the LO carrier (dBc). So, unless your integrated LO noise is within about 30 dB of the carrier, it probably isn't a sensitivity (NF) problem.
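
To put rough numbers on that argument (the thermal-limited SNR and the integrated LO noise below are placeholder assumptions, not values from your design), here is a quick Python sketch:

# Rough check of the "is LO noise a sensitivity problem?" argument above.
# The thermal SNR at sensitivity and the integrated LO noise are assumed values.
import math

snr_thermal_db = 10.0   # assumed SNR at sensitivity with a clean LO, dB
lo_noise_dbc = -30.0    # assumed LO noise integrated over the channel, dBc

snr_thermal = 10.0 ** (snr_thermal_db / 10.0)
# Noise multiplied onto the desired signal sits lo_noise_dbc below the signal,
# so the signal-to-(LO-induced-noise) ratio is -lo_noise_dbc in dB.
snr_lo = 10.0 ** (-lo_noise_dbc / 10.0)

snr_total = 1.0 / (1.0 / snr_thermal + 1.0 / snr_lo)
degradation_db = snr_thermal_db - 10.0 * math.log10(snr_total)
print("SNR degradation = %.2f dB" % degradation_db)

With those example numbers the degradation is only a few hundredths of a dB, which is the point: the LO noise has to get within a few tens of dB of the carrier before it shows up in the NF.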

Blocking SNR degradation is a real issue, because a large blocker will be multiplied by the LO and the noise will have the same dBc relationship to the big blocker. You just need to know the blocker level and the LO noise (in dBc/Hz) at the offset of the desired channel from the blocker, then integrate that noise over the desired channel and add it to the thermal noise (in power).
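
A quick sketch of that bookkeeping in Python; the blocker level, phase noise, bandwidth and noise figure below are made-up example numbers, and the phase noise is assumed flat across the channel:

# Reciprocal mixing with a blocker: blocker level plus LO phase noise at the
# offset, integrated over the channel, then added to thermal noise in power.
# All input numbers are made-up examples.
import math

blocker_dbm = -30.0    # assumed blocker level at the mixer input, dBm
lo_pn_dbchz = -130.0   # assumed LO phase noise at the blocker offset, dBc/Hz
chan_bw_hz = 1e6       # assumed desired channel bandwidth, Hz
nf_db = 5.0            # assumed receiver noise figure, dB

# Reciprocal-mixing noise falling in the desired channel.
recip_dbm = blocker_dbm + lo_pn_dbchz + 10.0 * math.log10(chan_bw_hz)

# Thermal noise in the same bandwidth, referred to the input.
thermal_dbm = -174.0 + nf_db + 10.0 * math.log10(chan_bw_hz)

# Add the two contributions in power to get the effective noise floor.
total_dbm = 10.0 * math.log10(10.0 ** (recip_dbm / 10.0) +
                              10.0 ** (thermal_dbm / 10.0))
print("reciprocal-mixing noise: %.1f dBm" % recip_dbm)
print("thermal noise:           %.1f dBm" % thermal_dbm)
print("total noise floor:       %.1f dBm" % total_dbm)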


Title: Re: Degradation of SNR due to LO Phase Noise
Post by aaron_do on Jul 21st, 2014, 6:54pm

Hi,


These are just my own educated guesses, so read at your own risk :D.

I think the LO phase noise will directly add to the phase noise of the output signal. But for the SNR, it would depend on the signal itself. In your case, since you are using frequency modulation, you need to see the effect of the phase noise on the output frequency. Following this,

http://en.wikipedia.org/wiki/Frequency_modulation

I guess you can say the signal phase without the carrier is equal to 2πΔf·∫x(τ)dτ + φn(t). So that would make the signal (angular) frequency equal to 2πΔf·x(t) + dφn(t)/dt. So if I were doing the calculation without having read a book on communications theory, I would take my frequency-domain phase noise plot, multiply it by (2πf)^2 (since differentiating the phase scales its spectrum by 2πf, and we want power), then integrate over frequency to get the noise power. I guess the signal power would simply be (2πΔf)^2.
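
If I had to put actual numbers on that guess, I would do something like the Python below. The phase-noise profile and frequency deviation are made-up examples and the integration is a crude trapezoid, so again read at your own risk:

# Convert a phase-noise profile to frequency noise, integrate it, and compare
# against the FM signal power. All numbers are made-up examples.
import math

# Assumed one-sided phase noise profile as (offset in Hz, dBc/Hz) pairs.
pn_profile = [(1e3, -80.0), (1e4, -90.0), (1e5, -110.0), (1e6, -130.0)]
delta_f = 50e3    # assumed peak frequency deviation of the BFSK, Hz

# Integrate (2*pi*f)^2 * S_phi(f) over the profile (crude trapezoid in f).
noise_pwr = 0.0
for (f1, l1), (f2, l2) in zip(pn_profile[:-1], pn_profile[1:]):
    s1 = (2.0 * math.pi * f1) ** 2 * 10.0 ** (l1 / 10.0)
    s2 = (2.0 * math.pi * f2) ** 2 * 10.0 ** (l2 / 10.0)
    noise_pwr += 0.5 * (s1 + s2) * (f2 - f1)

signal_pwr = (2.0 * math.pi * delta_f) ** 2   # power of the 2πΔf·x(t) term
print("FM SNR ~ %.1f dB" % (10.0 * math.log10(signal_pwr / noise_pwr)))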

Probably a book on communications theory would give you a better answer.


Aaron
