The Designer's Guide Community Forum
Simulators >> Circuit Simulators >> Simulation of output noise of a differential VCO

Message started by UCBEL on Sep 24th, 2019, 6:04pm


Hi all,

I am working on a VCO-based ADC and I am having some trouble figuring out how to simulate the noise at the output of my input cell.
As shown in the attached file, the cell is composed of a differential pair that controls the current in two ring oscillators (ROs).
I am interested in the phase difference at the outputs of the two ROs. To simulate the effect that noise has on the accuracy of this phase difference, I am using PSS + PNOISE to extract the "differential" phase noise between OUT_p and OUT_n.

Since the two oscillators are supposed to be in phase and at the same frequency, I run the PSS on only one branch, setting the oscillator node + to OUT_p and node - to gnd. This converges quickly and gives me the correct result (it matches the transient simulation).

To simulate the differential phase noise, I set the PNOISE analysis to time-averaged (PM) between OUT_p and OUT_n and plot the output noise. I don't think this is the right way to do it, as it gives me strange and inconsistent results.
First, I get wild variations (>50 dB) in phase noise if I vary simulation parameters such as the input common-mode voltage. This is totally unexpected and doesn't match transient-noise simulation results.
Second, when I look at the noise summary, I would expect the noise to be dominated by the input transistors, with the noise of the bias cancelled by the differential measurement. This is not always the case, and the summary again varies wildly with the smallest change in simulation parameters.
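For what it's worth, one way I can sanity-check the PNOISE result against transient-noise data is to extract the differential phase directly from the two transient waveforms: mark each rising zero crossing of one oscillator as a 2*pi phase increment, interpolate the other oscillator's phase at those instants, and subtract, so noise common to both branches cancels. A minimal NumPy sketch (helper names are my own; assumes sampled OUT_p/OUT_n waveforms and a common time vector):

```python
import numpy as np

def zero_cross_times(v, t):
    """Rising zero-crossing times of waveform v(t), via linear interpolation."""
    idx = np.where((v[:-1] < 0) & (v[1:] >= 0))[0]
    frac = -v[idx] / (v[idx + 1] - v[idx])
    return t[idx] + frac * (t[idx + 1] - t[idx])

def diff_phase(v_p, v_n, t):
    """Differential phase (rad) between two oscillator outputs.

    Each rising crossing is a 2*pi phase increment; OUT_n's unwrapped
    phase is interpolated at OUT_p's crossing instants and subtracted.
    The result is defined up to a constant 2*pi*k offset, which is
    irrelevant when looking at jitter/noise on the phase difference.
    """
    tp = zero_cross_times(v_p, t)
    tn = zero_cross_times(v_n, t)
    # Unwrapped phase of OUT_n at its own crossings: 2*pi*k.
    phi_n_at_tp = np.interp(tp, tn, 2 * np.pi * np.arange(len(tn)))
    phi_p = 2 * np.pi * np.arange(len(tp))
    return phi_p - phi_n_at_tp
```

Running this over many transient-noise seeds and taking the PSD (or variance) of the returned phase difference gives a reference the PNOISE numbers should roughly agree with; a bias-noise contribution that really is common-mode should drop out here.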

I could not find any answer online, and any suggestions would be greatly appreciated.

Thank you
