Eugene
|
I am simulating PLL phase noise with a Verilog-A voltage-domain PLL model. (The VCO in a voltage-domain PLL outputs an oscillatory voltage, whereas a phase-domain VCO model outputs a ramp; in steady state, the phase-domain VCO model outputs a DC voltage.) The Verilog-A model behaves differently for different analyses: I use a phase-domain model for DC analysis to precharge the loop filter, so I don't have to simulate long start-up transients. I switch between the phase- and voltage-domain models with an "if" statement that depends on the type of analysis being performed (DC or transient).

For DC analysis, the VCO model generates an output voltage numerically equal to the VCO frequency in MHz, which is around 3700. For transient analysis, the model outputs a much smaller voltage, about ±1 V, representing the true oscillatory output of the VCO.

The 3700 V output during the DC analysis seems to set some sort of dynamic numerical tolerance, such that the subsequent transient analysis does not see much of the VCO phase noise. The resulting power spectral density is 20-30 dB too low. If I decrease reltol from 1e-3 to 1e-6, the simulation sees the noise and produces the correct power spectral density.
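The analysis-dependent output stage I'm describing looks roughly like this (a simplified sketch, not my actual module; the parameter names and values are illustrative):

```verilog
// Simplified sketch of an analysis-switched VCO output stage.
// Parameter names/values are illustrative, not from the real model.
`include "constants.vams"
`include "disciplines.vams"

module vco_sketch(vctl, out);
  input vctl;
  output out;
  electrical vctl, out;

  parameter real f0   = 3.7e9;   // center frequency (Hz)
  parameter real kvco = 100e6;   // VCO gain (Hz/V)

  real freq;

  analog begin
    freq = f0 + kvco*V(vctl);

    if (analysis("static"))
      // DC/operating-point: phase-domain behavior -- output is the
      // frequency expressed in MHz, i.e. roughly 3700 "volts".
      V(out) <+ freq/1e6;
    else
      // Transient: voltage-domain behavior -- true +/-1 V oscillation.
      // idtmod keeps the accumulated phase bounded.
      V(out) <+ sin(2*`M_PI*idtmod(freq, 0.0, 1.0));
  end
endmodule
```

The huge DC value on the same node as the small transient waveform is what appears to skew the simulator's per-node error tolerance.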
Does anyone know of a way to fix this problem from within the Verilog-A VCO module, so I don't have to remember to tighten reltol?
|