Visjnoe
Dear all,
I'm examining a 1.8 GHz CMOS VCO and its phase noise performance. The VCO has discrete frequency tuning by means of a capacitor bank. The capacitor bank is a classical implementation: binary-scaled capacitors, each switched by an NMOS transistor in series.
When I examine the phase noise contributions (at a 100 kHz offset) for a certain capacitor bank setting, I notice that the largest contributions come from the NMOS switches that are in the OFF state (thermal noise).
This is the first time I have seen switch transistors in the OFF state dominating VCO phase noise, and I'm starting to doubt the simulator. For one thing, I don't think thermal noise is well modeled for transistors in the OFF state.
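For what it's worth, here is the first-order hand calculation I used as a sanity check against the simulator's numbers. It models the OFF switch as a large resistance R_off in series with its bank capacitor C, so the thermal noise voltage appearing across the capacitor is the 4kTR density low-pass filtered by that same RC. The component values (R_off, C, temperature) are illustrative guesses, not from my actual design:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed temperature, K

def off_switch_noise_psd(f, r_off, c):
    """Thermal-noise voltage PSD across C from a series R_off, in V^2/Hz.

    First-order model only: the OFF switch is treated as a plain resistor
    r_off in series with the bank capacitor c, so its 4kTR noise is
    low-pass filtered by the same RC before it reaches the tank.
    """
    return 4.0 * k * T * r_off / (1.0 + (2.0 * math.pi * f * r_off * c) ** 2)

# Illustrative values: R_off = 10 Mohm, C = 100 fF, at a 100 kHz offset
psd = off_switch_noise_psd(100e3, 10e6, 100e-15)
print(psd)  # V^2/Hz
```

One detail this model does make clear: well above the RC corner the density falls as 4kT/((2*pi*f)^2 * R_off * C^2), so a *larger* off-resistance actually gives *less* noise at the offset of interest, which is part of why the simulated contribution surprised me.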
Has anyone encountered this before?
Regards
Peter