cmolsen
Junior Member
Offline
Posts: 10
NY
I'm running a transient simulation, using Spectre (MMSIM 7.2), on a circuit of several CMOS inverters in a chained configuration. The input is isolated from the output.
The compact model for the FETs is the Verilog-A PSP 102.3 model, which I have modified to significantly reduce the capacitance contributions. There are no other components in the circuit.
As the length of the inverter chain increases, the total current consumption as well as the output voltage waveform become increasingly unstable.
For example, when I remove all capacitance completely, so that I have a purely resistive circuit, the transient simulation produces the anticipated results for 2 inverters. At 3 inverters the logic levels start to become unrealistic, and at 10 inverters the simulation fails with a convergence problem.
Generally speaking, the trend is: the smaller the capacitance, the smaller the circuit the simulator can handle before it fails to converge.
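For what it's worth, here is a back-of-the-envelope sketch of why I'd expect this trend (the numbers are hypothetical placeholders, not taken from the PSP model): with backward Euler, each node capacitance C contributes a companion conductance C/h to the Jacobian diagonal at timestep h. As C goes to zero, the only conductance left at an off-state inverter output is the tiny channel leakage, so a fixed residual current within the solver's tolerance turns into an enormous voltage error:

```python
# Toy per-node estimate of the voltage error caused by a small residual
# current at each time point. All values are illustrative assumptions.
g_off = 1e-12    # off-state channel conductance at the inverter output (S)
i_noise = 1e-12  # ~1 pA of leftover numerical residual current (A)
h = 1e-12        # transient timestep (s)

for C in (1e-15, 1e-18, 0.0):
    # Backward-Euler companion conductance C/h stabilizes the node;
    # with C = 0 only the leakage conductance remains.
    g_total = g_off + C / h
    dv = i_noise / g_total  # voltage error per unit residual current
    print(f"C = {C:g} F -> G = {g_total:g} S -> dV = {dv:g} V")
```

With C = 1 fF the companion term dominates and the node error stays in the nanovolt range; with C = 0 the same 1 pA residual moves the node by a full volt, which would explain both the unrealistic logic levels and the eventual convergence failure.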
I see the same trend in HSPICE, and I also observe it with other compact FET models.
Is this a simulator problem?
If so, would it make more sense to run Spectre AMS or Spectre APS on a purely (or very nearly) resistive circuit, where I'm mainly interested in determining the logic states and the current consumption in the various [stabilized] logic states?
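Before switching simulators, one thing I may try (this is an assumption on my part; the exact parameter names should be checked against the MMSIM documentation for this version) is Spectre's `cmin` option, which adds a small capacitance from every node to ground and so restores the stabilizing C/h term without touching the model; `gmin` similarly keeps otherwise-floating nodes weakly tied down. Something along the lines of:

```
// Hypothetical options line; parameter values are placeholders to tune:
simOpts options cmin=1f gmin=1e-12
```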
I'm running on a Linux RedHat system.
---Michael Olsen