Hello everyone, I have written a Verilog-A model and want to run a Monte Carlo simulation in Cadence Spectre ADE. Right now the model uses fixed, nominal parameter values.
I also have a set of random parameters whose distributions I know. What I want is for these parameters to stay at their nominal value of 0 during normal simulations in ADE (DC, AC, Tran), and to vary according to their distributions only when I run Monte Carlo. Does anyone know a method to handle this? A minimal sketch of the kind of model I mean is below.
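For illustration, here is a minimal sketch of the situation (the module name res_mc and the parameter names rnom and dr are placeholders I made up, not my actual model):

// Sketch: a resistor whose resistance has a random deviation `dr`.
// `dr` should stay 0 in nominal DC/AC/Tran runs and follow a known
// distribution only during Monte Carlo.
`include "constants.vams"
`include "disciplines.vams"

module res_mc(p, n);
    inout p, n;
    electrical p, n;

    parameter real rnom = 1k;  // nominal resistance
    parameter real dr   = 0;   // random deviation; nominal value is 0

    analog begin
        V(p,n) <+ I(p,n) * (rnom + dr);
    end
endmodule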
I tried the $rdist_normal function, but it makes the parameter take a random value even in normal simulation; roughly what I tried is sketched below. Any idea would help! Thanks!
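My attempt looked roughly like this (a sketch from memory; the seed value, mean, and sigma are just example numbers):

// What I tried: drawing the deviation with $rdist_normal inside the
// analog block. This assigns a random nonzero value in every run,
// so the deviation shows up even in plain DC/AC/Tran simulations.
integer seed;
real dr_val;

analog begin
    @(initial_step) begin
        seed   = 17;                             // placeholder seed
        dr_val = $rdist_normal(seed, 0.0, 10.0); // mean 0, sigma 10 (example)
    end
    V(p,n) <+ I(p,n) * (rnom + dr_val);
end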