I want to perform a transient noise simulation in Cadence (because I want to include non-linear effects), but I keep getting a consistent underestimation of the noise.
This is my test-case:
A vsource from analogLib, connected to a 1M resistor whose noise is disabled. In the vsource I specify a single noise/freq pair: Freq 1 = 1, Noise 1 = 1. This should give me a flat noise PSD of 1 V**2/Hz (a rough sketch of what that should produce in the time domain follows the analysis settings below).
In the tran analysis, I specify:
Stop Time = 0.1m
Noise Fmax = 1G
Noise Update = fmax
Noise contribution = on, Instance list = the vsource component
All the rest default
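For reference, here is a minimal Python/NumPy sketch of my mental model of the setup (not Cadence output): a flat 1 V**2/Hz source band-limited to Fmax, assuming the simulator effectively injects white Gaussian samples at the Nyquist rate 2*Fmax. The variable names are my own, not tool settings.

import numpy as np

psd   = 1.0      # flat noise density, V**2/Hz
fmax  = 1e9      # the Noise Fmax setting, Hz
tstop = 0.1e-3   # the tran Stop Time, s

fs    = 2.0 * fmax                   # assumed Nyquist sampling rate for the band
n     = int(tstop * fs)              # number of noise samples in the record
sigma = np.sqrt(psd * fmax)          # expected rms = sqrt(PSD * BW)
noise = np.random.normal(0.0, sigma, n)

print("expected rms :", sigma)
print("generated rms:", np.sqrt(np.mean(noise**2)))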
The simulation runs without warnings or errors.
From the results, I plot the transient waveform at the vsource output and, from it, calculate the noise PSD in V/sqrt(Hz):
sqrt(((rms(tran_out)**2) / VAR("BW")))
where tran_out is the transient waveform and BW is a variable for the bandwidth.
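To check that the back-calculation itself is sound, here is a small Python/NumPy sketch of the same expression applied to ideal synthetic white noise (the array names are placeholders, not Cadence output names):

import numpy as np

def psd_from_rms(v, bw):
    # mirrors sqrt(rms(tran_out)**2 / BW): flat noise density in V/sqrt(Hz)
    rms = np.sqrt(np.mean(np.asarray(v) ** 2))
    return rms / np.sqrt(bw)

bw = 1e9                                                 # bandwidth, here Noise Fmax
v  = np.random.normal(0.0, np.sqrt(1.0 * bw), 200_000)   # ideal 1 V/sqrt(Hz) noise
print(psd_from_rms(v, bw))                               # comes out very close to 1

On ideal white noise this returns 1 as expected, so for this test the formula itself looks fine.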
The result should be 1, as specified in the vsource. However, it is always smaller: no matter which parameter I change, it comes out 5-20% below the expected value.
When I increase the length of the tran simulation, I would expect the extreme values of the transient voltage to grow, because the noise is supposed to be a Gaussian process, so the probability of hitting larger extremes should increase with simulation time. This does not happen, however. I have a feeling that the output signal is somehow limited, which could result in a lower rms value of the noise.
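As a quick sanity check of that intuition: the expected peak of N independent Gaussian samples grows roughly like sigma*sqrt(2*ln(N)), which a few lines of Python confirm:

import numpy as np

sigma = 1.0
for n in (10_000, 100_000, 1_000_000):
    # average the observed peak over 20 trials and compare with the rough estimate
    peaks = [np.max(np.abs(np.random.normal(0.0, sigma, n))) for _ in range(20)]
    print(n, round(np.mean(peaks), 2), round(sigma * np.sqrt(2.0 * np.log(n)), 2))

So a ten-times-longer transient record really should show visibly larger excursions.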
The regular noise analysis does give me the correct result.
Does anyone have any idea what could cause this underestimation of the transient noise?
Thanks a lot!