mohta
Junior Member
Offline
Posts: 13
Thanks for your post Stefan.
I know that simulation time is greatly increased with transient noise. However, transient simulation itself (with transient noise as an "add-on" that modifies operating-point information at each time step) is supposedly vastly more efficient these days. Berkeley Design Automation's Analog FastSpice is promising 5-10X faster transient simulation than Spectre, so transient noise simulation of delta-sigma ADCs may become a practical reality in the near future (even if it is a "start the simulation on Friday night, view results Monday morning" sort of thing).
My question was more directed towards the theoretical validity of this approach: assuming you had the time/resources/tools to do a transient noise simulation, is it a reasonable way to capture thermal and quantization noise simultaneously in a delta-sigma ADC? If nothing else, it could provide additional validation of system-level simulations and hand calculations (e.g. Ken Kundert's paper on device noise modeling in delta-sigma ADCs).
I have another thought/question: just as Monte Carlo analysis has smarter sampling methods (like Latin Hypercube Sampling) to estimate the worst-case spread without running a huge number of Monte Carlo runs, could transient noise analysis give a realistic estimate of the combined noise with fewer or shorter runs? Would running a small number of simulations starting from different seed values (perhaps with some clever sampling technique) be equivalent to running one simulation for a longer time?
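For what it's worth, here is a small signal-processing sketch of the intuition behind that last question (a NumPy toy model of white noise, not a Spectre/AFS feature; the `periodogram` helper is illustrative). Averaging the PSD estimates of N short runs with independent seeds is statistically the same estimator as splitting one N-times-longer run into N segments and averaging (Bartlett averaging), so the two approaches should give a comparable noise-floor variance. What the short runs cannot buy back is spectral resolution below 1/T of a single run, which matters for in-band shaped noise in a delta-sigma modulator:

```python
import numpy as np

def periodogram(x):
    """Two-sided periodogram PSD estimate at unit sampling rate.

    For unit-variance white noise the expected value is 1.0 in every bin.
    """
    n = len(x)
    X = np.fft.rfft(x)
    return (np.abs(X) ** 2) / n

N_SAMPLES = 4096   # samples per (short) run
N_RUNS = 16        # number of independent "seeds"

# Case 1: sixteen short runs with independent seeds, PSDs averaged.
short_psds = []
for seed in range(N_RUNS):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(N_SAMPLES)      # unit-variance white noise
    short_psds.append(periodogram(x))
avg_psd = np.mean(short_psds, axis=0)

# Case 2: one run sixteen times longer, chopped into the same segments
# and averaged (Bartlett's method).
rng = np.random.default_rng(12345)
y = rng.standard_normal(N_SAMPLES * N_RUNS)
segments = y.reshape(N_RUNS, N_SAMPLES)
long_psd = np.mean([periodogram(s) for s in segments], axis=0)

# Both estimators converge to the true PSD (1.0) with similar spread.
print("seeded runs:  mean =", np.mean(avg_psd), " std =", np.std(avg_psd))
print("one long run: mean =", np.mean(long_psd), " std =", np.std(long_psd))
```

Of course a real modulator's noise is shaped rather than white, and a short run may not even reach steady state, so this only addresses the variance of the spectral estimate, not whether each short run is long enough to be representative.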
Thanks, mohta