Misha
New Member
Posts: 6
Magnetic model convergence problem
Feb 19th, 2004, 9:25am
Hi, I guess this is for Ken Kundert mostly.
First, I want to say thanks for posting so much useful material, especially the paper on magnetics modeling. I had found the same Verilog-A code in Cadence's ahdlLib, but the paper really helped me figure out what I need to do to model my device.
However, I'm having convergence problems in Spectre with this Verilog-A magnetic core model when I apply fast-changing signals to the windings. In most cases Spectre bails out with the message "no convergence at min time step". The offending nodes are usually the Hdot nodes: either MMF(Hdot) exceeds the blow-up limit, or the change from one Newton iteration to the next is larger than abstol + reltol permits. This seems hard to avoid, since this internal node stores the derivative of H, which can be very large.
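To give a feel for the magnitudes involved, here is a back-of-envelope calculation (the numbers are invented for illustration, not taken from my actual testbench):

```python
# Rough scale of the Hdot (dH/dt) internal node during a fast edge.
# delta_H and t_rise are assumed values, just to show the orders of
# magnitude; the default abstol/reltol below are likewise illustrative.

delta_H = 100.0   # field swing in A/m (assumed)
t_rise  = 1e-9    # edge time in seconds (assumed)

h_dot = delta_H / t_rise   # dH/dt ~ 1e11 A/(m*s)

# Spectre's per-node convergence check is roughly
#   |change between iterations| < abstol + reltol * |value|
# With an abstol sized for ordinary node values (say 1e-12), the
# absolute term is ~23 orders of magnitude below h_dot, so the solver
# must resolve Hdot essentially to its relative tolerance at every step.
abstol = 1e-12
reltol = 1e-3
tolerance = abstol + reltol * abs(h_dot)

print(h_dot)       # ~1e11
print(tolerance)   # ~1e8, dominated entirely by the reltol term
```

So even a modest field swing over a nanosecond makes the derivative node enormous compared with anything the default absolute tolerance was sized for, which seems consistent with the Hdot nodes being the ones flagged.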
Am I missing something or is there a work-around for it?
Also, how do I set the initial magnetization state of the core at the beginning of the simulation?
Thanks in advance,
Misha Ivanov