nishant22
New Member
Posts: 8
I have to model a variable resistor. The model exists as a schematic at transistor level, and I want to model it at top level while still retaining the functionality.

The intended behaviour is: as long as the output voltage is close to the supply voltage (vdda = 765 mV), the resistance should be small enough that the output is simply whatever my input is. Once the voltage drops below vdda - 0.1 V, the resistance should increase linearly (I do not have a fixed slope for the increase), and the output voltage should vary with that resistance. As the voltage approaches the minimum of 0 V, the resistance should reach around 1G, just to make my output voltage really small.

I tried modelling it as

I(vdda, out) <+ (I(vdda, bias) <= V(vdda, vssa)) ? I(vdda, bias) : I(vdda, out)/(I(vdda, bias)/V(vdda, bias));

but I get convergence issues, so is there a better way to model the resistance? Since my input is the current I(bias), I also define its direction with

I(vdda, bias) <+ V(vdda, bias)/5000;

Can someone help and suggest a better approach? The outputs of this block go as inputs to other blocks, and I would like those blocks to detect the change in input current and change accordingly, so I definitely need to get this block working as I expect. I thought about modelling it as a switch, but that would not retain the transistor-like behaviour the way I would like.

Thanks,
Nishant
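For reference, here is a rough sketch of the kind of behavioural model I have in mind. It is only an illustration, not a verified implementation: the port and parameter names (in, out, vth, rmin, rmax) are placeholders, and the linear interpolation is one possible choice for the resistance law. The main idea is to write the branch as a continuous conductance, I = V/R, instead of dividing by a branch current as in my attempt above, since a discontinuous or current-divided contribution is a common source of convergence problems.

```verilog
// Hypothetical sketch of a behavioural variable resistor.
// R stays at rmin while the control voltage is within vth of vdda,
// then rises linearly toward rmax (~1G) as the voltage approaches 0 V.
`include "constants.vams"
`include "disciplines.vams"

module var_res(vdda, vssa, in, out);
  inout vdda, vssa, in, out;
  electrical vdda, vssa, in, out;
  parameter real vth  = 0.1;  // headroom below vdda where R starts rising
  parameter real rmin = 5k;   // on-resistance near the supply (assumption)
  parameter real rmax = 1G;   // resistance as the control voltage nears 0 V
  real vc, r;

  analog begin
    vc = V(out, vssa);  // control voltage
    if (vc >= V(vdda, vssa) - vth)
      r = rmin;
    else
      // linear interpolation between rmin and rmax; continuous at the
      // corner, which helps the Newton solver converge
      r = rmin + (rmax - rmin) * (V(vdda, vssa) - vth - vc)
                               / (V(vdda, vssa) - vth);
    // contribute a conductance rather than dividing by a current
    I(in, out) <+ V(in, out) / r;
  end
endmodule
```

If the remaining corner at vdda - vth still causes trouble, a smooth blending function (e.g. tanh-based) between rmin and rmax could replace the if/else.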