AnalogDE
I'm working on the ZQ calibration circuit. If I'm reading the spec correctly, the circuit needs to be able to sense a 0.5% change in output driver impedance. Translated into a delta-V, I calculate this to be 1.8 mV at the input of the comparator. Does this mean I need to design the comparator to have an offset voltage ≤ 1.8 mV (say, at 3-sigma), or does it need to be less, say half that?
If it's half that, then I need to burn 4x the power and spend 4x the area on the comparator. I'm thinking it's OK to design it for ≤ 1.8 mV at 3-sigma... is my thinking wrong here?
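For reference, here's the arithmetic behind the 4x claim as I understand it, sketched in Python. It assumes Pelgrom-style mismatch scaling (sigma_Vos ∝ A_VT/√(W·L)) and that power tracks input-pair area to first order; both are assumptions about the design, not anything from the spec.

```python
# Offset-budget arithmetic under an assumed Pelgrom mismatch model:
# sigma_Vos ~ A_VT / sqrt(W*L), so required area scales as 1/sigma^2.

delta_v = 1.8e-3                  # input delta-V for a 0.5% impedance step

# Option A: spend the full 1.8 mV budget at 3-sigma
sigma_full = delta_v / 3          # required sigma = 0.6 mV

# Option B: leave margin by designing to half the budget at 3-sigma
sigma_half = (delta_v / 2) / 3    # required sigma = 0.3 mV

# Halving sigma requires 4x the area; power tracks area if the input
# pair is scaled proportionally (first-order assumption).
area_ratio = (sigma_full / sigma_half) ** 2
print(area_ratio)                 # -> 4.0
```

So halving the offset target does land on the 4x area/power penalty, under those scaling assumptions.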