The Designer's Guide Community Forum
SAR ADC input stage and speed limitations
aaron_do
Feb 27th, 2014, 5:24pm
 
Hi all,


I'm working on a high-speed SAR ADC, and in order to minimize power consumption, I'm not using an input buffer. Regarding the input interface, I have a couple of questions.

1) Do we normally provide a 50-ohm interface, for example with a shunt resistor? Either way, the load presented to the source will be changing as the sampling switches open and close, so will this cause problems?

2) The input interface looks like a series RLC resonator, with the bondwire as the inductor, the DAC capacitance as the capacitor, and of course the source plus the switches as the resistor. My question is, do we normally de-Q the input? If I set the network Q to 0.5, I avoid ringing at the input.
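
To put numbers on that, here is a quick back-of-envelope sketch in Python (entirely my own rough numbers, using the 1 nH bondwire and 0.5 pF DAC capacitance I assume below): for a series RLC, Q = (1/R)*sqrt(L/C), so hitting Q = 0.5 means the total series resistance (source plus switches plus any added de-Q resistor) has to be 2*sqrt(L/C).

# Back-of-envelope: series resistance needed to de-Q the input network
# to Q = 0.5 (critical damping). L and C are assumed example values.
import math

L = 1e-9       # bondwire inductance, assumed 1 nH
C = 0.5e-12    # DAC sampling capacitance, assumed 0.5 pF

Z0 = math.sqrt(L / C)        # characteristic impedance, ~44.7 ohm
R_total = 2.0 * Z0           # total series R for Q = 0.5, ~89.4 ohm

print(f"sqrt(L/C)     = {Z0:.1f} ohm")
print(f"R for Q = 0.5 = {R_total:.1f} ohm")

So for these values the source impedance, switch on-resistance, and any explicit resistor together would need to come to roughly 90 ohms.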

The second part of my question is related to speed. After doing some simulations, I find that the input interface is ultimately what limits the speed of the ADC. For instance, for a 10-bit ADC I need around 8τ for the input to settle to within 1/2 LSB. My DAC capacitance is set by the noise and matching requirements; the matching requirement can be loosened somewhat through calibration, but the noise requirement cannot. Suppose I have 1 nH of bondwire inductance and a 0.5 pF capacitor, and I de-Q the network to Q = 0.5: the settling time then depends entirely on sqrt(LC).
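
To show where that limit comes from, here is the settling-time side of the same sketch (again with my assumed 1 nH, 0.5 pF, 10-bit, Q = 0.5 numbers): with Q = 0.5 both poles sit at 1/sqrt(LC), so the effective time constant is tau = sqrt(LC), and the step-response error of the critically damped network is (1 + t/tau)*exp(-t/tau).

# Back-of-envelope settling time for the critically damped (Q = 0.5)
# input network, settling to within 1/2 LSB at 10 bits. L and C are
# the same assumed example values as above.
import math

L, C = 1e-9, 0.5e-12
N = 10
half_lsb = 2.0 ** -(N + 1)        # 1/2 LSB of full scale

tau = math.sqrt(L * C)            # ~22.4 ps for these values

# first-order rule of thumb quoted above: ~ln(2^(N+1)) time constants
t_rc = math.log(1.0 / half_lsb) * tau

# critically damped step error is (1 + t/tau)*exp(-t/tau); solve for
# the time where it drops to 1/2 LSB by bisection
lo, hi = 1.0, 50.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if (1.0 + mid) * math.exp(-mid) > half_lsb:
        lo = mid
    else:
        hi = mid
t_settle = hi * tau

print(f"tau = sqrt(LC)       = {tau * 1e12:.1f} ps")
print(f"first-order ~8*tau   = {t_rc * 1e12:.0f} ps")
print(f"critically damped    = {t_settle * 1e12:.0f} ps ({hi:.1f} tau)")

That works out to roughly 170-220 ps just for acquisition, before any of the SAR bit cycles are budgeted, which is why the input interface ends up being the bottleneck in my simulations.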

So the question is, is there a typical way to get around this? I see that time interleaving is quite common, and pipelining might help somewhat.


thanks,
Aaron
 
 

there is no energy in matter other than that received from the environment - Nikola Tesla