aaron_do wrote on Jan 6th, 2014, 2:07am: Hi all,
taking a pipelined ADC as a reference, but really this question is more general, can we always trade resolution for speed? What I mean is, suppose I need a 12b 100 MS/s ADC, can I design a 10b 400 MS/s ADC and then average out samples in the digital domain?
My doubt is that my 10b 400 MS/s ADC is only 10b accurate, but I guess that shouldn't be an issue since sigma-delta ADCs work perfectly fine...
The second question is, if this kind of oversampling is possible, why do people still design 12b 100 MS/s ADCs? For a pipelined ADC, I believe you run into a lot of problems above about 10b resolution.
All opinions are welcome.
thanks,
Aaron
Hi Aaron,
Yes, in general you may directly trade resolution for speed in any Nyquist-rate ADC. It is done very often in industry to relax the requirements on anti-alias filtering.
However, what averaging buys back is resolution, not accuracy, so be careful, and it buys less than you might hope. Averaging white quantization noise improves SNR by 10·log10(OSR), i.e. about half a bit per doubling of the sample rate. Decimating from 400 MS/s to 100 MS/s is an OSR of 4, which gives you 6 dB, roughly one extra bit, so a 10b ADC gets you to about an 11-bit quantization noise level; to reach a 12-bit level you would need an OSR of 16 (1.6 GS/s). And like you said, the static accuracy doesn't improve at all: if you had a true 10-bit ADC (+/- 0.5 LSB INL/DNL), then referred to a 12-bit LSB that same error is +/- 2.0 LSB INL/DNL.
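Here's a quick numerical sketch of that trade-off. The signal, dither level, and 4-sample boxcar decimator below are my own assumptions (a real design would use a proper decimation filter, and the dither just keeps the quantization error approximately white); the point is only that averaging groups of 4 samples cuts the RMS quantization error by about 2x, i.e. one bit, not two:

```python
import numpy as np

rng = np.random.default_rng(42)

def quantize(x, bits):
    """Ideal uniform mid-tread quantizer over [-1, 1]."""
    lsb = 2.0 / (2 ** bits)
    return np.clip(np.round(x / lsb) * lsb, -1.0, 1.0)

osr = 4                      # 400 MS/s -> 100 MS/s
n_out = 50_000
t = np.arange(n_out * osr)

# Slow near-full-scale sine, well below the decimated Nyquist rate.
sig = 0.9 * np.sin(2 * np.pi * 1e-3 * t)

# Roughly 1 LSB of dither so the quantization error stays uncorrelated
# from sample to sample (otherwise averaging gains less).
lsb10 = 2.0 / 2 ** 10
dither = rng.uniform(-lsb10 / 2, lsb10 / 2, t.size) \
       + rng.uniform(-lsb10 / 2, lsb10 / 2, t.size)
q = quantize(sig + dither, bits=10)

# Decimate by averaging each group of 4 consecutive samples
# (a crude sinc filter, used here only for illustration).
avg = q.reshape(n_out, osr).mean(axis=1)
ref = sig.reshape(n_out, osr).mean(axis=1)   # ideal decimated signal

err_before = np.std(q - sig)     # RMS error at 400 MS/s
err_after = np.std(avg - ref)    # RMS error after decimation to 100 MS/s
print(err_before / err_after)    # ratio ~ 2, i.e. ~6 dB, ~1 extra bit
```

The 2x error ratio is the whole story: each factor-of-4 oversampling only halves the noise amplitude, which is why getting two extra bits this way needs 16x oversampling.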
Also, in some technologies it is much easier to design for more accuracy than for more speed (once you hit the technology's speed limit).
The sigma-delta ADC has really high linearity because it uses a low-bit (usually one-bit) quantizer that is inherently easy to make linear. Getting 10-bit raw linearity is hard, as you know.
People design 12b ADCs because they need the linearity. Your technique would improve the SNR due to quantization error, but it wouldn't do anything to improve the SFDR because the linearity of the ADC wouldn't change.
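You can see why averaging doesn't help SFDR with the same kind of sketch. The cubic "INL bow" below is a made-up stand-in for a real ADC's static nonlinearity: because the distortion is correlated with the signal, adjacent oversampled samples carry nearly the same distortion, and the 4-sample average passes it straight through while the white quantization noise averages down:

```python
import numpy as np

osr = 4
t = np.arange(200_000)
sig = 0.9 * np.sin(2 * np.pi * 1e-3 * t)

# Hypothetical static nonlinearity: a mild cubic bow in the transfer
# curve (the kind of INL error that sets SFDR via the third harmonic).
adc_out = sig + 0.01 * sig ** 3

# Decimate both paths by 4-sample averaging, as before.
avg_out = adc_out.reshape(-1, osr).mean(axis=1)
avg_sig = sig.reshape(-1, osr).mean(axis=1)

dist_before = np.std(adc_out - sig)       # distortion at 400 MS/s
dist_after = np.std(avg_out - avg_sig)    # distortion after decimation
print(dist_after / dist_before)           # ratio ~ 1: distortion survives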
It's a good idea, though, and is used in practice to reduce quantization and thermal noise limitations.