The Designer's Guide Community Forum
https://designers-guide.org/forum/YaBB.pl
https://designers-guide.org/forum/YaBB.pl?num=1389002863

Message started by aaron_do on Jan 6th, 2014, 2:07am

Title: Oversampling to improve resolution
Post by aaron_do on Jan 6th, 2014, 2:07am

Hi all,


Taking a pipelined ADC as a reference (though the question is really more general), can we always trade resolution for speed? What I mean is, suppose I need a 12-bit 100 MS/s ADC; can I instead design a 10-bit 400 MS/s ADC and then average the samples in the digital domain?

My concern is that the 10-bit 400 MS/s ADC is only 10-bit accurate, but I guess that shouldn't be an issue, since sigma-delta ADCs work perfectly fine that way...

The second question is: if this kind of oversampling is possible, why do people still design 12-bit 100 MS/s ADCs? For a pipelined ADC, I believe you run into a lot of problems above roughly 10-bit resolution.

All opinions are welcome.


thanks,
Aaron

Title: Re: Oversampling to improve resolution
Post by carlgrace on Jan 6th, 2014, 9:46am




Hi Aaron,

Yes, in general you may directly trade resolution for speed in any Nyquist-rate ADC.  It is done very often in industry to relax the requirements on anti-alias filtering.

However, you're trading resolution for speed, not accuracy for speed, so be careful. Averaging uncorrelated quantization noise buys you 3 dB (half a bit) per doubling of the sample rate, so decimating a 10-bit 400 MS/s ADC down to 100 MS/s lowers the quantization noise by 6 dB, to an 11-bit level; reaching a true 12-bit noise level takes 16x oversampling. And, like you said, the static errors don't improve: a true 10-bit ADC (+/- 0.5 LSB INL/DNL) would still have +/- 2.0 LSB INL/DNL referred to the 12-bit LSB.
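
To put rough numbers on that, here is a minimal numpy sketch (all values are illustrative assumptions, not from any real part): an ideal 10-bit quantizer sampling at 400 MS/s, boxcar-averaged 4:1 down to 100 MS/s, gains about 6 dB of SNR, i.e. roughly one extra bit:

Code:
import numpy as np

fs = 400e6                # oversampled rate
fin = 9.7e6               # test tone, non-coherent with fs/4
n = 1 << 16
t = np.arange(n) / fs
clean = np.sin(2 * np.pi * fin * t)

# A little input-referred noise so the quantization error decorrelates
# from sample to sample (an assumption; real ADCs have thermal noise anyway).
x = clean + 2e-4 * np.random.randn(n)

def quantize(sig, bits):
    # ideal mid-tread quantizer over a +/-1 V range
    lsb = 2.0 / (1 << bits)
    return np.round(sig / lsb) * lsb

y = quantize(x, 10)

# 4:1 boxcar average and decimate: 400 MS/s -> 100 MS/s
y4 = y.reshape(-1, 4).mean(axis=1)
c4 = clean.reshape(-1, 4).mean(axis=1)

def snr_db(sig, ref):
    return 10 * np.log10(np.mean(ref ** 2) / np.mean((sig - ref) ** 2))

print("10-bit at 400 MS/s : %.1f dB" % snr_db(y, clean))   # ~62 dB
print("after 4:1 average  : %.1f dB" % snr_db(y4, c4))     # ~68 dB, ~1 extra bit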

Also, in some technologies it is much easier to design for more accuracy than for more speed (once you hit the technology's speed limit).

The sigma-delta ADC has really high linearity because it uses a low-resolution (usually one-bit) quantizer that is easy to make linear. Getting 10-bit raw linearity is hard, as you know.

People design 12b ADCs because they need the linearity.  Your technique would improve the SNR due to quantization error, but it wouldn't do anything to improve the SFDR because the linearity of the ADC wouldn't change.
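
Here is a similar sketch making that point (the cubic term is just an assumed stand-in for static INL, not a model of any particular ADC): the third harmonic relative to the fundamental is essentially unchanged by the 4:1 averaging:

Code:
import numpy as np

fs, fin, n = 400e6, 9.7e6, 1 << 16
t = np.arange(n) / fs

# Made-up static nonlinearity standing in for INL: a mild cubic term.
x = np.sin(2 * np.pi * fin * t)
y = x + 1e-3 * x ** 3     # gives an HD3 of about -72 dBc

def hd3_dbc(sig, fs_sig):
    win = np.hanning(len(sig))
    spec = np.abs(np.fft.rfft(sig * win))
    f = np.fft.rfftfreq(len(sig), 1 / fs_sig)
    fund = spec[np.argmin(np.abs(f - fin))]
    h3 = spec[np.argmin(np.abs(f - 3 * fin))]
    return 20 * np.log10(h3 / fund)

y4 = y.reshape(-1, 4).mean(axis=1)  # decimate to 100 MS/s
print("HD3 at 400 MS/s: %.1f dBc" % hd3_dbc(y, fs))
print("HD3 at 100 MS/s: %.1f dBc" % hd3_dbc(y4, fs / 4))
# Both come out around -72 dBc: averaging lowered the noise, not the distortion.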

It's a good idea, though, and is used in practice to reduce quantization and thermal noise limitations.

Title: Re: Oversampling to improve resolution
Post by weber8722 on Feb 27th, 2015, 10:34am

Hi,

One further comment: using an ideal ADC of, e.g., 10 bits at a higher fclk does NOT automatically give a 12-bit solution, no matter how high fclk goes.
The simplest case is a fixed DC input: here the 10-bit ADC always puts out the SAME 10-bit code, so no amount of averaging can ever get you to, e.g., 12 bits.
Only in the presence of noise (or deliberate dither) on the order of an LSB or more can you actually gain resolution by oversampling.
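
A quick numpy illustration of this (the input value and noise level are just assumptions for the demo):

Code:
import numpy as np

lsb = 2.0 / 1024          # 10-bit LSB over a +/-1 V range
vin = 0.12345             # DC input sitting between two codes

def adc10(v):
    return np.round(v / lsb) * lsb   # ideal 10-bit quantizer

# Noise-free: every sample is the identical code, so averaging
# 4096 of them still gives that code, not the true input.
print(np.mean(adc10(np.full(4096, vin))))    # 0.123047, stuck at one code

# With about 1 LSB rms of noise the output toggles between codes,
# and the average converges toward the true value.
print(np.mean(adc10(vin + lsb * np.random.randn(4096))))  # ~0.1234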

Bye Stephan
