The Designer's Guide Community
Forum
Oversampling to improve resolution (Read 3813 times)
aaron_do
Senior Fellow
Posts: 1398

Oversampling to improve resolution
Jan 06th, 2014, 2:07am
 
Hi all,


taking a pipelined ADC as a reference, but really this question is more general: can we always trade resolution for speed? What I mean is, suppose I need a 12b 100 MS/s ADC; can I design a 10b 400 MS/s ADC and then average out samples in the digital domain?

My doubt is that my 10b 400 MS/s ADC is only 10b accurate, but I guess that shouldn't be an issue, since sigma-delta ADCs work perfectly fine with coarse quantizers...

The second question is: if this kind of oversampling is possible, why do people still design 12b 100 MS/s ADCs? For a pipelined ADC, I believe you run into a lot of problems above roughly 10b of resolution.

All opinions are welcome.


thanks,
Aaron

there is no energy in matter other than that received from the environment - Nikola Tesla
carlgrace
Senior Member
Posts: 231
Berkeley, CA
Re: Oversampling to improve resolution
Reply #1 - Jan 6th, 2014, 9:46am
 
aaron_do wrote on Jan 6th, 2014, 2:07am: [full question quoted above]

Hi Aaron,

Yes, in general you may directly trade resolution for speed in any Nyquist-rate ADC.  It is done very often in industry to relax the requirements on anti-alias filtering.

However, you're trading resolution for speed, not accuracy for speed, so be careful.  Each doubling of the sample rate lowers the in-band quantization noise by 3 dB, i.e. half a bit, so decimating a 10b ADC from 400 MS/s to 100 MS/s buys you about one extra bit; reaching a true 12-bit quantization-noise level would take 16x oversampling (1.6 GS/s).  And like you said, the static errors don't shrink: a true 10-bit ADC (+/- 0.5 LSB INL/DNL) would still show +/- 2.0 LSB INL/DNL when measured at the 12-bit level.

Also, in some technologies it is much easier to design more accuracy than more speed (once you hit the technology's limit).

The sigma-delta ADC has really high linearity because it uses a low-bit (usually one-bit) quantizer that is easy to make linear.  Getting 10-bit raw linearity is hard, as you know.

People design 12b ADCs because they need the linearity.  Your technique would improve the SNR due to quantization error, but it wouldn't do anything to improve the SFDR because the linearity of the ADC wouldn't change.

It's a good idea, though, and is used in practice to reduce quantization and thermal noise limitations.
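To put a rough number on the quantization-noise part, here is a small self-contained Python sketch (the 10-bit mid-tread quantizer, the test-tone frequency, and the 16x block-averaging decimator are all illustrative assumptions, not anyone's actual design). With the error roughly white, averaging 16 samples should cut the RMS quantization error by about 4x, i.e. close to 12 dB or two bits:

```python
import math

LSB = 2.0 / (1 << 10)  # 10-bit quantizer over [-1, 1)

def quantize(v):
    """Ideal mid-tread 10-bit quantizer (no INL/DNL modeled)."""
    return round(v / LSB) * LSB

osr = 16          # 400 MS/s -> 100 MS/s would be osr = 4; 16x targets ~2 extra bits
n = 1 << 14
cycles = 317.77   # non-integer cycle count helps decorrelate the quantization error
x = [0.9 * math.sin(2 * math.pi * cycles * i / n) for i in range(n)]
y = [quantize(v) for v in x]

def block_avg(seq, m):
    """Crude decimator: average non-overlapping blocks of m samples."""
    return [sum(seq[i:i + m]) / m for i in range(0, len(seq), m)]

x_dec = block_avg(x, osr)
y_dec = block_avg(y, osr)

def rms(err):
    return math.sqrt(sum(e * e for e in err) / len(err))

e_raw = rms([a - b for a, b in zip(y, x)])
e_dec = rms([a - b for a, b in zip(y_dec, x_dec)])
gain_db = 20 * math.log10(e_raw / e_dec)
print(f"16x averaging reduces RMS quantization error by {gain_db:.1f} dB")
```

The gain lands near (a bit under) the ideal 12 dB because the quantization error of a busy sine is only approximately white; near the signal peaks it is correlated from sample to sample. Note this sketch models only quantization noise, so it says nothing about the INL/DNL/SFDR limitation discussed above.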
weber8722
Community Member
Posts: 95

Re: Oversampling to improve resolution
Reply #2 - Feb 27th, 2015, 10:34am
 
Hi,

One further comment: even an ideal 10-bit ADC run at an arbitrarily high fclk does NOT automatically give a 12-bit result.
The simplest case is a fixed DC input: here the 10-bit ADC always puts out the SAME 10-bit code, so no amount of averaging can recover the missing 2 bits.
Oversampling only improves resolution in the presence of noise (or deliberate dither) on the order of an LSB or more, so that the output codes toggle and their average carries sub-LSB information.
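This is easy to demonstrate numerically. A small Python sketch (the 10-bit range, the DC value, and the 1-LSB-rms Gaussian dither level are illustrative assumptions): without dither the average is stuck at one code, while with dither the average converges toward the true sub-LSB value.

```python
import math
import random

random.seed(1)
LSB = 2.0 / (1 << 10)  # 10-bit quantizer over [-1, 1)

def quantize(v):
    """Ideal mid-tread 10-bit quantizer."""
    return round(v / LSB) * LSB

vin = 0.100003  # DC input that falls between two 10-bit codes
n = 4096

# No dither: every conversion returns the same code, so averaging gains nothing.
avg_clean = sum(quantize(vin) for _ in range(n)) / n
err_clean = abs(avg_clean - vin)

# With ~1 LSB rms of Gaussian dither, the codes toggle and the mean converges.
avg_dith = sum(quantize(vin + random.gauss(0.0, LSB)) for _ in range(n)) / n
err_dith = abs(avg_dith - vin)

print(f"error without dither: {err_clean:.6f}  (stuck at one code)")
print(f"error with dither:    {err_dith:.6f}")
```

With these settings the dithered average is far closer to vin than the dither-free one, which illustrates the point: the noise is what makes the extra resolution recoverable.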

Bye Stephan
Copyright 2002-2024 Designer’s Guide Consulting, Inc. Designer’s Guide® is a registered trademark of Designer’s Guide Consulting, Inc. All rights reserved.