daniB wrote on Mar 3rd, 2010, 11:33am:Hi,
I have an input with the following specs:
Input Range 0-5V
To be measured with 10-bit resolution (precision) on an ADC that has VDD=3.3V
Does anyone know a circuit that adjusts the 5V range to the 3.3V range without the need of 0.01% resistors or manually trimming the input gain?
Many thanks
Daniel
Hi,
What is the input range of the ADC. It is probably not 0V-3.3V. You would need to adjust to that level. Most commercial ADCs come with built-in programmable gain/attenuation upfront, which may allow you to adjust the signal range. Of course, it is doubtful if the on-chip resistors used for the internal gain settings are matched to 0.01% as you wish. That would most likely not be the case.
Some ADCs are designed to accommodate a signal range larger than the supply. It all depends on the part you are using.
Otherwise, if you really need to build your own attenuator up front, you will end up manually trimming the input gain if you truly need that precision in the gain as well. I would, however, be a little surprised if you were to tell me that the gain of your 10-bit ADC is controlled to 0.01% precision.
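To put a number on that last point, here is a quick sanity check (my own illustration, not from the original posts; the 1.7k/3.3k values are just an example divider giving a 0.66 ratio). One 10-bit LSB is 1/1024, about 0.098% of full scale, so even ordinary 0.1% resistors keep the worst-case divider gain error below one LSB, and 0.01% parts are roughly ten times tighter than the converter can resolve:

```python
# Resistor divider scaling 0-5V down to 0-3.3V:
#   gain = R2 / (R1 + R2) = 3.3 / 5.0 = 0.66
# Example values (only the ratio matters):
R1 = 1.7e3  # top resistor, ohms
R2 = 3.3e3  # bottom resistor, ohms

def gain(r1, r2):
    return r2 / (r1 + r2)

nominal_gain = gain(R1, R2)      # 0.66
lsb_fraction = 1 / 2**10         # one 10-bit LSB as a fraction of full scale

# Worst-case relative gain error with +/- tol resistors,
# checking all four sign combinations.
tol = 0.001  # 0.1% parts
worst = max(
    abs(gain(R1 * (1 + s1 * tol), R2 * (1 + s2 * tol)) - nominal_gain)
    / nominal_gain
    for s1 in (-1, 1)
    for s2 in (-1, 1)
)

print(f"nominal gain: {nominal_gain:.4f}")
print(f"1 LSB at 10 bits: {lsb_fraction:.4%} of full scale")
print(f"worst-case gain error with 0.1% resistors: {worst:.4%}")
```

The worst case comes out around 0.07%, i.e. below one 10-bit LSB, which is why 0.01% matching is overkill here.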
Regards,
Vivek