The Designer's Guide Community Forum
https://designers-guide.org/forum/YaBB.pl
Design >> Mixed-Signal Design >> How to reduce the VDD/GND bounce and its effect in ADC design?
https://designers-guide.org/forum/YaBB.pl?num=1216022040

Message started by roland on Jul 14th, 2008, 12:54am

Title: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by roland on Jul 14th, 2008, 12:54am

In a high-speed pipelined ADC design, the large number of switches causes a large peak current at the clock edge (over 50mA), which leads to severe VDD/GND bounce once the package inductance is considered. Although the supply is heavily decoupled, the bounce still seems serious. I've tried adding some series resistance in the supply path to damp the ringing. It has some effect, but causes IR drop as well. How can I solve this problem?
Another question: should the opamps and the analog switches share the same supply path? If they share it, the ringing caused by the switches may affect the opamp's differential output if the PSRR is not very high. If they don't, the ringing on the switch supply will not be synchronized with the opamp, and it will look like noise. How should I arrange the supplies?
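For a sense of scale, here is a back-of-the-envelope V = L*di/dt check (Python; only the ~50mA peak is from my measurement, the package inductance and edge time are assumed typical values):

```python
# Back-of-the-envelope ground bounce, V = L * di/dt. Only the ~50 mA peak
# is from the measurement above; the package inductance and the edge time
# are assumed typical values.
L_pkg = 2e-9      # package + bondwire inductance [H] (assumed)
di    = 50e-3     # peak switching current step [A]
dt    = 100e-12   # current edge time [s] (assumed)

v_bounce = L_pkg * di / dt  # first-order bounce estimate: 1.0 V here
```

With these assumptions the first-order bounce is already on the order of a volt, which matches how serious the ringing looks in simulation.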

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by loose-electron on Jul 15th, 2008, 6:01pm

An important concept here: don't turn the current on and off.

Instead, steer the currents. If you aren't using a current in something, swap it off into a load.

That way, with the exception of capacitively induced spikes, the current load remains fairly flat.

After that, add local on-chip capacitive decoupling and reduce your power and ground impedances.
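A toy numerical picture of the difference (Python; the unit current and clock pattern are arbitrary illustrations):

```python
# Toy picture of "steer, don't switch": on/off switching makes the supply
# deliver the full current step every clock edge; steering moves a constant
# tail current between two branches, so the supply current stays flat.
I_TAIL = 1.0                 # normalized unit current (arbitrary)
clock  = [0, 1, 0, 1, 0, 1]  # clock samples over time (arbitrary pattern)

# On/off switching: supply current follows the clock -> large di/dt
switched = [I_TAIL * c for c in clock]

# Steering: branch A conducts when clock = 1, branch B conducts otherwise;
# the supply always delivers I_TAIL in total
steered = [I_TAIL * c + I_TAIL * (1 - c) for c in clock]

ripple_switched = max(switched) - min(switched)  # full 1.0 swing
ripple_steered  = max(steered) - min(steered)    # 0.0: flat load
```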

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by thechopper on Jul 16th, 2008, 10:23am

Hi,

Loose-electron's suggestion is a good one. Here are a couple more:

The logic directly driving the switches should go to the "clean" supply. Otherwise all the noise coming from the noisy supply might get capacitively coupled into the signal lines the switches are connected to.

The rest of the logic that generates the clock signal for the switches can go to the noisy supply.

Try to use differential structures wherever you can. This maximizes circuit symmetry, so most of the noise on the supply lines is rejected. (One example of this "going differential" philosophy is what loose-electron suggested about steering currents rather than turning them on and off.)

Hope this helps
Regards
Tosei

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by Berti on Jul 16th, 2008, 11:33pm

Hi Tosei,


Quote:
The logic directly driving the switches should go to the "clean supply".


Interesting approach, but I am not sure. You probably just shift the problem, because the last (usually strongest) buffer will couple into the analog (clean) supply, won't it?

Regards

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by roland on Jul 17th, 2008, 6:54am

Hi loose-electron and Tosei,
Thank you for your really good suggestions. As loose-electron suggested, I tried to avoid switching large currents on and off. The peak current now comes mainly from the local logic for the switches (NAND/NOR gates, etc.) and from some bootstrapped switches themselves, which I think will be hard to reduce. So the supply current is composed of a large but flat DC current for the OTAs and a large current spike from the switches.
I've tried splitting the two currents, which makes it much easier to decouple VDD/GND for the OTAs. But it's still hard to keep the VDD/GND for the switches clean, and as Tosei suggested, the "dirty" power for the switches and local logic couples into the signal through parasitic caps. Simulation has confirmed that.
An interesting thing is that increasing the impedance of the power paths for the switches, rather than lowering it, may help to damp the bounce. So, might it be possible to use narrow wires for those "AC" supplies and wide ones for the "DC" supplies?
And how much on-chip decoupling capacitance is normally used for ADCs? I've used almost 2-3nF (MOS caps). Is that too much?
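To put numbers on the ringing, here is a simple series-RLC sketch of the supply loop (Python; the package inductance and loop resistance are assumed values, only the ~2.5nF decoupling cap is from my design):

```python
import math

# Series-RLC view of the supply loop: package inductance L, on-chip
# decoupling C, loop series resistance R. L and R are assumed typical
# values; C is roughly the 2.5 nF of MOS decoupling mentioned above.
L = 2e-9     # package + bondwire inductance [H] (assumed)
C = 2.5e-9   # on-chip decoupling capacitance [F]
R = 0.1      # supply-loop series resistance [ohm] (assumed)

f_res = 1 / (2 * math.pi * math.sqrt(L * C))  # ringing frequency, ~71 MHz here
Z0    = math.sqrt(L / C)                      # characteristic impedance, ~0.9 ohm
Q     = Z0 / R                                # ~9: strongly underdamped ringing
```

With Q near 9 the loop rings for many cycles, which would explain why adding series resistance (raising R toward Z0) damps the bounce even though it costs IR drop.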

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by thechopper on Jul 17th, 2008, 7:12pm


Berti wrote on Jul 16th, 2008, 11:33pm:
Interesting approach, but I am not sure. You probably just shift the problem, because the last (usually strongest) buffer will couple into the analog (clean) supply, won't it?

Regards


Hi Berti,

As usual, there is a trade-off involved. Certainly your clean supply will get slightly dirty when you connect the gates driving the switches to it. The question then is which is worse:
1) Making the analog supply slightly dirtier, though not enough to degrade the OTA through injected noise (assuming its PSRR is not too low), while keeping the sensitive signals driven by the switches away from the (much) dirtier digital supply, OR
2) Living with the signals driven by the switches - now connected to the dirty supply - being corrupted by that noise, but keeping the OTA on a super-clean supply.

In most cases I would lean toward option 1), since it is usually the signal driven by the switches (for example, a low-noise signal coming into a SC filter) that suffers most from the noise. At the same time, the OTA is supposed to be more immune to noise than the incoming signal. Despite the relatively low PSRR the OTA might have, it will always be better off than the unprotected incoming signal, which is totally exposed to capacitive coupling.

Certainly there can be exceptions and eventually option #2 might be better.

Roland:

Using guard rings is another good technique for isolating sensitive analog circuits from noise. Be aware of the role a guard ring plays: it can be used for collecting noise, in which case you should connect it to an already-dirty supply line, or for isolating the noise, in which case it should be connected to a clean supply.

I'm not sure about increasing the impedance of the "AC" power path. I guess that increasing its resistance will help damp the spikes, but at some point it might hurt, and I'm not totally clear in which way.

Hope this helps
Tosei

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by vivkr on Jul 20th, 2008, 11:52pm

Hi,

In addition to all the excellent suggestions presented here, I have a couple to make which I found useful in my design (although we were working more on higher accuracy and not very high speed):

1. Use same supply, but make a star-connection at the pad.

2. Connect some bypass capacitance from VDD-VSS, directly near the blocks which are causing the large noise. So basically, near your switch matrix, you can tie the VDD-VSS with a cap to reduce the swing.

3. I split the VDD-VSS for switches into 2 further components, one which goes to the bulks (the switch itself uses VDD-VSS mainly for biasing the bulks), and the other for the switch driving gates.

4. Make sure that the layout is well made to prevent cross-coupling of these various supply domains. Also try and run signals in as high a metal layer as possible. This will minimize both the parasitic loading on these lines as well as the noise injection into substrate from noisy lines.

5. Shield well. Use a clean reference signal to protect all sensitive analog lines from the substrate and from noisy signals. Run these in higher metals also. Avoid running noisy and sensitive signals parallel to each other.

6. The above measures will cost area and a lot of layout effort, but if you are using a 2nF-3nF bypass cap and still not getting the bounce down, it means you have significant coupling between the noisy and sensitive signals somewhere inside your block. Proper placement and routing will let you work with a lot less on-chip cap (unless you have a huge device and no option of supplementing the on-chip cap with an off-chip one).
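As a minimal sketch of why the star connection in point 1 helps (Python; all resistances and currents below are assumed values, for illustration only):

```python
# Why a star connection at the pad helps: if analog and digital share an
# on-chip supply route, the digital spike current drops voltage across the
# whole shared trace; starring at the pad leaves only the common bondwire
# in the shared path. All values are assumed, for illustration only.
I_dig   = 50e-3  # digital/switch peak current [A] (assumed)
R_trace = 0.5    # shared on-chip trace resistance [ohm] (assumed)
R_bond  = 0.05   # unavoidable common bondwire resistance [ohm] (assumed)

v_shared = I_dig * (R_trace + R_bond)  # bounce seen by analog, shared route
v_star   = I_dig * R_bond              # bounce with a star point at the pad
```

With these numbers the coupling into the analog domain drops by an order of magnitude, simply because the shared impedance shrinks to the bondwire alone.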

Regards
Vivek

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by ci on Aug 25th, 2008, 10:38am

These are great suggestions that may solve your problem.  If you decide to implement shunting guard rings as Tosei suggested, consider the impedance of the shunting path to the biasing supply and the impedance of the biasing supply itself.  It is always good to simulate a frequency characteristic of the shunting path impedance, and compare the result with the frequency spectrum of the switching noise.  This way you can make sure that the guard ring “really” shunts the dominant components of the noise.  Depending on the fabrication technology, you may also consider high-resistance isolation guard rings.  These don’t need to be electrically biased.  A good source to learn about these issues is the book at www.noisecoupling.com
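As a minimal sketch of that check (Python; the shunt path is modeled as a series R-L, and both element values are assumptions):

```python
import math

# Guard-ring shunt path modeled as a series R-L to its biasing supply.
# Sweep frequency and compare |Z| against where the switching noise lives.
# Both element values are assumed, for illustration only.
R_path = 2.0    # metal + contact resistance of the ring tie-off [ohm] (assumed)
L_path = 3e-9   # inductance of the route/bondwire to the supply [H] (assumed)

def shunt_impedance(f_hz):
    """Magnitude of the series R-L shunt impedance at frequency f_hz."""
    return math.hypot(R_path, 2 * math.pi * f_hz * L_path)

# The ring shunts well only where |Z| stays low; at GHz edge rates the
# inductance dominates and the "shunt" stops shunting.
profile = {f: shunt_impedance(f) for f in (1e6, 100e6, 1e9)}
```

If the dominant noise components fall where |Z| is already inductance-limited, the ring is no longer doing its job, which is exactly what the frequency-domain comparison is meant to catch.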

ci

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by jerryzhao on Sep 2nd, 2008, 11:16pm

"A good source to learn about these issues is the book at www.noisecoupling.com"

Hi Ci,
I cannot access www.noisecoupling.com.
Could you share the book's title? :)

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by huber on Sep 11th, 2008, 7:36am

Are you damping your bypass capacitors?  You should include a series R to avoid a high-Q supply network.

http://www.designers-guide.org/Design/bypassing.pdf

The recipe for supply design above is hard to follow in ICs (it is more applicable to board design), because you usually can't get a big enough damping capacitor. But the ideas are useful.
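As a rough sketch of that recipe (Python; L and C are assumed example values, not from any particular design), the damping resistor is chosen near the loop's characteristic impedance:

```python
import math

# Damping a bypass capacitor: a series R near the characteristic impedance
# sqrt(L/C) brings the supply network's Q down toward 1. L and C below are
# assumed example values, not from any particular design.
L = 2e-9    # loop inductance seen by the bypass cap [H] (assumed)
C = 2.5e-9  # bypass capacitance [F] (assumed)

R_damp = math.sqrt(L / C)  # damping resistance target [ohm], ~0.9 ohm here

def q_factor(r_series):
    """Q of the series-RLC supply loop for a given total series resistance."""
    return math.sqrt(L / C) / r_series
```

Note that at sub-ohm levels this series R costs little IR drop, which is why damping the cap branch is usually cheaper than damping the main supply path.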

You're talking about noise on VDD-VDD, right?  Not VDD and VSS moving together?  The latter can be very hard to suppress.

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by huber on Sep 11th, 2008, 7:38am

Whoops, typo.  I meant VDD-VSS, not VDD-VDD.

Title: Re: How to reduce the VDD/GND bounce and its effect in ADC design?
Post by rf-design on Sep 22nd, 2008, 3:21pm

I had a similar issue some years ago. Decoupling with caps did not help, because it shifts the resonant frequency of the supply impedance to lower values but increases the Q. So the noise gets higher and lasts longer.

I found it effective to inject a current, derived from the time derivative of the supply voltage, directly into the bias circuit of some constant-current circuits that were less sensitive to the resulting degradation of the bias PSRR.

The obvious, natural effect is that of a Miller capacitance, but the injection point within the bandgap bias adds a further 90-degree shift. So in effect it provides strong resistive damping without worsening the low-frequency PSRR.
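A toy numerical check of that phase argument (Python; pure sinusoids, illustrative only):

```python
import math

# Toy check of the phase argument: a current proportional to dV/dt leads V
# by 90 degrees (purely capacitive, dissipates nothing); one more 90-degree
# shift puts it in phase with V, so it absorbs real power like a resistor
# and damps the ringing. Pure sinusoids, illustrative only.
N = 1000
t = [2 * math.pi * k / N for k in range(N)]

v       = [math.sin(x) for x in t]                # supply ripple
i_cap   = [math.cos(x) for x in t]                # ~ dV/dt injection
i_shift = [math.cos(x - math.pi / 2) for x in t]  # extra 90-degree shift

def avg_power(volt, curr):
    """Average of v*i over one period: nonzero means real dissipation."""
    return sum(a * b for a, b in zip(volt, curr)) / len(volt)

p_cap   = avg_power(v, i_cap)    # ~0.0: no damping from a plain capacitor
p_shift = avg_power(v, i_shift)  # ~0.5: acts resistive, damps the ringing
```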
