The Designer's Guide Community
Forum
How to implement large delay? (Read 11327 times)
BackerShu
How to implement large delay?
Jan 09th, 2013, 11:11pm
 
Hello all,

What kind of circuit is usually used to implement a large delay range, for instance from 200 ps to 200 ns? (Assume clock frequencies from 10 MHz to 10 GHz are available; their periods of 200 ns down to 200 ps make the target delay two clock cycles.)

The possibilities I am thinking now:
1. Inverters. Assume each inverter has 100 ps of delay; 2000 of them are needed at the maximum delay. That doesn't sound feasible to me.

2. RC delay cells. A very big R and/or C is needed, which has a similar area problem as in 1.

3. Switched-capacitor circuits. A switched-capacitor circuit seems to be a way to implement the large R and/or C here. What kind of switched-capacitor delay circuit do you think would fit this situation well?

4. Others?
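As a back-of-the-envelope check on options 1 and 2 (my own arithmetic, using the 100 ps/inverter figure above and an assumed 1 pF on-chip capacitor):

```python
# Rough feasibility numbers for the inverter-chain and RC options.
# Assumptions: 100 ps per inverter; a single-pole RC whose 50%
# propagation delay is about 0.69*R*C; C limited to ~1 pF on chip.
import math

t_max = 200e-9    # longest required delay (s)

# Option 1: inverter chain
t_inv = 100e-12
n_inverters = math.ceil(t_max / t_inv)
print(n_inverters)            # 2000 stages just for the longest delay

# Option 2: RC delay, t_pd ~= 0.69*R*C for a 50% threshold
C = 1e-12                     # assume 1 pF (already a large on-chip C)
R = t_max / (0.69 * C)
print(R)                      # ~290 kOhm -- impractically large on chip
```

Both numbers confirm the area objection: neither a 2000-stage chain nor a 290 kΩ resistor is attractive in silicon.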


Any comments will be appreciated!
 

raja.cedt
Re: How to implement large delay?
Reply #1 - Jan 9th, 2013, 11:35pm
 
hello,
Please refer to ring-oscillator delay cells in the literature. Tuning such a large range may be tough, and even if you manage to tune it with some architecture, it will be very sensitive to noise (with a few architectures).

Use an inverter with bias-current tuning (a current-steering inverter) to cover part of the range (say the first 25%, starting from 200 ps). Use one more inverter with some capacitive loading at the output to cover the next 25%, and so on. Select the inverter you need through digital means (not so difficult in a DLL-type architecture). Make sure you have enough overlap between the four bands for PVT robustness.
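The banded scheme can be sketched numerically; the four-band log-scale split and the 20% overlap factor below are my own assumptions, not values from the post:

```python
# Sketch of the banded delay scheme: split the 3-decade range
# 200 ps .. 200 ns into log-spaced bands with deliberate overlap,
# then pick a band with a digital control word.

T_MIN, T_MAX = 200e-12, 200e-9
N_BANDS = 4
OVERLAP = 1.2   # each band extends 20% past its nominal edges (assumed)

ratio = (T_MAX / T_MIN) ** (1.0 / N_BANDS)   # ~5.6x coverage per band

def band_edges(i):
    """Nominal coverage of band i, widened for PVT overlap."""
    lo = T_MIN * ratio ** i
    hi = T_MIN * ratio ** (i + 1)
    return lo / OVERLAP, hi * OVERLAP

def select_band(target_delay):
    """Return the lowest band index whose widened range contains the target."""
    for i in range(N_BANDS):
        lo, hi = band_edges(i)
        if lo <= target_delay <= hi:
            return i
    raise ValueError("delay out of range")

print(select_band(300e-12))   # 0 -> fastest band (current-steering inverter)
print(select_band(50e-9))     # 3 -> slowest band (heaviest capacitive load)
```

Each band only has to tune over a ~5.6x range instead of 1000x, which is what makes the per-band noise sensitivity manageable.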

Thanks,
raj.
BackerShu
Re: How to implement large delay?
Reply #2 - Jan 10th, 2013, 2:41pm
 
Thanks Raja,

Yes, you're right: for a large delay, the delay stage is going to be more sensitive to noise.
Further, if the signal that goes through the delay stage is data, more ISI will be introduced as well.

As you hinted in your post, delay cells have different sensitivities to noise, and the ISI they introduce on a data signal may also differ. Can you kindly point me to some references on this?



raja.cedt
Re: How to implement large delay?
Reply #3 - Jan 10th, 2013, 10:42pm
 
Dear Backer,
I think you misunderstood my reply. First, where do you want to use this delay? I thought you were using it in a DLL or some other means of selecting a particular delay; in that case you have sensitivity to noise (very similar to a PLL, where a high KVCO supports a large frequency range but at the same time is noise sensitive).

If you are interested in how an amplifier or some other block in a serial link affects the data ISI, that is another story. It depends on the system's delay vs. frequency (a.k.a. group delay, which is quite different from the delay above): if the delay is more or less constant across the frequency band of interest, the output eye replicates the input eye except for a shift in time, with no change in horizontal opening.

Coming to supply noise: yes, it impacts ISI a lot through delay modulation, but again it depends on the supply-noise profile. Since the absolute delay is not important as far as ISI is concerned, only the ripple in the delay matters. This is the reason why people go for regulator-based delay cells or supply-immune architectures like the Maneatis delay cell.
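As a toy illustration of this point about delay ripple (the 1 ps/mV supply sensitivity and 10 mV ripple below are made-up numbers for illustration):

```python
# The absolute delay does not hurt ISI; only its ripple does.
# Assume a delay cell whose delay shifts 1 ps per mV of supply
# deviation (a made-up sensitivity for illustration).

SENS_S_PER_V = 1e-9      # delay sensitivity: 1 ps/mV = 1e-9 s/V (assumed)
ripple_pk = 10e-3        # +/-10 mV peak supply ripple (assumed)
nominal_delay = 50e-9    # the absolute delay: large, but irrelevant to ISI

jitter_pp = 2 * SENS_S_PER_V * ripple_pk   # 20 ps peak-to-peak modulation
ui = 1 / 2.7e9                             # one UI at 2.7 Gb/s (~370 ps)
print(jitter_pp / ui)    # ~0.054: about 5% of a UI eaten by ripple alone
```

Even a modest 10 mV ripple costs about 5% of the eye at the top data rate, while the 50 ns absolute delay never appears in the result; this is the case for regulated or supply-immune delay cells.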

Let me think about it in detail and get back to you after a while.

Thanks,
raj.
BackerShu

Re: How to implement large delay?
Reply #4 - Jan 13th, 2013, 1:02am
 
Thanks Raja,

The exact problem I have in mind is shown in the attachment.
Assume clock frequencies from 10 MHz to 10 GHz are available. This information could be a real clock signal, or some digital code mapping to this frequency range, whichever you think would be more helpful for implementing a 200 ps to 200 ns (two UIpp) delay stage for DIN. The delay also needs to scale approximately with the input data rate: specifically, at DIN = 10 Mb/s the delay needs to be close to 200 ns, and it keeps decreasing as the data rate increases.
Since the input of the delay stage is a data signal, ISI introduced by the delay stage will degrade the quality of DOUT, especially at high data rates.

Any suggestions on how to implement this delay stage?
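The requested tracking behavior, stated as a formula: the delay should equal roughly two unit intervals at every data rate. A trivial sketch:

```python
# The delay must track the data rate so it always equals ~2 UI.
def required_delay(data_rate_bps):
    """Target delay of two unit intervals for the given NRZ data rate."""
    ui = 1.0 / data_rate_bps
    return 2 * ui

print(required_delay(10e6))   # -> 200 ns at 10 Mb/s
print(required_delay(10e9))   # -> 200 ps at 10 Gb/s
```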
Attachment: Delay.jpg
loose-electron
Re: How to implement large delay?
Reply #5 - Jan 16th, 2013, 8:33pm
 
BackerShu wrote on Jan 13th, 2013, 1:02am: (quoted in full above)

Your problem has such a huge variance in its terms that it is not defined well enough to pin down an efficient solution.

An inefficient solution is available: quantize the signal to be delayed, toss it into a huge memory block, and read it back later.

Or, if it's something like a ring oscillator, limit the tuning range and then divide down. But that does not seem to be what you want. You seem to want a delay pipeline that delays a signal that is analog in nature.

If it is truly a digital signal, why don't you make it binary and stream it through a FIFO buffer with variable delay controls?
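The FIFO suggestion can be sketched as a behavioral model (a `deque` clocked at the line rate; the class name and zero prefill are my own choices):

```python
# Behavioral model of a variable-delay FIFO: sample the binary signal
# at the line rate and read each sample out N cycles later, so the
# delay is N sample periods, set by a digital control word.
from collections import deque

class VariableDelayFifo:
    def __init__(self, delay_cycles):
        # Pre-fill with zeros so the first reads are defined.
        self.buf = deque([0] * delay_cycles, maxlen=delay_cycles)

    def clock(self, bit_in):
        """One clock tick: emit the sample from delay_cycles ticks ago."""
        bit_out = self.buf[0]
        self.buf.append(bit_in)   # maxlen drops the oldest automatically
        return bit_out

fifo = VariableDelayFifo(delay_cycles=3)
stream = [1, 0, 1, 1, 0, 0, 1]
out = [fifo.clock(b) for b in stream]
print(out)   # [0, 0, 0, 1, 0, 1, 1] -- input delayed by 3 samples
```

The catch, of course, is that this requires retiming the data first, which is exactly what a CDR is trying to do in the first place.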

What's the purpose of this magic box?
Jerry Twomey
www.effectiveelectrons.com
Read My Electronic Design Column Here
Contract IC-PCB-System Design - Analog, Mixed Signal, RF & Medical
BackerShu

Re: How to implement large delay?
Reply #6 - Jan 17th, 2013, 10:15pm
 
Thanks loose-electron!

Actually, I am trying to check the design in a paper.
[1] D. Dalton, et al., "A 12.5-Mb/s to 2.7-Gb/s continuous-rate CDR with automatic frequency acquisition and data-rate readback," IEEE Journal of Solid-State Circuits, vol. 40, no. 12, pp. 2713–2725, 2005.

A figure of the CDR block diagram is shown in the attachment (DPLL.jpg). The D/PLL structure proposed in this paper decouples jitter transfer (JTRAN) from jitter tolerance (JTOL) by using a phase shifter in the feed-forward path. This phase shifter is exactly the magic delay stage I am talking about: it needs to provide approximately 2 UIpp of delay at all data rates (from 12.5 Mb/s to 2.7 Gb/s) to achieve good JTOL performance.

The paper didn't say much about the circuit design of this phase shifter; the only description is shown in the attachment (PSH.jpg).
Based on that description, the delay (or RC time constant) of the phase shifter is adjusted according to the data rate.
I am not sure how they control the RC time constant exactly. Even if it can be implemented this way, I still find it difficult to keep the delay range of the delay line at 2 UIpp over the whole data-rate range.

In addition, using differential cells to implement the delay consumes a lot of power (this paper reports 235 mA at 3.3 V). I guess the paper employed differential cells because of the ISI requirement.

Still, I am wondering whether there are other, more power-efficient ways to implement this phase shifter, either in an analog CDR structure like this paper's, or in a digital CDR structure (i.e., the phase detector replaced by a bang-bang phase detector and the loop filter by an accumulator). Yes, the input is still NRZ data, which is analog in nature.
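For what it's worth, one generic way to make an RC time constant track the data rate (this is NOT the circuit from the paper, and all component values below are assumptions) is a fixed R with a binary-weighted capacitor bank coded for 2 UI:

```python
# Generic rate-tracking RC: fixed R, binary-weighted capacitor DAC,
# coded so that the 50% delay t_pd ~= 0.69*R*C equals 2 UI.
# R, C_UNIT, and N_BITS are illustrative assumptions only.

R = 50e3          # fixed resistor (assumed)
C_UNIT = 10e-15   # LSB capacitor, 10 fF (assumed)
N_BITS = 12       # DAC resolution (assumed)

def cap_code(data_rate_bps):
    """Capacitor-DAC code giving a delay of about 2 UI at this rate."""
    target = 2.0 / data_rate_bps           # 2 UI in seconds
    c_needed = target / (0.69 * R)         # invert t_pd ~= 0.69*R*C
    code = round(c_needed / C_UNIT)
    return min(code, 2 ** N_BITS - 1)      # clamp to the DAC range

print(cap_code(12.5e6))   # 464 -> ~4.6 pF at the lowest rate
print(cap_code(2.7e9))    # 2   -> ~20 fF at the highest rate
```

The ~230x code spread shows why a single RC stage over this range is hard: the low-rate end needs several pF, while the high-rate end is down at the parasitic level, which supports the banded approaches discussed earlier in the thread.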

Any suggestions?
(I just found out that I can only post one attachment at a time, so the description of the phase shifter (the magic delay stage) from the paper (PSH.jpg) is posted next. Sorry for the inconvenience.)


Attachment: DPLL.jpg
BackerShu

Re: How to implement large delay?
Reply #7 - Jan 17th, 2013, 10:16pm
 
Here is the circuit design description in this paper, no schematic is shown in the paper.
Attachment: PSH.jpg
loose-electron
Re: How to implement large delay?
Reply #8 - Feb 25th, 2013, 2:31pm
 
There are many ways to implement delay, but they all go back to RC methods.

If you try to use ground-referenced methods and push the individual delays up, you will probably end up with an analog-type delay that introduces a lot of phase jitter.

I would stick with the differential-type structures used in PLL/DLL systems, but that's just an opinion.
Copyright 2002-2024 Designer’s Guide Consulting, Inc. Designer’s Guide® is a registered trademark of Designer’s Guide Consulting, Inc. All rights reserved. Send comments or questions to editor@designers-guide.org. Consider submitting a paper or model.