pkd (Junior Member, Posts: 25, India)
Hi! I have read in some books that the phase noise causing jitter is modeled as 1/f (pink) Gaussian noise. Suppose ϕ is a Gaussian random process with a 1/f (pink) spectrum. Then the sine wave corrupted by phase noise is y(t) = A sin(ωt + ϕ). Implementing this on a digital computer, one would typically write y_n = y(nT_s) = A sin(2πfn/f_s + ϕ), where n = 0, 1, 2, … and f_s = 1/T_s. The question is how often ϕ should be sampled. Should ϕ become ϕ_n, i.e. sampled at every step n, or should it be sampled once per cycle? The problem with sampling once per cycle is that the waveform becomes discontinuous at the cycle edges.
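For what it's worth, one common way to get a Gaussian phase sequence with an approximately 1/f spectrum is to shape white Gaussian noise in the frequency domain. The sketch below is a NumPy translation (not from the original post); the function name `pink_gaussian` and the 1/sqrt(f) magnitude shaping are my own illustrative choices.

```python
import numpy as np

def pink_gaussian(n, rng):
    # Shape white Gaussian noise with a 1/sqrt(f) magnitude filter
    # in the frequency domain, so the power spectrum falls as ~1/f.
    white = rng.standard_normal(n)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                 # avoid dividing by zero at DC
    spectrum *= 1.0 / np.sqrt(freqs)
    pink = np.fft.irfft(spectrum, n)
    return pink / pink.std()            # normalize to unit variance

rng = np.random.default_rng(1)
phi = pink_gaussian(1000, rng)          # one phase value per output sample
y = np.sin(2 * np.pi * np.arange(1000) / 100 + phi)
```

Because ϕ_n here evolves smoothly from sample to sample (low frequencies dominate a 1/f spectrum), the resulting waveform has no artificial jumps at cycle boundaries.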
Our question is more clearly explained in the attached document:
The code used to generate the above plot is given below (randn is a white Gaussian random variable, drawn once per cycle):

    fs = 100;                    % 100 samples per cycle
    y = zeros(10*fs + 1, 1);     % +1 so the final cycle's endpoint fits
    for n = 1:10                 % sampling the phase noise once every cycle
        y((1:fs+1) + (n-1)*fs) = sin(2*pi*(0:1/fs:1) + randn);
    end
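To make the two alternatives in the question concrete, here is a NumPy sketch (a translation of the MATLAB loop above, with my own variable names) that builds the waveform both ways: ϕ held constant over each cycle, and ϕ drawn fresh at every sample. The per-cycle version is the one that shows jumps at the cycle edges.

```python
import numpy as np

fs = 100                         # samples per cycle
n_cycles = 10
t = np.arange(n_cycles * fs) / fs
rng = np.random.default_rng(0)

# Phase sampled once per cycle: each draw is repeated for fs samples.
phi_cycle = np.repeat(rng.standard_normal(n_cycles), fs)
y_cycle = np.sin(2 * np.pi * t + phi_cycle)

# Phase sampled at every step n.
phi_sample = rng.standard_normal(n_cycles * fs)
y_sample = np.sin(2 * np.pi * t + phi_sample)

# Size of the jump at the first cycle boundary for the per-cycle version.
jump = abs(float(y_cycle[fs] - y_cycle[fs - 1]))
```

With independent draws per cycle, `jump` is generally far larger than one sample step of a clean sine wave, which is exactly the discontinuity the question describes.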