lhlbluesky_lhl
In my case, I first select a center frequency f0 = 20 kHz. The noise at 50 kHz is a sine wave with 1 uA pk-pk amplitude, and the signal amplitude ranges from 1 nA to 100 uA. In simulation I find that, because 20 kHz and 50 kHz are close to each other, the combined waveform (signal plus 50 kHz noise) at the VGA input is still a sine wave with a slightly smaller pk-pk amplitude, and the VGA output is wrong.

In the normal case the VGA input is zero; when a signal is injected, the VGA input is half-negative and half-positive during one period, so the VGA produces an effective output pulse in response to the signal. However, when the 1 uA 50 kHz noise is present, the VGA input is always half-negative and half-positive during one period, which causes wrong results.

To reduce the influence of the 50 kHz noise, I added a low-pass filter (LPF) of some order before the VGA input. But because 20 kHz and 50 kHz are so close together, the attenuation at 50 kHz is small relative to the signal at 20 kHz (in the AC response), and as the LPF order increases, the signal amplitude at 20 kHz also decreases.

So, how can I suppress the 50 kHz noise to get a correct signal at the VGA input (normally zero; when a signal is injected, the VGA input is half-negative and half-positive during one period)? The threshold of the VGA is zero in my design. Thanks all.
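To put numbers on the LPF trade-off described above, here is a minimal sketch of the ideal Butterworth magnitude response, assuming a hypothetical cutoff of 25 kHz placed between the two tones (the actual cutoff and filter type in the design may differ, and a real analog implementation will have extra passband loss). Since 50 kHz is only about 1.3 octaves above 20 kHz, each filter order buys only a limited amount of separation:

```python
import math

def butterworth_mag(f, fc, order):
    """Magnitude of an ideal n-th order Butterworth low-pass at frequency f,
    cutoff fc: |H| = 1 / sqrt(1 + (f/fc)^(2n))."""
    return 1.0 / math.sqrt(1.0 + (f / fc) ** (2 * order))

f_sig, f_noise = 20e3, 50e3   # signal tone and 50 kHz interferer
fc = 25e3                      # hypothetical cutoff between the two tones

for n in (1, 2, 4, 8):
    g_sig = 20 * math.log10(butterworth_mag(f_sig, fc, n))
    g_noise = 20 * math.log10(butterworth_mag(f_noise, fc, n))
    # Print passband loss at 20 kHz, stopband loss at 50 kHz, and their gap
    print(f"order {n}: signal {g_sig:6.2f} dB, noise {g_noise:7.2f} dB, "
          f"separation {g_sig - g_noise:6.2f} dB")
```

This idealized calculation shows why a plain LPF struggles here: a first-order filter separates the two tones by only about 5 dB, and the question of whether to trade passband loss at 20 kHz for more rejection at 50 kHz is exactly the dilemma described in the post.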