filipe
Junior Member
Offline
Posts: 22
Brasil
Hi, I'm designing a 2nd-order fully differential sigma-delta modulator using switched-capacitor (SC) techniques; it will be used in a 16-bit ADC. I'm having a serious problem with harmonic distortion, which is around -80 dB (the 2nd harmonic is at -80 dB and the 3rd harmonic at -95 dB). Simulation shows that the main distortion arises during the first sampling phase (when sw1a, sw1b, sw2a and sw2b are on, in the figure). I reached this conclusion by re-running the simulation with only those four switches replaced by real switches.

The switches are complementary MOS transistor pairs (PMOS = 6/0.6 and NMOS = 2/0.6, minimum length). When I increase the transistor widths (W), the 2nd harmonic does not decrease (I expected that lowering Ron would lower the 2nd harmonic, since the switch nonlinearity is the cause). I also noticed that the size of the buffer (an inverter gate) that drives the switches has a big influence on the 2nd harmonic. I tried adding dummy transistors next to the switches, but again the 2nd harmonic did not decrease.

So:
- What can I do to reduce the harmonic distortion?
- What is the influence of the logic buffer that drives the switches?
- How should I choose the transistor sizes of the logic buffer (inverter gate)?
- Are dummy transistors a solution?

Please enlighten me. If anyone has a suggestion or a good reference to read, please share it.

Best regards,
Filipe
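A quick sanity check on the "lower Ron should lower HD2" intuition: with a simple triode-region model, the on-conductance of the complementary pair is signal-dependent, and scaling both widths up scales Ron down uniformly without changing its *relative* variation across the input range, which is what actually sets the distortion. The sketch below illustrates this; all device parameters (VDD, thresholds, k-factors) are assumed illustrative values, not figures from my design:

```python
# Illustrative sketch of signal-dependent Ron for a CMOS switch.
# Simple triode model: g = k * (W/L) * (Vgs - Vt); assumed parameters.
VDD, Vtn, Vtp = 3.3, 0.7, -0.8   # supply and thresholds (assumed)
kn, kp = 100e-6, 35e-6           # NMOS/PMOS k-factors in A/V^2 (assumed)
WLn, WLp = 2 / 0.6, 6 / 0.6      # W/L sizes from the post

def ron_cmos(vin, scale=1.0):
    """On-resistance of the complementary switch at input level vin.

    scale multiplies both widths, modeling the 'increase W' experiment.
    """
    gn = kn * WLn * scale * max(VDD - vin - Vtn, 0.0)  # NMOS gate at VDD
    gp = kp * WLp * scale * max(vin - abs(Vtp), 0.0)   # PMOS gate at 0
    return 1.0 / (gn + gp)

# Ron varies with the input signal -> nonlinear sampling network.
for v in (0.5, 1.0, 1.65, 2.3, 2.8):
    print(f"Vin = {v:.2f} V -> Ron ~ {ron_cmos(v):7.1f} ohm")

# Doubling both widths halves Ron at every input level, but the
# *ratio* of Ron across the signal range (the nonlinearity shape)
# is unchanged -- consistent with HD2 not improving with larger W.
r_lo, r_mid = ron_cmos(0.8), ron_cmos(1.65)
r_lo2, r_mid2 = ron_cmos(0.8, scale=2.0), ron_cmos(1.65, scale=2.0)
print("relative variation, W x1:", r_lo / r_mid)
print("relative variation, W x2:", r_lo2 / r_mid2)
```

Under this toy model the distortion-causing shape of Ron(Vin) is set by the NMOS/PMOS balance and the gate drive, not by the absolute Ron, which may be why widening the switches alone does not move the 2nd harmonic.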