aaron_do
Hi all,
I recently designed an RF front end, and for the first time the chip was packaged and soldered onto a PCB for testing. The chip is functioning, but the sensitivity is significantly poorer than expected. I believe the input matching is to blame.
The reflection coefficient is vastly (very, very, very) different from what I expect. Furthermore, the result changes drastically depending on whether I calibrate my network analyzer up to the PCB SMA connector or only up to the end of the SMA coaxial cable.
I tried implementing a rudimentary LC matching network on the board, but for various reasons matching the load has been extremely troublesome. Furthermore, I'm not sure whether a good match is even the right goal. Here's why:
1) The network between the device and the sig gen contributes several dB of loss (maybe 5 dB including a balun, the coax cables, and other board components). This lossy network presents a different load to the source than the desired load (the chip).
2) Given the above, if I do impedance matching at the connector, how do I know that maximum power delivered by the sig gen results in maximum power delivered to the chip?
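For what it's worth, here is the kind of back-of-the-envelope calculation I've been using to separate mismatch loss from the fixed network loss, plus textbook L-match element values. The load impedance (20 - j30 ohms) and 2.4 GHz frequency are placeholders, not my actual measurements:

```python
import math

Z0 = 50.0   # system impedance in ohms (assumed)
F = 2.4e9   # example frequency in Hz (placeholder, not my actual band)

def gamma_mag(z_load, z0=Z0):
    """Magnitude of the reflection coefficient for a complex load."""
    return abs((z_load - z0) / (z_load + z0))

def mismatch_loss_db(z_load, z0=Z0):
    """Power lost to reflection, in dB (0 dB = perfect match)."""
    g = gamma_mag(z_load, z0)
    return -10.0 * math.log10(1.0 - g ** 2)

def l_match(r_load, r_source=Z0, f=F):
    """Low-pass L-match (series L at the load, shunt C at the source)
    for a purely resistive load with r_load < r_source.
    Returns (L in henries, C in farads)."""
    assert r_load < r_source, "this topology needs r_load < r_source"
    q = math.sqrt(r_source / r_load - 1.0)
    x_series = q * r_load        # required series reactance
    x_shunt = r_source / q       # required shunt reactance
    L = x_series / (2 * math.pi * f)
    C = 1.0 / (2 * math.pi * f * x_shunt)
    return L, C

if __name__ == "__main__":
    zl = complex(20, -30)  # hypothetical chip input impedance
    print(f"|Gamma|       = {gamma_mag(zl):.3f}")
    print(f"mismatch loss = {mismatch_loss_db(zl):.2f} dB")
    L, C = l_match(20.0)
    print(f"L-match: L = {L * 1e9:.2f} nH, C = {C * 1e12:.2f} pF")
```

Even with a fairly bad match like the placeholder above, the mismatch loss comes out under 2 dB, i.e. smaller than the ~5 dB of fixed network loss, which is exactly why I'm unsure how much a better match would buy me.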
So here is my question: can anybody give me some practical advice on what I can do at this point? The only thing I can think of is trial and error until I get the best possible network.
Also, for the next time I do this, does anybody have any recommendations? I suppose I need to model the PCB traces, bond wires, and transmission lines properly and take them into account in the design.
Thanks,
Aaron