In this brief, we develop a detailed Simulink model that emulates the radio-frequency front end and digital baseband of a six-port receiver. In contrast to other results published in the literature, we not only provide a controlled continuous bias to the diodes, but also propose a novel algorithm that reduces the error vector magnitude (EVM) by adaptively controlling the diode bias point. We also show that when the algorithm is enabled in the model, the EVM becomes less sensitive to variations in the diode bias conditions. Another key benefit of optimum diode bias control is that the local oscillator (LO) power requirement decreases and the EVM becomes insensitive to LO power variation: the results presented here show no EVM variation for an LO power variation of more than 10 dB.
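The blind bias-control algorithm itself is not detailed in this abstract. As a rough illustration of the idea only, the sketch below implements a generic perturb-and-observe loop that nudges a diode bias voltage and keeps a move only when a (toy) EVM estimate improves; the EVM model, the assumed 0.35 V optimum, and all names are illustrative assumptions, not the authors' algorithm or Simulink model.

```python
# Hypothetical perturb-and-observe sketch of adaptive diode bias control.
# The EVM function is a toy stand-in for the receiver/demodulator model.
import random


def measured_evm(bias_v: float) -> float:
    """Toy EVM estimate (%): a convex bowl around an assumed 0.35 V optimum,
    with a little measurement noise. Placeholder for the real receiver model."""
    optimum = 0.35
    return 2.0 + 40.0 * (bias_v - optimum) ** 2 + random.gauss(0.0, 0.02)


def blind_bias_control(bias_v: float = 0.20, step_v: float = 0.01,
                       iterations: int = 200) -> float:
    """Nudge the diode bias; keep the step if the EVM estimate improves,
    otherwise reverse direction and shrink the step size."""
    evm = measured_evm(bias_v)
    direction = 1.0
    for _ in range(iterations):
        candidate = bias_v + direction * step_v
        candidate_evm = measured_evm(candidate)
        if candidate_evm < evm:      # improvement: accept the new bias point
            bias_v, evm = candidate, candidate_evm
        else:                        # degradation: back off and refine the step
            direction = -direction
            step_v *= 0.7
    return bias_v


if __name__ == "__main__":
    final_bias = blind_bias_control()
    print(f"Converged bias ~ {final_bias:.3f} V, EVM ~ {measured_evm(final_bias):.2f} %")
```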

Additional Metadata
Keywords Bit Error Rate (BER), Error Vector Magnitude (EVM), Intermodulation Distortion, Local Oscillator (LO), Noise Figure, Software Defined Radio, Zero-IF receivers
Persistent URL dx.doi.org/10.1109/TCSII.2014.2362792
Journal IEEE Transactions on Circuits and Systems II: Express Briefs
Citation
Lima, J.A. (Jose Augusto), Rogers, J., & Amaya, R. (2015). Optimization of EVM through diode bias control using a blind algorithm applied to multiport receivers. IEEE Transactions on Circuits and Systems II: Express Briefs, 62(3), 286–290. doi:10.1109/TCSII.2014.2362792