Performance of linear interpolation-based MIMO detection for MIMO-OFDM systems

Abstract

Detection of MIMO signals presents a formidable computational challenge due to the inherent complexity of matrix-valued wireless fading channels. When MIMO is combined with OFDM, detection over a frequency-selective fading channel decomposes into detection over a set of flat-fading subchannels. Because detection is typically performed on a per-subchannel basis, this implies a significant hardware cost. This work proposes using linear interpolation to exploit the correlation among adjacent subchannels: the weight matrices on non-pilot subchannels are linearly interpolated from those computed on pilot subchannels, greatly reducing the computational overhead. The approach is particularly effective when the subchannel bandwidth is much smaller than the coherence bandwidth of the channel. Simulations using MMSE MIMO detection show that, by keeping the product of the interpolation factor and the normalized RMS delay spread within a certain limit, linear interpolation incurs minimal performance degradation up to practical SNR levels of 20 dB to 25 dB.
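To make the interpolation concrete, the following minimal NumPy sketch (not from the paper; the function names, pilot layout, noise variance, and Rayleigh channel draws are illustrative assumptions) computes exact MMSE weight matrices W_k = (H_k^H H_k + sigma^2 I)^(-1) H_k^H on the pilot subchannels only, then linearly interpolates the weights on the subchannels in between, as the abstract describes.

    import numpy as np

    def mmse_weights(H, noise_var):
        # Exact MMSE weight matrix: W = (H^H H + sigma^2 I)^(-1) H^H
        nt = H.shape[1]
        return np.linalg.solve(H.conj().T @ H + noise_var * np.eye(nt), H.conj().T)

    def interpolated_weights(H_pilots, pilot_idx, n_sub, noise_var):
        # Compute exact weights on pilot subchannels, then linearly
        # interpolate between each pair of adjacent pilots.
        W_pilot = [mmse_weights(H, noise_var) for H in H_pilots]
        W = [None] * n_sub
        for i in range(len(pilot_idx) - 1):
            k0, k1 = pilot_idx[i], pilot_idx[i + 1]
            for k in range(k0, k1 + 1):
                a = (k - k0) / (k1 - k0)
                W[k] = (1 - a) * W_pilot[i] + a * W_pilot[i + 1]
        return W

    # Toy usage: 2x2 MIMO, 64 subcarriers, pilots every 8 subcarriers
    # (interpolation factor 8), plus a pilot on the last subcarrier so
    # the whole band is covered.
    rng = np.random.default_rng(0)
    n_sub, nr, nt = 64, 2, 2
    pilot_idx = list(range(0, n_sub, 8)) + [n_sub - 1]
    H_pilots = [(rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt)))
                / np.sqrt(2) for _ in pilot_idx]
    W = interpolated_weights(H_pilots, pilot_idx, n_sub, noise_var=0.01)
    y = rng.standard_normal(nr) + 1j * rng.standard_normal(nr)  # received vector
    x_hat = W[5] @ y  # detected transmit streams on subchannel 5

Only one matrix inversion per pilot subchannel is needed; every non-pilot subchannel costs just a weighted sum of two precomputed matrices, which is the source of the hardware savings the paper reports.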

DOI: 10.1109/WCNC.2004.1311320

Cite this paper

@inproceedings{Wang2004PerformanceOL,
  title     = {Performance of linear interpolation-based MIMO detection for MIMO-OFDM systems},
  author    = {Jingming Wang and Babak Daneshrad},
  booktitle = {2004 IEEE Wireless Communications and Networking Conference (IEEE Cat. No.04TH8733)},
  year      = {2004},
  volume    = {2},
  pages     = {981--986}
}