H∞ optimality of the LMS algorithm

Abstract

We show that the celebrated least-mean squares (LMS) adaptive algorithm is H∞ optimal. The LMS algorithm has long been regarded as an approximate solution to either a stochastic or a deterministic least-squares problem, and it essentially amounts to updating the weight vector estimates along the direction of the instantaneous gradient of a quadratic cost function. In this paper, we show that LMS can be regarded as the exact solution to a minimization problem in its own right. Namely, we establish that it is a minimax filter: It minimizes the maximum energy gain from the disturbances to the predicted errors, whereas the closely related so-called normalized LMS algorithm minimizes the maximum energy gain from the disturbances to the filtered errors. Moreover, since these algorithms are central H∞ filters, they minimize a certain exponential cost function and are thus also risk-sensitive optimal. We discuss the various implications of these results and show how they provide theoretical justification for the widely observed excellent robustness properties of the LMS filter.
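The abstract refers to the standard LMS and normalized LMS (NLMS) weight updates. As a minimal illustrative sketch (not from the paper itself; variable names and the toy identification setup are our own), the two updates can be written as follows, where `w` is the weight estimate, `x` the regressor, `d` the desired response, and `mu` the step size:

```python
import numpy as np

def lms_step(w, x, d, mu):
    """One LMS update: step along the instantaneous gradient of |d - w'x|^2."""
    e = d - w @ x                      # a priori (predicted) error
    return w + mu * e * x, e

def nlms_step(w, x, d, mu, eps=1e-8):
    """One normalized-LMS update: step size scaled by the regressor energy."""
    e = d - w @ x
    return w + (mu / (eps + x @ x)) * e * x, e

# Toy example (illustrative only): identify a fixed weight vector
# from noisy linear measurements d = w_true' x + noise.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.8])
w = np.zeros(3)
for _ in range(2000):
    x = rng.standard_normal(3)
    d = w_true @ x + 1e-3 * rng.standard_normal()
    w, _ = lms_step(w, x, d, mu=0.05)
```

The H∞ result of the paper concerns exactly these recursions: for LMS the maximum energy gain from the disturbances to the predicted errors `e` is minimized, while NLMS plays the same role for the filtered (a posteriori) errors.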

DOI: 10.1109/78.485923


Cite this paper

@article{Hassibi1996HOO,
  title={H∞ optimality of the LMS algorithm},
  author={Babak Hassibi and Ali H. Sayed and Thomas Kailath},
  journal={IEEE Trans. Signal Processing},
  year={1996},
  volume={44},
  pages={267-280}
}