- Published 1996 in IEEE Trans. Signal Processing

We show that the celebrated least-mean-squares (LMS) adaptive algorithm is H∞ optimal. The LMS algorithm has long been regarded as an approximate solution to either a stochastic or a deterministic least-squares problem, and it essentially amounts to updating the weight vector estimates along the direction of the instantaneous gradient of a quadratic cost function. In this paper, we show that LMS can be regarded as the exact solution to a minimization problem in its own right. Namely, we establish that it is a minimax filter: it minimizes the maximum energy gain from the disturbances to the predicted errors, whereas the closely related so-called normalized LMS algorithm minimizes the maximum energy gain from the disturbances to the filtered errors. Moreover, since these algorithms are central H∞ filters, they minimize a certain exponential cost function and are thus also risk-sensitive optimal. We discuss the various implications of these results and show how they provide theoretical justification for the widely observed excellent robustness properties of the LMS filter.
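The two update rules discussed in the abstract can be sketched in a few lines. Below is a minimal NumPy illustration of the instantaneous-gradient step of LMS and the energy-normalized step of NLMS; the function names, filter length, and step sizes are illustrative choices, not from the paper.

```python
import numpy as np

def lms(x, d, mu=0.05, M=4):
    """Plain LMS: w <- w + mu * e * u, a gradient step on the
    instantaneous squared error. Hypothetical helper for illustration."""
    w = np.zeros(M)
    errors = np.zeros(len(x))
    for n in range(M, len(x)):
        u = x[n - M:n][::-1]      # regressor, most recent sample first
        e = d[n] - w @ u          # a priori (predicted) error
        w = w + mu * e * u        # instantaneous-gradient update
        errors[n] = e
    return w, errors

def nlms(x, d, mu=0.5, M=4, eps=1e-8):
    """Normalized LMS: same update with the step size divided by the
    regressor energy, which decouples stability from input power."""
    w = np.zeros(M)
    errors = np.zeros(len(x))
    for n in range(M, len(x)):
        u = x[n - M:n][::-1]
        e = d[n] - w @ u
        w = w + (mu / (eps + u @ u)) * e * u
        errors[n] = e
    return w, errors
```

For example, feeding both filters data generated by a known 4-tap FIR system drives the weight vectors toward the true taps; the H∞ results in the paper bound the energy of the error sequences these loops produce relative to the disturbance energy.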

@article{Hassibi1996HOO,
title={H∞ optimality of the LMS algorithm},
author={Babak Hassibi and Ali H. Sayed and Thomas Kailath},
journal={IEEE Trans. Signal Processing},
year={1996},
volume={44},
pages={267--280}
}