Numerous variable step-size normalized least mean-square (VSS-NLMS) algorithms have been derived over the past two decades to resolve the trade-off between fast convergence rate and low excess mean-square error. This paper proposes a new, easy-to-implement, nonparametric VSS-NLMS algorithm that employs the mean-square error and the estimated system noise power to …
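As a point of reference for the recursion this abstract builds on, the sketch below shows a plain NLMS update in Python. The function name nlms, the regularization constant delta, and the fixed step size mu are illustrative assumptions; the paper's specific variable step-size rule, driven by the mean-square error and the estimated noise power, is not reproduced here because the abstract is truncated.

```python
import numpy as np

def nlms(x, d, num_taps, mu=1.0, delta=1e-6):
    """Baseline NLMS recursion (the paper's variable step-size rule is
    not reproduced; mu is kept fixed in this sketch)."""
    x = np.asarray(x, dtype=float)
    d = np.asarray(d, dtype=float)
    w = np.zeros(num_taps)           # filter weights
    e = np.zeros(len(x))             # a-priori error signal
    for n in range(num_taps, len(x)):
        u = x[n - num_taps:n][::-1]  # most recent input vector
        e[n] = d[n] - w @ u          # estimation error
        w += (mu / (delta + u @ u)) * e[n] * u  # normalized weight update
    return w, e
```

A VSS variant would replace the fixed mu with a time-varying step size computed from error and noise statistics at each iteration, which is the quantity the proposed algorithm estimates.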
Least-mean-square (LMS) and block LMS (BLMS) adaptive filters are generally believed to have similar step-size bounds for convergence. Similarly, convergence analyses of frequency-domain block LMS (FBLMS) adaptive filters have suggested that they have very restrictive convergence bounds. In this letter, we revisit Feuer's work and reveal a much larger …
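For context on the block update whose step-size bound is at issue, a minimal time-domain block LMS sketch follows. The function name block_lms and the fixed step size mu are assumptions; the frequency-domain (FBLMS) implementation and the convergence-bound analysis from the letter are not reproduced.

```python
import numpy as np

def block_lms(x, d, num_taps, block_len, mu):
    """Time-domain block LMS: weights are updated once per block using
    the gradient accumulated over that block (a generic sketch)."""
    x = np.asarray(x, dtype=float)
    d = np.asarray(d, dtype=float)
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for start in range(num_taps, len(x) - block_len + 1, block_len):
        grad = np.zeros(num_taps)
        for n in range(start, start + block_len):
            u = x[n - num_taps:n][::-1]  # most recent input vector
            e[n] = d[n] - w @ u          # per-sample error within the block
            grad += e[n] * u             # accumulate gradient estimate
        w += mu * grad                   # one weight update per block
    return w, e
```

Because the update aggregates block_len gradient terms, the admissible step size depends on the block length, which is the kind of bound the letter reexamines.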
ABSTRACT This paper considers an extended recursive least squares (RLS) adaptive bilinear predictor. It is shown that the extended RLS adaptive bilinear predictor is guaranteed to be stable in the sense that the time average of the squared a-posteriori prediction error signal is bounded whenever the input signal is bounded in the same sense. It is also shown …
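To make the prediction setup concrete, the sketch below runs a standard RLS recursion over a simple bilinear-style regressor (past samples plus their pairwise products). The regressor construction, the function name rls_bilinear_predict, the forgetting factor lam, and the initialization delta are illustrative assumptions, not the paper's extended RLS formulation or its stability argument.

```python
import numpy as np

def rls_bilinear_predict(x, p=2, lam=0.99, delta=100.0):
    """One-step-ahead RLS prediction with a bilinear-style regressor:
    the p most recent samples plus their pairwise products (a generic
    sketch; the paper's extended RLS predictor is not reproduced)."""
    x = np.asarray(x, dtype=float)
    dim = p + p * (p + 1) // 2           # linear terms + cross products
    w = np.zeros(dim)
    P = delta * np.eye(dim)              # inverse-correlation estimate
    e = np.zeros(len(x))
    for n in range(p, len(x)):
        past = x[n - p:n][::-1]
        cross = np.array([past[i] * past[j]
                          for i in range(p) for j in range(i, p)])
        phi = np.concatenate([past, cross])      # bilinear regressor
        k = P @ phi / (lam + phi @ P @ phi)      # RLS gain vector
        e[n] = x[n] - w @ phi                    # a-priori prediction error
        w = w + k * e[n]                         # weight update
        P = (P - np.outer(k, phi @ P)) / lam     # inverse-correlation update
    return w, e
```

The stability result quoted in the abstract concerns the a-posteriori error of the extended RLS predictor; this sketch only illustrates the recursion structure on which such a predictor is built.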