An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost

@article{Zhigljavsky2013AnAO,
  title={An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost},
  author={Anatoly A. Zhigljavsky and Luc Pronzato and Elena Bukina},
  journal={Optimization Letters},
  year={2013},
  volume={7},
  pages={1047-1059}
}
We consider gradient algorithms for minimizing a quadratic function in R^n with large n. We suggest a particular sequence of step-lengths and demonstrate that the resulting gradient algorithm has a convergence rate comparable with that of the Conjugate Gradient method and other methods based on the use of Krylov spaces. When the problem is large and sparse, the proposed algorithm can be more efficient than the Conjugate Gradient algorithm in terms of computational cost, as k iterations of the proposed…
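The paper's particular step-length sequence is not reproduced in this excerpt. As a minimal sketch of the general idea, the snippet below runs plain gradient iterations on a quadratic with step lengths taken as reciprocals of Chebyshev roots on an assumed spectral interval [m, M]; this classical Chebyshev/Richardson construction is in the same spirit (precomputed step lengths, CG-like worst-case rate) but is not the authors' method. The matrix size, spectral interval, and iteration count are illustrative assumptions.

```python
import numpy as np

def chebyshev_gradient(A, b, x0, m, M, K):
    # Gradient iterations x_{k+1} = x_k - gamma_k * (A x_k - b),
    # with gamma_k = 1/lambda_k and lambda_k the roots of the
    # degree-K Chebyshev polynomial mapped to the assumed spectral
    # interval [m, M].  This is NOT the paper's step-length
    # sequence, only a classical stand-in with a comparable K-step
    # worst-case rate; the natural root ordering used here can be
    # numerically unstable for large K.
    x = x0.copy()
    for k in range(1, K + 1):
        lam = 0.5 * (M + m) + 0.5 * (M - m) * np.cos((2 * k - 1) * np.pi / (2 * K))
        x = x - (A @ x - b) / lam
    return x

# Illustrative usage: a random SPD matrix with spectrum in [1, 100].
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
A = Q @ np.diag(np.linspace(1.0, 100.0, 50)) @ Q.T
b = rng.standard_normal(50)
x = chebyshev_gradient(A, b, np.zeros(50), m=1.0, M=100.0, K=30)
print(np.linalg.norm(A @ x - b))
```

Note the design point suggested by the abstract: since the step lengths are fixed in advance, each iteration needs only a matrix-vector product, whereas each Conjugate Gradient iteration also requires inner products; on large sparse problems this difference in per-iteration cost is presumably where the proposed method gains.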
