Adaptive Two-Point Stepsize Gradient Algorithm

@article{Dai2001AdaptiveTS,
  title={Adaptive Two-Point Stepsize Gradient Algorithm},
  author={Yu-Hong Dai and Hongchao Zhang},
  journal={Numerical Algorithms},
  year={2001},
  volume={27},
  pages={377-385}
}
Combined with the nonmonotone line search, the two-point stepsize gradient method has been applied successfully to large-scale unconstrained optimization. However, the numerical performance of the algorithm depends heavily on M, one of the parameters in the nonmonotone line search, even for ill-conditioned problems. This paper proposes an adaptive nonmonotone line search. The two-point stepsize gradient method is shown to be globally convergent with this adaptive nonmonotone line search…
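
To make the setting concrete, the sketch below (not taken from the paper) illustrates the two ingredients the abstract refers to: the two-point (Barzilai–Borwein) stepsize and a nonmonotone Armijo line search whose memory parameter M controls how many recent function values the acceptance test compares against. The paper's adaptive rule for choosing M is not reproduced here; the function name bb_nonmonotone and all parameter defaults are illustrative assumptions.

    import numpy as np

    def bb_nonmonotone(f, grad, x0, M=10, gamma=1e-4, sigma=0.5,
                       tol=1e-6, max_iter=1000):
        # Two-point (Barzilai-Borwein) stepsize gradient method combined with a
        # classical nonmonotone Armijo line search; M is the memory parameter
        # the abstract refers to (fixed here, chosen adaptively in the paper).
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        alpha = 1.0                   # initial trial stepsize
        f_hist = [f(x)]               # recent function values for the nonmonotone test
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            d = -g                    # steepest-descent direction
            f_ref = max(f_hist[-M:])  # nonmonotone reference value
            lam = alpha
            # backtrack until f(x + lam*d) <= f_ref + gamma * lam * g'd
            for _ in range(50):
                if f(x + lam * d) <= f_ref + gamma * lam * g.dot(d):
                    break
                lam *= sigma
            s = lam * d
            x_new = x + s
            g_new = grad(x_new)
            y = g_new - g
            # two-point stepsize alpha = s's / s'y, safeguarded against s'y <= 0
            sty = s.dot(y)
            alpha = s.dot(s) / sty if sty > 1e-12 else 1.0
            x, g = x_new, g_new
            f_hist.append(f(x))
        return x

    # Example: an ill-conditioned quadratic, the kind of problem the abstract mentions
    A = np.diag([1.0, 1e2, 1e4])
    x_min = bb_nonmonotone(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, np.ones(3))

With M = 1 the test reduces to the ordinary monotone Armijo backtracking, while a larger M lets the two-point stepsize be accepted unmodified more often; the contribution of the paper is to choose this memory adaptively rather than fixing it in advance.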
