Towards a Discrete Newton Method with Memory for Large Scale Optimization

by Richard H. Byrd, Jorge Nocedal, and Ciyou Zhu
A new method for solving large nonlinear optimization problems is outlined. It attempts to combine the best properties of the discrete truncated Newton method and the limited memory BFGS method to produce an algorithm that is both economical and capable of handling ill-conditioned problems. The key idea is to use the curvature information generated during the computation of the discrete Newton step to improve the limited memory BFGS approximations. The numerical performance of the new method is…
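The core idea can be illustrated with a short sketch: a discrete (truncated) Newton method solves the Newton system approximately by conjugate gradients, forming Hessian-vector products by finite differences of gradients; each CG direction and its curvature product form a pair that a limited memory BFGS update could reuse. The function names and CG details below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def fd_hessvec(grad, x, v, h=1e-6):
    """Finite-difference Hessian-vector product: H v ~= (g(x + h v) - g(x)) / h."""
    return (grad(x + h * v) - grad(x)) / h

def discrete_newton_cg(grad, x, max_iter=50, tol=1e-8):
    """Approximately solve H p = -g(x) by conjugate gradients, using
    finite-difference Hessian-vector products only (no explicit Hessian).
    Also returns the (direction, curvature-product) pairs generated along
    the way -- the kind of curvature information the paper proposes to
    feed back into the limited memory BFGS approximation."""
    g = grad(x)
    p = np.zeros_like(x)
    r = -g.copy()          # residual of the Newton system
    d = r.copy()           # CG search direction
    pairs = []             # curvature pairs (s, y = H s)
    for _ in range(max_iter):
        Hd = fd_hessvec(grad, x, d)
        pairs.append((d.copy(), Hd))
        dHd = d @ Hd
        if dHd <= 0:       # nonpositive curvature: truncate the CG iteration
            break
        alpha = (r @ r) / dHd
        p += alpha * d
        r_new = r - alpha * Hd
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        d = r_new + beta * d
        r = r_new
    return p, pairs
```

On a quadratic `f(x) = 0.5 x^T A x - b^T x` the gradient is linear, so the finite-difference products are essentially exact and the routine recovers the Newton step `A^{-1} b`; the collected pairs satisfy the secant condition `y = H s` and are therefore valid input for BFGS-style updates.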


