Numerical Optimization
Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization. It responds to the growing interest in optimization in engineering, science, and business by focusing on the methods best suited to practical problems.
On the limited memory BFGS method for large scale optimization
The numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir and makes better use of additional storage to accelerate convergence; the convergence properties are also studied, establishing global convergence on uniformly convex problems.
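At the heart of L-BFGS is the two-loop recursion, which applies an implicit inverse-Hessian approximation built from the m most recent curvature pairs without ever forming a matrix. A minimal NumPy sketch under standard textbook conventions (function and variable names here are illustrative, not the paper's):

import numpy as np

def lbfgs_two_loop(grad, s_list, y_list):
    # Approximate H_k @ grad from the stored pairs
    # s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (oldest first in the lists).
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * s.dot(q)
        q -= alpha * y
        alphas.append(alpha)
    # Initial Hessian approximation H_0 = gamma * I with the usual scaling.
    s, y = s_list[-1], y_list[-1]
    r = (s.dot(y) / y.dot(y)) * q
    # Second loop: oldest pair to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * y.dot(r)
        r += (alpha - beta) * s
    return r  # -r is then used as the search direction in a line search

The cost is O(mn) per iteration, which is what lets the method trade a small amount of extra storage for faster convergence on large problems.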
Optimization Methods for Large-Scale Machine Learning
A major theme of this study is that large-scale machine learning represents a distinctive setting in which the stochastic gradient method has traditionally played a central role while conventional gradient-based nonlinear optimization techniques typically falter, leading to a discussion of the next generation of optimization methods for large-scale machine learning.
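As a point of reference for that theme, the basic stochastic gradient loop is only a few lines; the least-squares objective, batch size, and constant step size below are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data for the toy objective f(w) = mean((X w - y)^2) / 2.
X = rng.normal(size=(1000, 20))
w_true = rng.normal(size=20)
y = X @ w_true + 0.01 * rng.normal(size=1000)

w = np.zeros(20)
step, batch = 0.01, 32    # constant step size; decaying schedules are also common

for _ in range(2000):
    idx = rng.integers(0, len(y), size=batch)         # sample a mini-batch
    g = X[idx].T @ (X[idx] @ w - y[idx]) / batch      # unbiased gradient estimate
    w -= step * g

print(np.linalg.norm(w - w_true))   # small residual after training

Each iteration touches only a mini-batch, which is what makes the method scale where full-gradient techniques falter.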
A Limited Memory Algorithm for Bound Constrained Optimization
An algorithm for solving large nonlinear optimization problems with simple bounds is described. It is based on the gradient projection method and uses a limited memory BFGS matrix to approximate the Hessian of the objective function.
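For the simple-bounds case, the gradient projection ingredient amounts to clipping a trial step back into the feasible box; a toy sketch (the fixed step size is an assumption, and the full algorithm combines this with the limited-memory quasi-Newton model):

import numpy as np

def project_box(x, lower, upper):
    # Componentwise projection onto {v : lower <= v <= upper}.
    return np.clip(x, lower, upper)

def projected_gradient_step(x, grad, step, lower, upper):
    # One gradient projection step for: min f(x) subject to lower <= x <= upper.
    return project_box(x - step * grad, lower, upper)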
Numerical Optimization (Springer Series in Operations Research and Financial Engineering)
A graduate text on continuous optimization that treats the most effective algorithms in depth, discussing both their practical performance and the mathematical theory behind them.
Updating Quasi-Newton Matrices With Limited Storage
We study how to use the BFGS quasi-Newton matrices to precondition minimization methods for problems where storage is critical. We give an update formula which generates matrices using information from the last m iterations, where m is chosen by the user.
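Concretely, the formula in question is the standard BFGS inverse update applied recursively to the m most recent pairs; in the usual notation (a textbook sketch, not a quotation from the paper):

H_{k+1} = V_k^\top H_k V_k + \rho_k s_k s_k^\top,
\qquad V_k = I - \rho_k y_k s_k^\top,
\qquad \rho_k = \frac{1}{y_k^\top s_k},

with s_k = x_{k+1} - x_k and y_k = \nabla f_{k+1} - \nabla f_k. Unrolling this recursion m steps from a diagonal H_0 is exactly what the two-loop recursion sketched above evaluates.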
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
This work investigates the cause of the generalization drop observed in the large-batch regime and presents numerical evidence supporting the view that large-batch methods tend to converge to sharp minimizers of the training and testing functions; as is well known, sharp minima lead to poorer generalization.
Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound-constrained optimization
L-BFGS-B is a limited-memory algorithm for solving large nonlinear optimization problems subject to simple bounds on the variables. It is intended for problems in which information on the Hessian matrix is difficult to obtain, or for large dense problems.
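These routines remain the workhorse behind many wrappers; SciPy, for example, exposes them via scipy.optimize.minimize with method='L-BFGS-B'. A small usage sketch on an assumed toy objective:

import numpy as np
from scipy.optimize import minimize

def f(x):
    # Toy objective (an assumption); its unconstrained minimizer is (2, -1).
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

res = minimize(f, x0=np.zeros(2), method="L-BFGS-B",
               bounds=[(0.0, 1.0), (0.0, 1.0)])   # simple bounds on each variable
print(res.x)   # approximately [1.0, 0.0]: the minimizer clipped to the box

Gradients can be supplied via the jac argument when available; otherwise the wrapper falls back to finite differences.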
Global Convergence Properties of Conjugate Gradient Methods for Optimization
This paper explores the convergence of nonlinear conjugate gradient methods without restarts and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex problems.
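One safeguarded scheme of this kind is the nonnegative Polak-Ribiere choice ("PR+"); a compact sketch, with a backtracking Armijo search standing in for the practical Wolfe line search assumed in the analysis:

import numpy as np

def nonlinear_cg(f, grad, x0, iters=200, tol=1e-8):
    # Nonlinear conjugate gradient with beta = max(beta_PR, 0) and restarts.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:          # safeguard: restart along steepest descent
            d = -g
        t = 1.0                    # Armijo backtracking (illustrative choice)
        while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)   # PR+ coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# e.g. nonlinear_cg(lambda x: x @ x, lambda x: 2 * x, np.ones(5)) converges to 0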