A Subsampling Line-Search Method with Second-Order Results.

@article{Bergou2018ASL,
  title={A Subsampling Line-Search Method with Second-Order Results.},
  author={E. Bergou and Y. Diouane and V. Kunc and V. Kungurtsev and C. Royer},
  journal={arXiv: Optimization and Control},
  year={2018}
}
In many contemporary optimization problems, such as those arising in machine learning, it can be computationally challenging or even infeasible to evaluate an entire function or its derivatives. This motivates the use of stochastic algorithms that sample problem data, which can jeopardize the guarantees obtained through classical globalization techniques in optimization such as a trust region or a line search. Using subsampled function values is particularly challenging for the latter strategy.
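As a rough illustration of this setting (and not of the algorithm proposed in the paper), the following minimal Python sketch applies an Armijo backtracking line search in which the objective, the gradient, and the sufficient-decrease test are all estimated on a random subsample of a finite-sum problem. Every name and constant here (subsampled_backtracking_step, batch_size, c1, tau, the toy least-squares data) is an assumption made for this example only.

# Illustrative sketch, not the authors' method: backtracking line search on
# subsampled function and gradient estimates for f(x) = (1/n) * sum_i f_i(x).
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum problem: least squares over n data points.
n, d = 1000, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def f_sub(x, idx):
    """Average loss over the subsample idx."""
    r = A[idx] @ x - b[idx]
    return 0.5 * np.mean(r ** 2)

def grad_sub(x, idx):
    """Average gradient over the subsample idx."""
    r = A[idx] @ x - b[idx]
    return A[idx].T @ r / len(idx)

def subsampled_backtracking_step(x, batch_size=64, alpha0=1.0, tau=0.5,
                                 c1=1e-4, max_backtracks=20):
    """One steepest-descent step with an Armijo line search, where both the
    model and the decrease test use a freshly drawn subsample."""
    idx = rng.choice(n, size=batch_size, replace=False)
    g = grad_sub(x, idx)
    direction = -g                     # descent direction for the sampled model
    f0 = f_sub(x, idx)
    alpha = alpha0
    for _ in range(max_backtracks):
        # Armijo sufficient-decrease condition, evaluated on the same subsample.
        if f_sub(x + alpha * direction, idx) <= f0 + c1 * alpha * g @ direction:
            break
        alpha *= tau                   # shrink the step and try again
    return x + alpha * direction

x = np.zeros(d)
for _ in range(200):
    x = subsampled_backtracking_step(x)
print("final full-sample objective:", f_sub(x, np.arange(n)))

Because the decrease condition is checked only on a subsample, the guarantees of a deterministic line search no longer transfer directly; this is exactly the difficulty the paper addresses.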
Citations

Gradient-only line searches: An Alternative to Probabilistic Line Searches
Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
Discriminative Bayesian Filtering Lends Momentum to the Stochastic Newton Method for Minimizing Log-Convex Functions
Gradient-only line searches to automatically determine learning rates for a variety of stochastic training algorithms
