Corpus ID: 9348723

QuickeNing: A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization

Authors: Hongzhou Lin, Julien Mairal, Zaid Harchaoui
We propose an approach to accelerate gradient-based optimization algorithms by giving them the ability to exploit curvature information using quasi-Newton update rules. The proposed scheme, called QuickeNing, is generic and can be applied to a large class of first-order methods such as incremental and block-coordinate algorithms; it is also compatible with composite objectives, meaning that it can provide exactly sparse solutions when the objective involves a sparsity-inducing…
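The "composite objective" setting the abstract refers to is minimizing f(x) + g(x), where f is smooth and g (e.g., an l1 penalty) is handled through its proximal operator; it is the proximal step that yields exactly sparse iterates. The sketch below is not QuickeNing itself, only a minimal proximal-gradient (ISTA-style) baseline of the kind QuickeNing is designed to accelerate; the lasso instance and all variable names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrinks each entry toward zero
    # and sets small entries exactly to zero (hence exact sparsity).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(grad_f, prox_g, x0, step, n_iters=100):
    # Generic proximal-gradient loop for min_x f(x) + g(x),
    # with f smooth and g admitting a cheap proximal operator.
    x = x0
    for _ in range(n_iters):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Tiny illustrative lasso instance: min_x 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.0])
b = A @ x_true + 0.01 * rng.standard_normal(20)
lam = 0.5
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda z, t: soft_threshold(z, t * lam)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L the spectral norm squared
x_hat = proximal_gradient(grad_f, prox_g, np.zeros(5), step, n_iters=500)
```

Because the l1 proximal step thresholds coefficients to exactly zero, x_hat contains true zeros rather than small residual values, which is the property the abstract says QuickeNing preserves while adding quasi-Newton (L-BFGS-type) curvature corrections on top.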
Citations:
Stochastic proximal quasi-Newton methods for non-convex composite optimization
Inexact proximal stochastic second-order methods for nonconvex composite optimization
Randomized Smoothing SVRG for Large-scale Nonsmooth Convex Optimization (W. Huang, ArXiv, 2018)
SDCA-Powered Inexact Dual Augmented Lagrangian Method for Fast CRF Learning
Efficiency of minimizing compositions of convex functions and smooth maps
Structure and complexity in non-convex and non-smooth optimization


References:
A Stochastic Quasi-Newton Method for Large-Scale Optimization
A Universal Catalyst for First-Order Optimization
Practical inexact proximal quasi-Newton method with global complexity analysis
Descentwise inexact proximal algorithms for smooth optimization
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning (J. Mairal, SIAM J. Optim., 2015)
A quasi-Newton approach to non-smooth convex optimization
Adaptive Subgradient Methods for Online Learning and Stochastic Optimization