Corpus ID: 221996054

Projection-Free Adaptive Gradients for Large-Scale Optimization

  • Cyrille W. Combettes, C. Spiegel, Sebastian Pokutta
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
The complexity in large-scale optimization can lie in both handling the objective function and handling the constraint set. In this respect, stochastic Frank-Wolfe algorithms occupy a unique position as they alleviate both computational burdens, by querying only approximate first-order information from the objective and by maintaining feasibility of the iterates without using projections. In this paper, we improve the quality of their first-order information by blending in adaptive gradients…
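To make the projection-free idea in the abstract concrete, below is a minimal sketch of a stochastic Frank-Wolfe step over the ℓ1 ball. The linear minimization oracle replaces projection, and a convex combination keeps every iterate feasible. The momentum-averaged gradient estimator here is an illustrative stand-in, not the paper's adaptive-gradient method; the least-squares objective, the `lmo_l1` helper, and all parameter choices are assumptions for the sketch.

```python
import numpy as np

def lmo_l1(grad, tau):
    """Linear minimization oracle for the l1 ball of radius tau:
    argmin_{||s||_1 <= tau} <grad, s> is attained at a signed vertex."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -tau * np.sign(grad[i])
    return s

def stochastic_frank_wolfe(A, b, tau=1.0, steps=200, batch=8, rho=0.9, seed=0):
    """Sketch of stochastic Frank-Wolfe for min ||Ax - b||^2 over the l1 ball.
    A momentum-averaged stochastic gradient (an assumption, standing in for
    the paper's adaptive gradients) feeds the oracle; iterates stay feasible
    without any projection step."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)                      # feasible start inside the l1 ball
    g = np.zeros(d)                      # running gradient estimate
    for t in range(1, steps + 1):
        idx = rng.choice(n, size=batch, replace=False)
        # stochastic gradient of the mini-batch least-squares loss
        grad = 2.0 * A[idx].T @ (A[idx] @ x - b[idx]) / batch
        g = rho * g + (1 - rho) * grad   # momentum averaging reduces variance
        s = lmo_l1(g, tau)               # one linear minimization, no projection
        gamma = 2.0 / (t + 2)            # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s  # convex combination keeps feasibility
    return x
```

Note that each iteration touches the constraint set only through `lmo_l1`, a single linear problem; for sets like the ℓ1 ball this is far cheaper than a Euclidean projection, which is the computational advantage the abstract refers to.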



References

  • Complexities in Projection-Free Stochastic Non-convex Minimization
  • Adaptive Subgradient Methods for Online Learning and Stochastic Optimization
  • Linear Convergence with Condition Number Independent Access of Full Gradients
  • Variance-Reduced and Projection-Free Stochastic Optimization
  • M. Jaggi. Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization. ICML, 2013.
  • Stochastic Frank-Wolfe for Constrained Finite-Sum Minimization
  • Boosting Frank-Wolfe by Chasing Gradients
  • Momentum-Based Variance Reduction in Non-Convex SGD
  • Efficient Projection-Free Online Methods with Stochastic Recursive Gradient
  • Adaptive Bound Optimization for Online Convex Optimization