Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions
@inproceedings{Carderera2021SimpleSA,
  title     = {Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions},
  author    = {Alejandro Carderera and Mathieu Besançon and Sebastian Pokutta},
  booktitle = {NeurIPS},
  year      = {2021}
}
Generalized self-concordance is a key property present in the objective function of many important learning problems. We establish the convergence rate of a simple Frank-Wolfe variant that uses the open-loop step-size strategy γ_t = 2/(t + 2), obtaining an O(1/t) convergence rate for this class of functions in terms of both the primal gap and the Frank-Wolfe gap, where t is the iteration count. This avoids the use of second-order information or the need to estimate local smoothness parameters, as in previous work…
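To make the method concrete, here is a minimal sketch of vanilla Frank-Wolfe with the open-loop step size γ_t = 2/(t + 2); the probability-simplex LMO and the quadratic objective are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, T=500):
    """Vanilla Frank-Wolfe with the open-loop step size gamma_t = 2/(t+2).

    No line search, no smoothness estimates: only gradient calls and an LMO.
    """
    x = x0.copy()
    for t in range(T):
        g = grad(x)
        v = np.zeros_like(x)
        v[np.argmin(g)] = 1.0       # LMO over the simplex: best vertex e_i
        gamma = 2.0 / (t + 2.0)     # open-loop step size
        x = x + gamma * (v - x)     # iterate stays in the simplex (convex combination)
    return x

# Illustrative use: minimize f(x) = 0.5 * ||x - b||^2 over the probability simplex.
b = np.array([0.2, 0.5, 0.3])
x_star = frank_wolfe_simplex(lambda x: x - b, np.array([1.0, 0.0, 0.0]))
```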
3 Citations
Generalized self-concordant analysis of Frank–Wolfe algorithms
- Computer Science, Mathematics · Mathematical Programming
- 2022
This paper closes the apparent gap in the literature by developing provably convergent Frank–Wolfe algorithms with standard O(1/k) convergence-rate guarantees, and shows how these sublinearly convergent methods can be accelerated to yield linearly convergent projection-free methods.
Frank-Wolfe-based Algorithms for Approximating Tyler's M-estimator
- Computer Science · arXiv
- 2022
This work proposes, to the best of the authors' knowledge, the first Frank-Wolfe-based algorithms for computing Tyler's M-estimator; all three proposed variants are parameter-free and use adaptive step sizes.
Interpretable Neural Networks with Frank-Wolfe: Sparse Relevance Maps and Relevance Orderings
- Computer Science · ICML
- 2022
Novel multi-rate and relevance-ordering variants of RDE are proposed, both of which empirically outperform standard RDE and other baseline methods in a well-established comparison test.
References
Showing 1–10 of 47 references
Generalized self-concordant functions: a recipe for Newton-type methods
- Mathematics, Computer Science · Mathematical Programming
- 2019
The proposed theory provides a mathematical tool to analyze both local and global convergence of Newton-type methods without imposing unverifiable assumptions, as long as the underlying functionals fall into the class of generalized self-concordant functions.
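For orientation, the defining inequality in Sun and Tran-Dinh's (M_f, ν)-parameterization reads roughly as follows (reconstructed from memory for ν ∈ [2, 3]; the paper's exact normalization may differ):

```latex
\bigl|\langle f'''(x)[v]\,u,\; u\rangle\bigr|
  \;\le\; M_f\,\|u\|_x^{2}\,\|v\|_x^{\nu-2}\,\|v\|_2^{3-\nu},
\qquad \|u\|_x := \langle f''(x)\,u,\,u\rangle^{1/2},
\quad \nu \in [2,3].
```

Taking ν = 3 and u = v recovers the classical self-concordance bound |φ'''(t)| ≤ M_f φ''(t)^{3/2} along φ(t) = f(x + tu).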
Self-concordant analysis for logistic regression
- Computer Science · arXiv
- 2009
This paper uses and extends tools from the convex optimization literature, namely self-concordant functions, to provide simple extensions of theoretical results for the square loss to the logistic loss, showing that new results for binary classification through logistic regression can be easily derived from corresponding results for least-squares regression.
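The logistic loss is the canonical example here; a hedged sketch of the kind of bound involved (following Bach's argument as commonly stated, with constants possibly normalized differently in the paper):

```latex
f(w) \;=\; \frac{1}{n}\sum_{i=1}^{n}\log\bigl(1 + e^{-y_i\, a_i^\top w}\bigr),
\qquad
|\varphi'''(t)| \;\le\; \Bigl(\max_i \|a_i\|_2\Bigr)\,\|u\|_2\,\varphi''(t)
\quad\text{for } \varphi(t) = f(w + t u),
```

i.e. generalized self-concordance with ν = 2 and M_f = max_i ‖a_i‖_2, which is what lets least-squares-style results transfer to logistic regression.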
FrankWolfe.jl: A High-Performance and Flexible Toolbox for Frank–Wolfe Algorithms and Conditional Gradients
- Computer Science · INFORMS Journal on Computing
- 2022
FrankWolfe.jl is an open-source implementation of several popular Frank–Wolfe and conditional-gradient variants for first-order constrained optimization, allowing for easy extension and relying on few assumptions about the user-provided functions.
A Newton Frank-Wolfe Method for Constrained Self-Concordant Minimization
- Computer Science · Journal of Global Optimization
- 2022
We demonstrate how to scalably solve a class of constrained self-concordant minimization problems using linear minimization oracles (LMO) over the constraint set. We prove that the number of LMO…
Self-concordant analysis of Frank-Wolfe algorithms
- Computer Science, Mathematics · ICML
- 2020
The theory of SC functions is used to provide a new adaptive step size for FW methods and to prove a global convergence rate of O(1/k) after k iterations; if the problem admits a stronger local linear minimization oracle, a novel FW method achieves a linear convergence rate for SC functions.
Linearly Convergent Frank-Wolfe with Backtracking Line-Search.
- Computer Science
- 2018
Variants of Away-Steps and Pairwise FW are proposed that lift both restrictions simultaneously and inherit all the favorable convergence properties of the exact line-search version, including linear convergence for strongly convex functions over polytopes.
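One common form of such a backtracking rule maintains a running estimate L of the local smoothness constant and tests a quadratic sufficient-decrease condition; the sketch below is illustrative (the doubling/halving schedule is an assumption, not necessarily the cited paper's exact rule):

```python
import numpy as np

def fw_backtracking_step(f, g, x, v, L):
    """One Frank-Wolfe step with backtracking on the smoothness estimate L.

    v is the LMO output at x, so the Frank-Wolfe gap <g, x - v> is >= 0.
    """
    d = v - x
    gap = g @ (x - v)                              # Frank-Wolfe gap
    if gap <= 0.0:
        return x, L                                # x is (approximately) optimal
    while True:
        gamma = min(1.0, gap / (L * (d @ d)))      # minimizer of the quadratic model on [0, 1]
        # Sufficient decrease w.r.t. the model f(x) + gamma*<g, d> + (L/2)*gamma^2*||d||^2:
        if f(x + gamma * d) <= f(x) - gamma * gap + 0.5 * L * gamma**2 * (d @ d):
            return x + gamma * d, L / 2.0          # accept step; optimistically shrink L
        L *= 2.0                                   # model violated: increase L and retry
```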
Some comments on Wolfe's ‘away step’
- Mathematics · Mathematical Programming
- 1986
We give a detailed proof, under slightly weaker conditions on the objective function, that a modified Frank-Wolfe algorithm based on Wolfe's ‘away step’ strategy can achieve geometric convergence,…
Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization
- Computer Science · ICML
- 2013
A new general framework for convex optimization over matrix factorizations, where every Frank-Wolfe iteration will consist of a low-rank update, is presented, and the broad application areas of this approach are discussed.
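The low-rank-update property is easiest to see over the nuclear-norm ball, where the LMO returns a rank-one matrix; a minimal NumPy illustration (tau denotes an assumed nuclear-norm radius):

```python
import numpy as np

def nuclear_ball_lmo(G, tau):
    """LMO over {V : ||V||_* <= tau}: the minimizer of <G, V> is
    -tau * u1 v1^T, with (u1, v1) the top singular vectors of G."""
    U, _, Vt = np.linalg.svd(G)    # only the top pair is needed; full SVD for simplicity
    return -tau * np.outer(U[:, 0], Vt[0, :])
```

Since each iterate is a convex combination of such rank-one atoms, the iterate after t steps has rank at most t + 1, which is the low-rank update the summary refers to.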