Corpus ID: 235247881

Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions

@inproceedings{Carderera2021SimpleSA,
  title={Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions},
  author={Alejandro Carderera and Mathieu Besançon and Sebastian Pokutta},
  booktitle={NeurIPS},
  year={2021}
}
Generalized self-concordance is a key property present in the objective function of many important learning problems. We establish the convergence rate of a simple Frank-Wolfe variant that uses the open-loop step size strategy γ_t = 2/(t + 2), obtaining an O(1/t) convergence rate for this class of functions in terms of both the primal gap and the Frank-Wolfe gap, where t is the iteration count. This avoids the use of second-order information or the need to estimate local smoothness parameters of previous work…
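For concreteness, below is a minimal sketch of the variant the abstract describes, assuming a toy logistic-regression instance over the probability simplex (the logistic loss is a standard example of a generalized self-concordant function). The data, dimensions, and stopping tolerance are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch, not the authors' implementation: vanilla Frank-Wolfe with
# the open-loop step size gamma_t = 2/(t + 2) from the abstract, applied to a
# logistic-loss objective over the probability simplex.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 50))           # illustrative feature matrix
y = rng.choice([-1.0, 1.0], size=200)    # illustrative binary labels

def grad(x):
    # Gradient of f(x) = mean_i log(1 + exp(-y_i * <a_i, x>)).
    margins = y * (A @ x)
    return A.T @ (-y / (1.0 + np.exp(margins))) / len(y)

def lmo_simplex(g):
    # Linear minimization oracle over the unit simplex: best vertex for <g, .>.
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0
    return v

x = np.ones(50) / 50                     # feasible starting point
for t in range(1000):
    g = grad(x)
    v = lmo_simplex(g)
    fw_gap = g @ (x - v)                 # Frank-Wolfe gap: bounds the primal gap
    if fw_gap <= 1e-6:                   # illustrative stopping tolerance
        break
    gamma = 2.0 / (t + 2)                # the "simple step": no line search,
    x += gamma * (v - x)                 # no smoothness estimates, no Hessians
```

The only algorithmic ingredients are the linear minimization oracle and the deterministic step size, which is what makes the method projection-free and free of parameters to tune.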
Citations

Generalized self-concordant analysis of Frank–Wolfe algorithms
TLDR
This paper closes the apparent gap in the literature by developing provably convergent Frank–Wolfe algorithms with standard O(1/k) convergence rate guarantees, and shows how these sublinearly convergent methods can be accelerated to yield linearly convergent projection-free methods.
Frank-Wolfe-based Algorithms for Approximating Tyler's M-estimator
TLDR
This work proposes, to the best of the authors' knowledge, the first Frank-Wolfe-based algorithms for computing Tyler's M-estimator; all three proposed variants are parameter-free and use adaptive step-sizes.
Interpretable Neural Networks with Frank-Wolfe: Sparse Relevance Maps and Relevance Orderings
TLDR
A novel multi-rate as well as a relevance-ordering variant of RDE are proposed, both of which empirically outperform standard RDE and other baseline methods in a well-established comparison test.

References

Showing 1-10 of 47 references
Generalized self-concordant functions: a recipe for Newton-type methods
TLDR
The proposed theory provides a mathematical tool to analyze both local and global convergence of Newton-type methods without imposing unverifiable assumptions as long as the underlying functionals fall into the class of generalized self-concordant functions.
Self-concordant analysis for logistic regression
TLDR
This paper uses and extends tools from the convex optimization literature, namely self-concordant functions, to provide simple extensions of theoretical results for the square loss to the logistic loss, showing that new results for binary classification through logistic regression can be easily derived from corresponding results for least-squares regression.
FrankWolfe.jl: A High-Performance and Flexible Toolbox for Frank–Wolfe Algorithms and Conditional Gradients
TLDR
FrankWolfe.jl is an open-source implementation of several popular Frank–Wolfe and conditional gradients variants for first-order constrained optimization, allowing for easy extension and relying on few assumptions regarding the user-provided functions.
A Newton Frank-Wolfe Method for Constrained Self-Concordant Minimization
We demonstrate how to scalably solve a class of constrained self-concordant minimization problems using linear minimization oracles (LMO) over the constraint set. We prove that the number of LMO…
Self-concordant analysis of Frank-Wolfe algorithms
TLDR
The theory of SC functions is used to provide a new adaptive step size for FW methods and to prove a global convergence rate of O(1/k) after k iterations; if the problem admits a stronger local linear minimization oracle, a novel FW method with a linear convergence rate for SC functions is obtained.
Linearly Convergent Frank-Wolfe with Backtracking Line-Search
TLDR
Variants of Away-steps and Pairwise FW are proposed that lift both restrictions simultaneously and inherit all the favorable convergence properties of the exact line-search version, including linear convergence for strongly convex functions over polytopes.
Some comments on Wolfe's ‘away step’
We give a detailed proof, under slightly weaker conditions on the objective function, that a modified Frank-Wolfe algorithm based on Wolfe's ‘away step’ strategy can achieve geometric convergence…
Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization
TLDR
A new general framework for convex optimization over matrix factorizations, where every Frank-Wolfe iteration will consist of a low-rank update, is presented, and the broad application areas of this approach are discussed.