
Frank–Wolfe algorithm

Known as: Conditional gradient method, Frank-Wolfe, Frank-Wolfe algorithm 
The Frank–Wolfe algorithm is an iterative first-order optimization algorithm for constrained convex optimization. It is also known as the conditional gradient method.
Source: Wikipedia
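The defining feature of the method is that each iteration solves a *linear* subproblem over the feasible set instead of projecting onto it. A minimal sketch (not taken from any of the papers listed below; the function name and the least-squares-over-the-simplex example are illustrative) might look like this, using the classical 2/(k+2) step size:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=1000):
    """Frank-Wolfe over the probability simplex.

    The linear minimization oracle over the simplex is trivial:
    the minimizer of <g, s> over the simplex is the vertex e_i
    whose gradient coordinate g_i is smallest.
    """
    x = x0.copy()
    for k in range(n_iters):
        g = grad(x)
        i = int(np.argmin(g))          # LMO: best vertex of the simplex
        s = np.zeros_like(x)
        s[i] = 1.0
        gamma = 2.0 / (k + 2.0)        # classical diminishing step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Illustrative problem: least squares over the simplex,
# f(x) = 0.5 * ||A x - b||^2, with b chosen so the optimum lies in the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
target = np.array([0.2, 0.3, 0.1, 0.25, 0.15])
b = A @ target
grad = lambda x: A.T @ (A @ x - b)
x = frank_wolfe_simplex(grad, np.full(5, 0.2))
```

Because every iterate is a convex combination of simplex vertices, the method is projection-free: feasibility is maintained by construction, which is the main draw when projection onto the constraint set is expensive (e.g. the trace-norm ball studied in the 2017 paper below).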

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2019
One of the beauties of the projected gradient descent method lies in its rather simple mechanism and yet stable behavior with… 
Highly Cited
2018
Depending on how much information an adversary can access, adversarial attacks can be classified as white-box attacks and black… 
2018
Learning a deep neural network requires solving a challenging optimization problem: it is high-dimensional, non-convex and non… 
2017
We propose a rank-$k$ variant of the classical Frank-Wolfe algorithm to solve convex optimization over a trace-norm ball. Our… 
Highly Cited
2016
We study Frank-Wolfe methods for nonconvex stochastic and finite-sum optimization problems. Frank-Wolfe methods (in the convex… 
Highly Cited
2014
The Frank-Wolfe method (a.k.a. conditional gradient algorithm) for smooth optimization has regained much interest in recent years… 
2014
We study parallel and distributed Frank-Wolfe algorithms; the former on shared memory machines with mini-batching, and the latter… 
Highly Cited
2013
We present versions of the Frank-Wolfe method for linearly constrained convex programs, in which consecutive search directions… 
Highly Cited
2009
This article combines techniques from two fields of applied mathematics: optimization theory and inverse problems. We… 
Highly Cited
1985
We discuss methods for speeding up convergence of the Frank-Wolfe algorithm for solving nonlinear convex programs. Models…