# Coresets, sparse greedy approximation, and the Frank-Wolfe algorithm

```bibtex
@article{Clarkson2008CoresetsSG,
  title={Coresets, sparse greedy approximation, and the Frank-Wolfe algorithm},
  author={Kenneth L. Clarkson},
  journal={ACM Trans. Algorithms},
  year={2008},
  volume={6},
  pages={63:1-63:30}
}
```
• K. Clarkson
• Published 20 January 2008
• Computer Science
• ACM Trans. Algorithms
The problem of maximizing a concave function *f*(*x*) in a simplex *S* can be solved approximately by a simple greedy algorithm. For given *k*, the algorithm can find a point *x*(*k*) on a *k*-dimensional face of *S*, such that *f*(*x*(*k*)) ≥ *f*(*x*\*) − *O*(1/*k*). Here *f*(*x*\*) is the maximum value of *f* in *S*. This algorithm and analysis were known before, and related to problems of statistics…
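The greedy iteration the abstract describes can be sketched as follows. This is a minimal illustration, not Clarkson's exact pseudocode: each step moves toward the vertex of the simplex with the largest gradient coordinate, so after *k* steps the iterate has at most *k* + 1 nonzero coordinates (it lies on a low-dimensional face), and the standard 2/(*t* + 2) step size gives the *O*(1/*k*) suboptimality gap. The test objective and the point `c` are illustrative choices.

```python
import numpy as np

def frank_wolfe_simplex(grad, n, k):
    """Greedy (Frank-Wolfe) maximization of a concave f over the
    probability simplex, given only its gradient oracle `grad`."""
    x = np.zeros(n)
    x[0] = 1.0                      # start at a vertex of the simplex
    for t in range(1, k + 1):
        g = grad(x)
        i = int(np.argmax(g))       # linear maximization over the simplex
                                    # is attained at a vertex e_i
        gamma = 2.0 / (t + 2)       # standard step size, yields O(1/k) gap
        x = (1 - gamma) * x
        x[i] += gamma               # convex step toward vertex e_i;
                                    # adds at most one nonzero per step
    return x

# Example: f(x) = -||x - c||^2 is concave; since c lies in the simplex,
# the maximizer is x* = c with f(x*) = 0.
c = np.array([0.5, 0.3, 0.2, 0.0])
grad = lambda x: -2.0 * (x - c)
x = frank_wolfe_simplex(grad, n=4, k=200)
```

Note the sparsity guarantee comes for free: the iterate is a convex combination of the starting vertex and the *k* vertices chosen so far.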
## 412 Citations

### New approximation algorithms for minimum enclosing convex shapes

• Computer Science
SODA '11
• 2011
Two approximation algorithms are given for producing an enclosing ball whose radius is at most ε away from the optimum; they borrow heavily from convex duality and recently developed techniques in non-smooth optimization, in contrast with existing methods that rely on geometric arguments.

### Sparse Approximate Conic Hulls

• Computer Science, Mathematics
NIPS
• 2017
An approximate conic Carathéodory theorem is proved: a general sparsity result showing that any column of X can be ε-approximated with an O(1/ε²)-sparse combination from S, yielding the first provable polynomial-time ε-approximation for this class of NMF problems.

### Analysis of the Frank–Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier

• Computer Science
Mathematical Programming
• 2022
A new generalized Frank–Wolfe method for the composite optimization problem is presented and analyzed.

### Robust vertex enumeration for convex hulls in high dimensions

• Computer Science
AISTATS
• 2018
The All Vertex Triangle Algorithm is presented, a robust and efficient algorithm for this problem that computes an approximation to the subset S̄ of all K vertices of the convex hull of S, so that the convex hull of the approximate subset of vertices is as close to conv(S) as desired.

### Dropping Convexity for Faster Semi-definite Optimization

• Mathematics, Computer Science
COLT
• 2016
This is the first paper to provide precise convergence rate guarantees for general convex functions under standard convexity assumptions, and to provide a procedure to initialize FGD for (restricted) strongly convex objectives when one only has access to f via a first-order oracle.

### Bayesian Model Averaging With Exponentiated Least Squares Loss

• Computer Science
IEEE Transactions on Information Theory
• 2018
A primal–dual relationship of this Bayes estimator for deviation-optimal model averaging using exponentiated least squares loss is established, and new algorithms that satisfactorily resolve the limitations of Q-aggregation are proposed.

### Approximation and Streaming Algorithms for Projective Clustering via Random Projections

• Computer Science, Mathematics
CCCG
• 2015
A dimension reduction result is obtained that shows how to compute an ε-approximate projective clustering for every k and q simultaneously, using only O((n + d)((q + 1)² log(1/ε)/ε)³ log n) space.

### Frank Wolfe Meets Metric Entropy

This paper provides a general technique for establishing domain-specific and easy-to-estimate lower bounds for Frank-Wolfe and its variants using the metric entropy of the domain, and shows that a dimension-free linear upper bound must fail not only in the worst case, but in the average case.

### Sparse convex optimization methods for machine learning

A convergence proof guaranteeing ε-small error after O(1/ε) iterations is given, and the sparsity of approximate solutions for any ℓ1-regularized convex optimization problem (and for optimization over the simplex) is expressed as a function of the approximation quality.

### Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization

A new general framework for convex optimization over matrix factorizations, where every Frank-Wolfe iteration will consist of a low-rank update, is presented, and the broad application areas of this approach are discussed.
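For intuition about what a "low-rank update" per Frank-Wolfe iteration means here, the following is a minimal NumPy sketch (assumed from the standard nuclear-norm-ball setting, not code from the paper): the linear subproblem over {X : ‖X‖\* ≤ τ} is solved by the top singular vector pair of the negated gradient, so every iterate changes by a rank-1 matrix. The quadratic objective, the target `M`, and τ are illustrative choices.

```python
import numpy as np

def fw_nuclear_norm(grad, shape, tau, iters):
    """Frank-Wolfe over the nuclear-norm ball {X : ||X||_* <= tau},
    minimizing a smooth f given by its gradient oracle `grad`."""
    X = np.zeros(shape)
    for t in range(iters):
        G = grad(X)
        # The linear minimization oracle over the nuclear-norm ball is
        # the top singular pair of -G: a rank-1 vertex of the ball.
        U, s, Vt = np.linalg.svd(-G)
        S = tau * np.outer(U[:, 0], Vt[0, :])
        gamma = 2.0 / (t + 2)
        X = (1 - gamma) * X + gamma * S      # rank-1 (low-rank) update
    return X

# Example: recover a rank-1 target M by minimizing ||X - M||_F^2 / 2
# inside a nuclear-norm ball large enough to contain M.
M = np.outer([1.0, 2.0], [3.0, 1.0, 0.5])
tau = 2.0 * np.linalg.norm(M, ord='nuc')
grad = lambda X: X - M
X = fw_nuclear_norm(grad, M.shape, tau, iters=300)
```

After k iterations the iterate is a combination of at most k rank-1 atoms, which is the sparsity/low-rank structure the framework exploits.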