Coresets, sparse greedy approximation, and the Frank-Wolfe algorithm

@article{Clarkson2008CoresetsSG,
  title={Coresets, sparse greedy approximation, and the Frank-Wolfe algorithm},
  author={Kenneth L. Clarkson},
  journal={ACM Trans. Algorithms},
  year={2008},
  volume={6},
  pages={63:1-63:30}
}
  • Published 20 January 2008
The problem of maximizing a concave function f(x) in a simplex S can be solved approximately by a simple greedy algorithm. For given k, the algorithm can find a point x(k) on a k-dimensional face of S, such that f(x(k)) ≥ f(x*) − O(1/k). Here f(x*) is the maximum value of f in S. This algorithm and analysis were known before, and related to problems of statistics…
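The greedy method described in the abstract is the Frank-Wolfe iteration. A minimal sketch over the probability simplex, assuming gradient access and the conventional 2/(t+2) step size (choices of this sketch, not details taken from the paper):

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, k):
    """Greedy (Frank-Wolfe) maximization of a concave f over the simplex.

    Each step moves toward the vertex e_i that maximizes the linear
    approximation, so after k steps the iterate is supported on at most
    k + 1 vertices -- i.e., it lies on a low-dimensional face of S.
    """
    x = x0.copy()
    for t in range(1, k + 1):
        g = grad(x)
        i = int(np.argmax(g))        # best vertex for the linearized objective
        gamma = 2.0 / (t + 2)        # standard diminishing step size
        e = np.zeros_like(x)
        e[i] = 1.0
        x = (1 - gamma) * x + gamma * e   # convex combination stays in S
    return x
```

With a concave quadratic such as f(x) = −‖x − p‖² for a point p in the simplex, the iterate approaches the maximizer at the O(1/k) rate stated above.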


New approximation algorithms for minimum enclosing convex shapes

TLDR
Two approximation algorithms are given for producing an enclosing ball whose radius is at most ε away from the optimum; they borrow heavily from convex duality and recently developed techniques in non-smooth optimization, in contrast with existing methods that rely on geometric arguments.

Sparse Approximate Conic Hulls

TLDR
An approximate conic Carathéodory theorem is proved: a general sparsity result showing that any column of X can be ε-approximated by an O(1/ε²)-sparse combination from S, yielding the first provable polynomial-time ε-approximation for this class of NMF problems.

Analysis of the Frank–Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier

TLDR
A new generalized Frank–Wolfe method for the composite optimization problem is presented and analyzed.

Robust vertex enumeration for convex hulls in high dimensions

TLDR
The All Vertex Triangle Algorithm is presented, a robust and efficient algorithm for this problem that computes an approximation to the subset of all K vertices of the convex hull of S, so that the convex hull of the approximate subset of vertices is as close to conv(S) as desired.

Dropping Convexity for Faster Semi-definite Optimization

TLDR
This is the first paper to provide precise convergence-rate guarantees for general convex functions under standard convexity assumptions, and to provide a procedure to initialize FGD for (restricted) strongly convex objectives when one only has access to f via a first-order oracle.

Bayesian Model Averaging With Exponentiated Least Squares Loss

TLDR
A primal-dual relationship of this Bayes estimator for deviation-optimal model averaging using exponentiated least squares loss is established, and new algorithms are proposed that satisfactorily resolve the limitations of inline-formula-aggregation.

Approximation and Streaming Algorithms for Projective Clustering via Random Projections

TLDR
A dimension reduction result is obtained that shows how to compute an ε-approximate projective clustering for every k simultaneously using only O((n + d)((q + 1)² log(1/ε))/ε³ · log n) space.

Frank Wolfe Meets Metric Entropy

TLDR
This paper provides a general technique for establishing domain-specific and easy-to-estimate lower bounds for Frank-Wolfe and its variants using the metric entropy of the domain, and shows that a dimension-free linear upper bound must fail not only in the worst case, but in the average case.

Sparse convex optimization methods for machine learning

TLDR
A convergence proof guaranteeing ε-small error after O(1/ε) iterations is given, along with bounds on the sparsity of approximate solutions for any ℓ1-regularized convex optimization problem (and for optimization over the simplex), expressed as a function of the approximation quality.

Revisiting Frank-Wolfe: Projection-Free Sparse Convex Optimization

TLDR
A new general framework for convex optimization over matrix factorizations, where every Frank-Wolfe iteration will consist of a low-rank update, is presented, and the broad application areas of this approach are discussed.
...

References

SHOWING 1-10 OF 49 REFERENCES

Smaller core-sets for balls

TLDR
It is shown that any point set has an ε-core-set of size ⌈2/ε⌉, and a fast algorithm is given that finds this core-set; this implies the existence of small core-sets for solving approximate k-center clustering and related problems.
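A fast core-set construction of this flavor can be illustrated by a farthest-point iteration in the Bădoiu–Clarkson style; the ⌈1/ε²⌉ step count and the 1/(i+1) center update below are assumptions of this sketch, not the paper's exact procedure or its sharper ⌈2/ε⌉ bound:

```python
import numpy as np

def meb_coreset(points, eps):
    """Approximate minimum enclosing ball via farthest-point iteration.

    Repeatedly finds the point farthest from the current center and moves
    the center a 1/(i+1) fraction toward it; the touched points form a
    small core-set whose enclosing ball covers the whole set.
    """
    c = points[0].astype(float).copy()
    core = {0}
    steps = int(np.ceil(1.0 / eps ** 2))
    for i in range(1, steps + 1):
        d = np.linalg.norm(points - c, axis=1)
        j = int(np.argmax(d))              # farthest point from current center
        core.add(j)
        c = c + (points[j] - c) / (i + 1)  # shrinking step toward farthest point
    r = float(np.linalg.norm(points - c, axis=1).max())
    return c, r, sorted(core)
```

On the corners of the unit square, for example, the center converges toward (0.5, 0.5) and the returned radius approaches the optimal √2/2.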

Applications of weighted Voronoi diagrams and randomization to variance-based k-clustering: (extended abstract)

TLDR
The optimum solution to the k-clustering problem is characterized by the ordinary Euclidean Voronoi diagram and the weighted Voronoi diagram with both multiplicative and additive weights.

A probabilistic algorithm for the post office problem

TLDR
The algorithm employs random sampling, so the expected time bound holds for any set of points, and it approaches the preprocessing time required for any algorithm that constructs the Voronoi diagram of the input points.

Coresets for polytope distance

TLDR
The coreset framework is translated to the problems of finding the point closest to the origin inside a polytope, finding the shortest distance between two polytopes, Perceptrons, and soft- as well as hard-margin Support Vector Machines (SVM).

A Randomized Algorithm for Closest-Point Queries

TLDR
This result approaches the Ω(n^⌈d/2⌉) worst-case time required for any algorithm that constructs the Voronoi...

Further applications of random sampling to computational geometry

TLDR
This paper gives several new demonstrations of the usefulness of random sampling techniques in computational geometry by creating a search structure for arrangements of hyperplanes by sampling the hyperplanes and using information from the resulting arrangement to divide and conquer.

The Computational Complexity of Densest Region Detection

TLDR
A formal learning model for this task that uses a hypothesis class as its "anti-overfitting" mechanism is introduced, and it is shown that for some constants, depending on the hypothesis class, these problems are NP-hard to approximate to within these constant factors.

Rounding of Polytopes in the Real Number Model of Computation

TLDR
It is shown that the problem of (1 + ε)n-rounding of A can be solved in O(m^3.5 ln(m ε^−1)) operations to a relative accuracy of ε in the volume, and that the bounds hold for the real number model of computation.

Minimum-Volume Enclosing Ellipsoids and Core Sets

TLDR
A modification of Khachiyan's first-order algorithm is proposed with the property that the minimum-volume enclosing ellipsoid of the point set X provides a good approximation to that of S, where the size of X depends only on the dimension d and ε, but not on the number of points n.

Relating Data Compression and Learnability

TLDR
It is demonstrated that the existence of a suitable data compression scheme is sufficient to ensure learnability, and the introduced compression scheme provides a rigorous model for studying data compression in connection with machine learning.