Sublinear Optimization for Machine Learning

Abstract

In this article we describe and analyze sublinear-time approximation algorithms for some optimization problems arising in machine learning, such as training linear classifiers and finding minimum enclosing balls. Our algorithms can be extended to some kernelized versions of these problems, such as SVDD, hard-margin SVM, and L2-SVM, for which sublinear-time algorithms were not known before. These new algorithms use a combination of novel sampling techniques and a new multiplicative update algorithm. We give lower bounds which show the running times of many of our algorithms to be nearly best possible in the unit-cost RAM model.
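To make the abstract's "sampling plus multiplicative update" combination concrete, here is a minimal, illustrative sketch in that spirit: a primal-dual scheme that maintains a multiplicative-weights distribution over the examples and estimates margins by sampling a single coordinate of the iterate (ℓ2 sampling), so each iteration reads O(n + d) numbers rather than the full n × d matrix. All names, parameters, and constants below are our own choices for illustration, not the paper's exact algorithm or guarantees.

```python
import numpy as np

def sublinear_perceptron(X, T=1000, eta=0.1, seed=0):
    """Illustrative sketch (not the paper's exact algorithm): train an
    approximate linear classifier on unit-norm examples X with labels
    folded in, i.e. a correct classifier w satisfies X @ w > 0.
    Multiplicative weights maintain a dual distribution p over the n
    examples; margins are estimated by l2-sampling one coordinate of w."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    p = np.ones(n) / n            # dual distribution over examples
    w = np.zeros(d)
    w_sum = np.zeros(d)
    for _ in range(T):
        # Primal step: sample one example from p, take a gradient step,
        # and project back onto the unit ball.
        i = rng.choice(n, p=p)
        w = w + X[i] / np.sqrt(T)
        norm = np.linalg.norm(w)
        if norm > 1.0:
            w = w / norm
        w_sum += w
        # Dual step: estimate each margin x_j . w from a single
        # l2-sampled coordinate of w (an unbiased estimator), clip the
        # estimate for variance control, then reweight multiplicatively
        # so that low-margin (hard) examples gain probability mass.
        sq = w ** 2
        q = sq / sq.sum()
        ks = rng.choice(d, size=n, p=q)
        est = np.clip(X[np.arange(n), ks] * sq.sum() / w[ks], -1.0, 1.0)
        p = p * np.exp(-eta * est)
        p = p / p.sum()
    return w_sum / T              # averaged iterate
```

On a well-separated dataset the averaged iterate classifies most examples correctly, even though each margin estimate uses only one coordinate of w; the clipping step is one standard way to control the variance that ℓ2 sampling introduces.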

DOI: 10.1145/2371656.2371658



Cite this paper

@article{Clarkson2010SublinearOF,
  title   = {Sublinear Optimization for Machine Learning},
  author  = {Kenneth L. Clarkson and Elad Hazan and David P. Woodruff},
  journal = {2010 IEEE 51st Annual Symposium on Foundations of Computer Science},
  year    = {2010},
  pages   = {449-457}
}