Trading convexity for scalability

Ronan Collobert, Fabian H. Sinz, Jason Weston, Léon Bottou
Convex learning algorithms, such as Support Vector Machines (SVMs), are often seen as highly desirable because they offer strong practical properties and are amenable to theoretical analysis. However, this work shows that non-convexity can provide scalability advantages over convexity. Using concave-convex programming, we produce (i) faster SVMs in which training errors are no longer support vectors, and (ii) much faster Transductive SVMs.
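The mechanism behind claim (i) can be sketched as follows. The ramp loss R_s(z) = H_1(z) - H_s(z) is written as a convex hinge minus a shifted hinge; the concave-convex procedure (CCCP) repeatedly linearizes the concave part at the current solution and solves the remaining convex weighted-hinge problem. The sketch below is an illustrative reconstruction, not the authors' code: the solver for the convex subproblem (plain subgradient descent), the toy data, and all parameter values are assumptions for demonstration.

```python
import numpy as np

def cccp_ramp_svm(X, y, C=1.0, s=-0.5, lr=1e-3, inner=500, outer=5):
    """Linear SVM with the non-convex ramp loss R_s(z) = H_1(z) - H_s(z),
    trained by the concave-convex procedure (CCCP).

    Each outer step linearizes the concave part (-H_s) at the current
    solution, giving beta_i = C for points whose margin y_i f(x_i) < s.
    In the convex subproblem those beta_i cancel the hinge subgradient,
    so gross training errors stop acting as support vectors -- the
    scalability effect described in the abstract.  Illustrative sketch,
    solved here by plain subgradient descent (an assumption)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(outer):
        # Linearization of the concave part, fixed for this outer step.
        beta = C * (y * (X @ w + b) < s)
        for _ in range(inner):
            m = y * (X @ w + b)                 # current margins
            coef = -C * (m < 1.0) + beta        # per-point subgradient factor
            w -= lr * (w + X.T @ (coef * y))    # regularizer + loss terms
            b -= lr * np.sum(coef * y)
    return w, b

# Hypothetical toy problem: two separable blobs plus one gross outlier
# (a mislabeled point deep inside the negative class).
rng = np.random.default_rng(0)
Xp = rng.normal([2.0, 2.0], 0.5, size=(20, 2))
Xn = rng.normal([-2.0, -2.0], 0.5, size=(20, 2))
X = np.vstack([Xp, Xn, [[-3.0, -3.0]]])
y = np.array([1.0] * 20 + [-1.0] * 20 + [1.0])  # last label is wrong

w, b = cccp_ramp_svm(X, y)
clean_acc = np.mean(np.sign(X[:40] @ w + b) == y[:40])
outlier_margin = y[-1] * (X[-1] @ w + b)
```

After training, `outlier_margin` falls below the ramp threshold s, so the outlier receives beta = C and drops out of the support set, while the 40 clean points are still separated.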
This paper has highly influenced 38 other papers and has an estimated 363 citations (Semantic Scholar).


