Regularized Least-Squares Classification

@inproceedings{Rifkin2007RegularizedLC,
  title={Regularized Least-Squares Classification},
  author={Ryan Rifkin and Geon-Min Yeo and Tomaso A. Poggio},
  year={2007}
}
We consider the solution of binary classification problems via Tikhonov regularization in a Reproducing Kernel Hilbert Space using the square loss, and denote the resulting algorithm Regularized Least-Squares Classification (RLSC). We sketch the historical developments that led to this algorithm, and demonstrate empirically that its performance is equivalent to that of the well-known Support Vector Machine on several datasets. Whereas training an SVM requires solving a convex quadratic program…
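
For readers who want the mechanics behind the abstract, the following is a minimal sketch of RLSC as formulated there: Tikhonov regularization with the square loss in an RKHS, which by the Representer Theorem reduces training to a single linear system (K + lam*n*I) c = y in the expansion coefficients, rather than the convex quadratic program an SVM requires. The Gaussian (RBF) kernel and the values of gamma and lam below are illustrative assumptions, not choices taken from the paper.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # K[i, j] = exp(-gamma * ||A[i] - B[j]||^2); an assumed kernel choice.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def rlsc_train(X, y, lam=1e-2, gamma=1.0):
    # Minimizing (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2 over the RKHS
    # gives f(x) = sum_i c_i k(x_i, x), with the coefficients c solving
    # (K + lam * n * I) c = y, where y has entries in {-1, +1}.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def rlsc_predict(X_train, c, X_test, gamma=1.0):
    # Classify a new point by the sign of the kernel expansion.
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ c)

Forming and solving the dense n-by-n system costs O(n^2) memory and O(n^3) time with a direct solver, a practical consideration when weighing RLSC training against the SVM's quadratic program.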