SVMTorch: Support Vector Machines for Large-Scale Regression Problems

@article{Collobert2001SVMTorchSV,
  title={SVMTorch: Support Vector Machines for Large-Scale Regression Problems},
  author={Ronan Collobert and Samy Bengio},
  journal={J. Mach. Learn. Res.},
  year={2001},
  volume={1},
  pages={143--160}
}
Keywords: learning. Reference: EPFL-REPORT-82604. URL: http://publications.idiap.ch/downloads/reports/2000/rr00-17.pdf (record created 2006-03-10, modified 2017-05-10)

Tables and Topics from this paper

Large Scale Machine Learning
TLDR
This paper presents SVMTorch, a decomposition algorithm in the spirit of SVMLight but adapted to regression problems, and shows experimentally that it can efficiently train support vector machines on large-scale regression datasets.
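The ε-SVR formulation that SVMTorch trains can be illustrated with scikit-learn's SVR, a different solver used here purely as a sketch; the toy data, kernel, and parameter values below are illustrative assumptions, not from the paper.

```python
# Illustrative sketch only: scikit-learn's SVR solves the same
# epsilon-SVR formulation that SVMTorch targets, with a different solver.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# epsilon sets the width of the insensitive tube; C the regularization trade-off.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y)

print(len(model.support_))  # number of support vectors retained
```

Only points on or outside the ε-tube become support vectors, which is why the count stays well below the dataset size on smooth data.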
Online training of Support Vector Regression
  • Haisheng Li
  • Computer Science
  • 2010 Sixth International Conference on Natural Computation
  • 2010
TLDR
An online support vector regression (OSVR) algorithm is presented for regression problems whose input data arrive in sequence rather than in batch; compared with existing algorithms, OSVR converges much faster and yields fewer support vectors and better generalization performance.
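The OSVR algorithm itself is not reproduced here; as a loose stand-in for the "data supplied in sequence" setting, scikit-learn's SGDRegressor with an ε-insensitive loss accepts incremental batches via partial_fit. The linear model, weights, and batch sizes below are illustrative assumptions.

```python
# Not the paper's OSVR algorithm: a loose sketch of online
# epsilon-insensitive regression using SGDRegressor.partial_fit,
# which consumes data in sequence rather than in batch.
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(1)
true_w = np.array([1.0, -2.0, 0.5])  # hypothetical target weights
model = SGDRegressor(loss="epsilon_insensitive", epsilon=0.1, random_state=0)

for _ in range(50):  # 50 small batches arriving over time
    X_batch = rng.uniform(-1, 1, size=(20, 3))
    y_batch = X_batch @ true_w + rng.normal(scale=0.05, size=20)
    model.partial_fit(X_batch, y_batch)

print(model.coef_)  # drifts toward true_w as more data arrive
```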
SvmFu: Software For SVMs
Motivation: Because the Q matrix in the SVM training problem is in general dense and of full rank, and has size equal to the number of data points squared, traditional "off the shelf" QP methods are impractical for large problems.
Efficient parameter selection for support vector machines
TLDR
This work investigates the theory that justifies P-SVM for parameter tuning, showing that P-SVM significantly improved accuracy for classifying the business intelligence data and reduces computational time substantially without much loss in accuracy.
Support Vector Machines for Regression: A Succinct Review of Large-Scale and Linear Programming Formulations
TLDR
The most common learning methods for SVRs are introduced, and linear programming-based SVR formulations are explained, emphasizing their suitability for large-scale learning.
An Improved Way to Make Large-Scale SVR Learning Practical
TLDR
A fast training algorithm for simplified support vector regression is described, based on sequential minimal optimization (SMO), which was previously used to train SVM classifiers; the new method is shown to converge considerably faster than other methods that require the presence of a substantial amount of the data in memory.
Active Learning with Support Vector Machines
This thesis examines the use of support vector machines for active learning using linear, polynomial and radial basis function kernels. In our experiments we used named entity recognition, which was …
Supervised Learning by Support Vector Machines
  • G. Steidl
  • Computer Science
  • Handbook of Mathematical Methods in Imaging
  • 2015
TLDR
This chapter gives a brief introduction to the basic concepts of supervised support vector learning and touches on some recent developments in this broad field.
Optimizing F-Measure with Support Vector Machines
TLDR
It is demonstrated that, with the right parameter settings, SVMs approximately optimize F-measure in the same way that SVMs have already been known to approximately optimize accuracy.

References

SHOWING 1-10 OF 20 REFERENCES
On the Convergence of SVMTorch, an Algorithm for Large-Scale Regression Problems
TLDR
The asymptotic convergence of the decomposition algorithm implemented in SVMTorch for large-scale regression problems is proved.
Support Vector Regression Machines
TLDR
This work compares support vector regression (SVR) with a committee regression technique (bagging) based on regression trees and with ridge regression done in feature space, and argues that SVR will have advantages in high-dimensional spaces because SVR optimization does not depend on the dimensionality of the input space.
Improvements to the SMO algorithm for SVM regression
TLDR
Using clues from the KKT conditions for the dual problem, two threshold parameters are employed to derive modifications of SMO for regression that perform significantly faster than the original SMO on the datasets tried.
Making large-scale support vector machine learning practical
TLDR
This chapter presents algorithmic and computational results developed for SVMlight V2.0, which make large-scale SVM training more practical and give guidelines for the application of SVMs to large domains.
Predicting Time Series with Support Vector Machines
TLDR
Two different cost functions for support vectors are made use of: training with an ε-insensitive loss and with Huber's robust loss function; how to choose the regularization parameters in these models is also discussed.
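The two loss functions compared above are simple to state; here is a minimal sketch, where the tube width `eps` and the Huber threshold `delta` are illustrative defaults, not values from the paper.

```python
import numpy as np

def epsilon_insensitive(residual, eps=0.1):
    """Zero inside the tube |r| <= eps, linear outside it."""
    return np.maximum(0.0, np.abs(residual) - eps)

def huber(residual, delta=1.0):
    """Quadratic for |r| <= delta, linear beyond (robust to outliers)."""
    r = np.abs(residual)
    return np.where(r <= delta, 0.5 * r**2, delta * (r - 0.5 * delta))

r = np.array([-2.0, -0.05, 0.0, 0.5, 2.0])
print(epsilon_insensitive(r))  # values: 1.9, 0, 0, 0.4, 1.9
print(huber(r))                # values: 1.5, 0.00125, 0, 0.125, 1.5
```

Both losses grow only linearly for large residuals, which is what makes them robust compared with a plain squared error.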
An improved training algorithm for support vector machines
  • E. Osuna, R. Freund, F. Girosi
  • Computer Science
  • Neural Networks for Signal Processing VII. Proceedings of the 1997 IEEE Signal Processing Society Workshop
  • 1997
TLDR
This paper presents a decomposition algorithm that is guaranteed to solve the QP problem and that does not make assumptions on the expected number of support vectors.
Fast training of support vector machines using sequential minimal optimization (in Advances in Kernel Methods)
TLDR
SMO breaks this large quadratic programming problem into a series of smallest-possible QP problems, which avoids using a time-consuming numerical QP optimization as an inner loop; hence SMO is fastest for linear SVMs and sparse data sets.
Improvements to Platt's SMO Algorithm for SVM Classifier Design
TLDR
Using clues from the KKT conditions for the dual problem, two threshold parameters are employed to derive modifications of SMO that perform significantly faster than the original SMO on all benchmark data sets tried.
An Improved Decomposition Algorithm for Regression Support Vector Machines
  • P. Laskov
  • Mathematics, Computer Science
  • NIPS
  • 1999
TLDR
A new decomposition algorithm for training regression Support Vector Machines (SVM) that builds on the basic principles of decomposition proposed by Osuna et al. is presented.
On the convergence of the decomposition method for support vector machines
  • Chih-Jen Lin
  • Computer Science, Medicine
  • IEEE Trans. Neural Networks
  • 2001
TLDR
The asymptotic convergence of the algorithm used by the software SVMlight and other later implementations is proved, and the size of the working set can be any even number.