Support Vector Regression Machines

Abstract

A new regression technique based on Vapnik’s concept of support vectors is introduced. We compare support vector regression (SVR) with a committee regression technique (bagging) based on regression trees and with ridge regression done in feature space. On the basis of these experiments, it is expected that SVR will have advantages in high-dimensional spaces because SVR optimization does not depend on the dimensionality of the input space. This is a longer version of the paper to appear in Advances in Neural Information Processing Systems 9 (proceedings of the 1996 conference).
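A minimal sketch of the comparison the abstract describes, using scikit-learn as a modern stand-in for the paper's methods (the synthetic data, kernel, and hyperparameters here are illustrative assumptions, not the authors' experimental setup): epsilon-SVR with an RBF kernel versus bagged regression trees and ridge regression on a high-dimensional input.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import Ridge
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic high-dimensional regression problem (30 inputs, only the
# first one is informative) -- purely illustrative, not the paper's data.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(400, 30))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "SVR (RBF kernel)": SVR(kernel="rbf", C=10.0, epsilon=0.05),
    "Bagged regression trees": BaggingRegressor(
        DecisionTreeRegressor(), n_estimators=50, random_state=0
    ),
    "Ridge regression": Ridge(alpha=1.0),
}

# Fit each model and report held-out mean squared error.
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(f"{name}: test MSE = {mse:.4f}")
```

Note that the SVR fit is driven by the kernel matrix between training points, so the cost of its quadratic program grows with the number of examples rather than with the 30 input dimensions, which is the property the abstract highlights.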


Semantic Scholar estimates that this publication has 1,783 citations based on the available data.

See our FAQ for additional information.

Cite this paper

@inproceedings{Drucker1996SupportVR,
  title     = {Support Vector Regression Machines},
  author    = {Harris Drucker and Christopher J. C. Burges and Linda Kaufman and Alexander J. Smola and Vladimir Vapnik},
  booktitle = {NIPS},
  year      = {1996}
}