Robust Truncated Hinge Loss Support Vector Machines

Abstract

The support vector machine (SVM) has been widely applied for classification problems in both machine learning and statistics. Despite its popularity, however, SVM has some drawbacks in certain situations. In particular, the SVM classifier can be very sensitive to outliers in the training sample. Moreover, the number of support vectors (SVs) can be very large in many applications. To circumvent these drawbacks, we propose the robust truncated hinge loss SVM (RSVM), which uses a truncated hinge loss. The RSVM is shown to be more robust to outliers and to deliver more accurate classifiers using a smaller set of SVs than the standard SVM. Our theoretical results show that the RSVM is Fisher-consistent, even when there is no dominating class, a scenario that is particularly challenging for multicategory classification. Similar results are obtained for a class of margin-based classifiers.
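To make the truncation concrete: the paper constructs the truncated hinge loss as a difference of two hinge functions, T_s(u) = H_1(u) - H_s(u), where H_s(u) = [s - u]_+, u = y f(x) is the functional margin, and s <= 0 is the truncation point. For u >= s this coincides with the usual hinge loss; for u < s it is capped at the constant 1 - s, so a grossly misclassified point contributes only a bounded amount. The following minimal Python sketch is our own illustration (function names and the choice s = -1 are ours, not the paper's):

import numpy as np

def hinge(u):
    # Standard hinge loss H_1(u) = max(1 - u, 0), with u = y * f(x).
    return np.maximum(1.0 - u, 0.0)

def truncated_hinge(u, s=-1.0):
    # Truncated hinge loss T_s(u) = H_1(u) - H_s(u) with s <= 0.
    # Matches the hinge loss for u >= s; capped at 1 - s for u < s,
    # so extreme outliers have bounded influence on the fitted classifier.
    return hinge(u) - np.maximum(s - u, 0.0)

margins = np.array([2.0, 0.5, -1.0, -10.0])
print(hinge(margins))            # approx. [0.  0.5  2.  11.]
print(truncated_hinge(margins))  # approx. [0.  0.5  2.   2.]

The observation with margin -10 dominates the hinge loss but is capped under truncation; this bounded influence underlies both the robustness to outliers and the smaller set of SVs reported in the abstract.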


[Figure: Citations per Year, 2007–2017]

142 Citations (Semantic Scholar estimate based on the available data)


Cite this paper

@article{Wu2007RobustTH,
  title   = {Robust Truncated Hinge Loss Support Vector Machines},
  author  = {Yichao Wu and Yufeng Liu},
  journal = {Journal of the American Statistical Association},
  volume  = {102},
  number  = {479},
  pages   = {974--983},
  year    = {2007}
}