Soft-SVM Regression For Binary Classification

Man-Hsin Huang, Luis Carvalho
The binomial deviance and the SVM hinge loss are two of the most widely used loss functions in machine learning. While the two share many similarities, each has its own strengths on different types of data. In this work, we introduce a new exponential family based on a convex relaxation of the hinge loss function using softness and class-separation parameters. This new family, denoted Soft-SVM, allows us to prescribe a generalized linear model that…
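The paper's exact Soft-SVM family is not reproduced in this abstract, but the idea of a convex, smoothed relaxation of the hinge loss with a softness parameter can be illustrated with a generic softplus-smoothed hinge. The function below is a hypothetical sketch under that assumption, not the paper's loss: as the softness parameter shrinks toward zero it recovers the hard hinge, and for positive softness it is smooth everywhere.

```python
import math

def hinge(margin):
    """Classic SVM hinge loss on the margin m = y * f(x), with y in {-1, +1}."""
    return max(0.0, 1.0 - margin)

def soft_hinge(margin, softness=0.5):
    """Softplus-smoothed hinge: softness * log(1 + exp((1 - margin) / softness)).

    'softness' here is only a stand-in for the paper's softness parameter
    (assumed form for illustration, not the Soft-SVM family itself).
    Recovers the hard hinge in the limit softness -> 0.
    """
    z = (1.0 - margin) / softness
    if z > 30:
        # numerically stable branch: log(1 + exp(z)) ~= z for large z
        return softness * z
    return softness * math.log1p(math.exp(z))
```

For any positive softness the smoothed loss upper-bounds the hinge, which is the sense in which it is a convex relaxation.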


Kernel Logistic Regression and the Import Vector Machine

It is shown that the import vector machine (IVM) not only performs as well as the SVM in two-class classification but also generalizes naturally to the multiclass case and provides an estimate of the underlying class probability.

Support Vector Machines and the Bayes Rule in Classification

  • Yi Lin
  • Computer Science
    Data Mining and Knowledge Discovery
  • 2004
It is shown that the asymptotic target of the SVM is a classification function directly related to the Bayes rule; this helps explain the success of SVMs in many classification studies and makes it easier to compare SVMs with traditional statistical methods.

Cross-study validation for the assessment of prediction algorithms

This work develops and implements a systematic approach to 'cross-study validation', to replace or supplement conventional cross-validation when evaluating high-dimensional prediction models in independent datasets, and suggests that standard cross-validation produces inflated discrimination accuracy for all algorithms considered, when compared to cross-study validation.
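The procedure summarized above can be sketched generically: fit on each study and score on every other study, rather than splitting within one study. The helper below is a minimal illustration with hypothetical `fit` and `score` callables supplied by the user; it is not the reference implementation from that work.

```python
def cross_study_validation(studies, fit, score):
    """Fit on each study and evaluate on every other study.

    studies: dict mapping study name -> (X, y)
    fit:     callable (X, y) -> model          (user-supplied, hypothetical)
    score:   callable (model, X, y) -> float   (user-supplied, hypothetical)
    Returns a dict keyed by (train_study, test_study).
    """
    results = {}
    for train_name, (X_tr, y_tr) in studies.items():
        model = fit(X_tr, y_tr)
        for test_name, (X_te, y_te) in studies.items():
            if test_name != train_name:
                # cross-study: never score on the study the model was fit on
                results[(train_name, test_name)] = score(model, X_te, y_te)
    return results
```

The key design choice is that every (train, test) pair crosses a study boundary, which is what distinguishes this from within-study cross-validation.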

Generalized Linear Models

This is the first book on generalized linear models written by authors not mostly associated with the biological sciences, and it is thoroughly enjoyable to read.

The Nature of Statistical Learning Theory

  • V. Vapnik
  • Computer Science
    Statistics for Engineering and Information Science
  • 2000
Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing

The Elements of Statistical Learning: Data Mining, Inference, and Prediction

This book is a valuable resource, both for the statistician needing an introduction to machine learning and related fields and for the computer scientist wishing to learn more about statistics, and statisticians will especially appreciate that it is written in their own language.

The advantages of the Matthews correlation coefficient (MCC) over F1 score and accuracy in binary classification evaluation

This article shows how MCC produces a more informative and truthful score than accuracy and F1 score when evaluating binary classifications, first by explaining its mathematical properties and then by demonstrating the advantages of MCC in six synthetic use cases and in a real genomics scenario.
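The contrast that article draws is easy to reproduce from confusion-matrix counts. The sketch below computes MCC, F1, and accuracy directly from their standard definitions; on a heavily imbalanced example where a classifier predicts only the majority class, accuracy and F1 look strong while MCC correctly drops to zero.

```python
import math

def mcc(tp, fp, fn, tn):
    """Matthews correlation coefficient from binary confusion-matrix counts."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

def f1(tp, fp, fn):
    """F1 score: harmonic mean of precision and recall."""
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

def accuracy(tp, fp, fn, tn):
    return (tp + tn) / (tp + fp + fn + tn)

# 90 positives, 10 negatives; classifier predicts everything positive:
# accuracy = 0.9 and F1 ~ 0.95 look good, but MCC = 0 (no discrimination).
```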

The Nature of Statistical Learning

Support Vector Machines Applications

This book focuses on recent advances and applications of the SVM in image processing, medical practice, computer vision, pattern recognition, machine learning, applied statistics, and artificial intelligence.