Support vector learning for ordinal regression

@inproceedings{Herbrich1999SupportVL,
  title={Support vector learning for ordinal regression},
  author={Ralf Herbrich and Thore Graepel and Klaus Obermayer},
  booktitle={Ninth International Conference on Artificial Neural Networks (ICANN 99)},
  year={1999}
}
We investigate the problem of predicting variables of ordinal scale. This task is referred to as ordinal regression and is complementary to the standard machine learning tasks of classification and metric regression. In contrast to statistical models we present a distribution independent formulation of the problem together with uniform bounds of the risk functional. The approach presented is based on a mapping from objects to scalar utility values. Similar to support vector methods we derive a… 
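The utility-mapping idea sketched in the abstract — map each object to a scalar utility, then cut the utility axis into ordered ranks — can be illustrated with a toy implementation. The code below is a minimal sketch under loose assumptions, not the paper's large-margin algorithm: a perceptron-style update on pairs of differently ranked examples stands in for the support-vector formulation, and the names `train_utility`, `fit_thresholds`, and `predict` are illustrative.

```python
# Toy ordinal regression via a learned scalar utility u(x) = w . x.
# Hypothetical sketch: a perceptron-style pairwise update replaces the
# paper's large-margin rank-boundary optimization.
import random

def train_utility(X, y, epochs=50, lr=0.1, seed=0):
    """Learn w so that w . x_i > w . x_j whenever rank y_i > y_j."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    # All pairs whose ranks differ; the update pushes their utilities apart.
    pairs = [(i, j) for i in range(len(X)) for j in range(len(X)) if y[i] > y[j]]
    for _ in range(epochs):
        rng.shuffle(pairs)
        for i, j in pairs:
            diff = [a - b for a, b in zip(X[i], X[j])]
            score = sum(wk * dk for wk, dk in zip(w, diff))
            if score <= 1.0:  # pair violates the (unit) margin
                w = [wk + lr * dk for wk, dk in zip(w, diff)]
    return w

def fit_thresholds(X, y, w):
    """Place a rank boundary midway between consecutive classes on the
    utility axis. Assumes ranks are consecutive integers."""
    utils = [(sum(wk * xk for wk, xk in zip(w, x)), yi) for x, yi in zip(X, y)]
    ranks = sorted(set(y))
    thresholds = []
    for r in ranks[:-1]:
        lo = max(u for u, yi in utils if yi == r)
        hi = min(u for u, yi in utils if yi == r + 1)
        thresholds.append((lo + hi) / 2.0)
    return thresholds

def predict(x, w, thresholds, ranks):
    """Rank = number of thresholds the utility of x exceeds."""
    u = sum(wk * xk for wk, xk in zip(w, x))
    return ranks[sum(u > t for t in thresholds)]
```

On a toy one-dimensional dataset with three ordered classes, the learned utility separates the ranks and the midpoint thresholds recover the class boundaries; unlike plain multi-class classification, the prediction respects the ordering of the labels by construction.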

Prediction of Ordinal Classes Using Regression Trees

TLDR
The trade-off between optimal categorical classification accuracy (hit rate) and minimum distance-based error is studied; preliminary results indicate that this is a promising avenue towards algorithms that combine aspects of classification and regression.

Large-Scale Linear Support Vector Ordinal Regression Solver

TLDR
This work develops an efficient solver for training large-scale linear SVOR based on the alternating direction method of multipliers (ADMM); the solver enjoys advantages in both training speed and generalization performance over the SMO-based method, which validates the effectiveness and efficiency of the algorithm.

A neural network approach to ordinal regression

TLDR
An effective approach to adapting a traditional neural network to learn ordinal categories is described; it is a generalization of the perceptron method to ordinal regression and outperforms a neural network classification method.

Ordinal Regression with K-SVCR Machines

TLDR
The ordinal regression problem, or ordination, is formulated from the viewpoint of a recently defined learning architecture based on support vectors, the K-SVCR learning machine, specially developed to deal with multiple classes.

Ordinal Regression Methods: Survey and Experimental Study

TLDR
The results confirm that ordering information benefits ordinal models, improving both their accuracy and the closeness of the predictions to the actual targets on the ordinal scale.

Graph-Based Approaches for Over-Sampling in the Context of Ordinal Regression

TLDR
A specific ordinal over-sampling method is developed in this paper for the first time in order to improve the performance of machine learning classifiers; it incorporates ordinal information by approaching over-sampling from a graph-based perspective.

A Novel Large-scale Ordinal Regression Model

TLDR
This paper considers NPSVOR's linear case and designs an efficient training method based on the dual coordinate descent method (DCD); to utilize the order information among labels in prediction, a new prediction function is also proposed.

Minimum class variance support vector ordinal regression

TLDR
A novel method to handle ordinal regression problems is proposed, referred to as minimum class variance support vector ordinal regression (MCVSVOR); it explicitly takes into account the distribution of the categories and achieves better generalization performance than SVOR.

Ordinal Regression as Structured Classification

TLDR
The net effect is to transform the underlying problem from an ordinal regression task to a (structured) classification task which the authors solve with conditional random fields, thereby achieving a coherent and probabilistic model in which all model parameters are jointly learnt.

Negative Correlation Ensemble Learning for Ordinal Regression

TLDR
Two neural network threshold ensemble models are proposed for ordinal regression problems; each treats the thresholds of every ensemble member as free parameters, allowing their modification during the training process.
...

References

Showing 1-10 of 22 references

Regression Models for Ordinal Data: A Machine Learning Approach

TLDR
A distribution-independent formulation of the problem is developed; an approach based on a mapping from objects to scalar utility values, which guarantees transitivity and asymmetry, is presented; and a new learning algorithm based on large margin rank boundaries is derived for the task of ordinal regression.

Structural Risk Minimization Over Data-Dependent Hierarchies

TLDR
A result is presented that allows one to trade off errors on the training sample against improved generalization performance, together with a more general result in terms of "luckiness" functions, which provides a quite general way of exploiting serendipitous simplicity in observed data to obtain better prediction accuracy from small training sets.

Gaussian regression and optimal finite dimensional linear models

The problem of regression under Gaussian assumptions is treated generally. The relationship between Bayesian prediction, regularization and smoothing is elucidated. The ideal regression is the…

Complete ranking procedures with appropriate loss functions

This paper treats the problem of comparing different evaluations of procedures which rank the variances of k normal populations. Procedures are evaluated on the basis of appropriate loss functions…

Constructing Quadratic and Polynomial Objective Functions

A model for constructing quadratic and polynomial objective functions in n target variables from interviewing an expert is considered. The person interviewed is presented a set of incomplete…

Support Vector Machines Applied to Face Recognition

TLDR
An SVM-based face recognition algorithm is compared with a principal component analysis (PCA) based algorithm on a difficult set of images from the FERET database; it generates a similarity metric between faces that is learned from examples of differences between faces.

Statistical learning theory

TLDR
Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.

Automatic Information Organization And Retrieval


Learning with kernels


Regression and ordered categorical variables (with discussion)

  • Journal of the Royal Statistical Society – Series B
  • 1984