Reduction from Cost-Sensitive Ordinal Ranking to Weighted Binary Classification


We present a reduction framework from ordinal ranking to binary classification. The framework consists of three steps: extracting extended examples from the original examples, learning a binary classifier on the extended examples with any binary classification algorithm, and constructing a ranker from the binary classifier. Based on the framework, we show that a weighted 0/1 loss of the binary classifier upper-bounds the mislabeling cost of the ranker, both error-wise and regret-wise. Our framework allows not only the design of good ordinal ranking algorithms based on well-tuned binary classification approaches, but also the derivation of new generalization bounds for ordinal ranking from known bounds for binary classification. In addition, our framework unifies many existing ordinal ranking algorithms, such as perceptron ranking and support vector ordinal regression. When compared empirically on benchmark data sets, some of our newly designed algorithms enjoy advantages in terms of both training speed and generalization performance over existing algorithms. In addition, the newly designed algorithms lead to better cost-sensitive ordinal ranking performance, as well as improved listwise ranking performance.
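The three steps above can be sketched in code. The following is a hypothetical illustration, not the paper's exact construction: it assumes a cost vector `c` per example (where `c[k]` is the cost of predicting rank `k+1`), encodes each threshold index as an extra feature on the extended examples, and weights each extended example by the cost difference across its threshold. Function names (`extend_examples`, `rank`) and the threshold-as-feature encoding are illustrative choices.

```python
import numpy as np

def extend_examples(X, y, C):
    """Step 1: extract weighted binary (extended) examples.

    Each ordinal example (x, y) with cost vector c yields K-1 extended
    examples, one per threshold k = 1..K-1, with binary label
    z = +1 if y > k else -1 and weight |c[k] - c[k-1]| (0-indexed c).
    """
    K = C.shape[1]                        # number of ranks
    Xe, ze, we = [], [], []
    for x, yi, c in zip(X, y, C):
        for k in range(1, K):
            Xe.append(np.append(x, k))    # append threshold index as a feature
            ze.append(1 if yi > k else -1)
            we.append(abs(c[k] - c[k - 1]))
    return np.array(Xe), np.array(ze), np.array(we)

def rank(binary_predict, x, K):
    """Step 3: construct the ranker from a trained binary classifier.

    Predicted rank = 1 + number of thresholds the classifier says x passes.
    """
    return 1 + sum(binary_predict(np.append(x, k)) > 0 for k in range(1, K))
```

Step 2 is simply training any weighted binary classifier on `(Xe, ze, we)`; the weights are what make the binary 0/1 loss track the ordinal mislabeling cost.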

DOI: 10.1162/NECO_a_00265


Cite this paper

@article{Lin2012ReductionFC,
  title={Reduction from Cost-Sensitive Ordinal Ranking to Weighted Binary Classification},
  author={Hsuan-Tien Lin and Ling Li},
  journal={Neural Computation},
  year={2012},
  volume={24},
  number={5},
  pages={1329--1367}
}