Corpus ID: 1648794

Support Vector Machines and Area Under ROC curve

@inproceedings{Rakotomamonjy2004SupportVM,
  title={Support Vector Machines and Area Under ROC curve},
  author={Alain Rakotomamonjy},
  year={2004}
}
For many years now, there has been growing interest in the ROC curve for characterizing machine learning performance. This is largely because misclassification costs are not known in many real-world problems, so the ROC curve and related metrics such as the Area Under the ROC curve (AUC) can be more meaningful performance measures. In this paper, we propose an SVM-based algorithm for AUC maximization and show that under certain conditions this algorithm is related to 2-norm soft…
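
The pairwise view of AUC underlying such algorithms can be made concrete with a minimal RankSVM-style sketch: a linear SVM trained on positive-minus-negative difference vectors, so the hinge loss acts as a convex surrogate for the number of mis-ranked pairs (i.e., for 1 − AUC). This is a generic sketch, not the paper's exact algorithm; the data and hyperparameters are toy values.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X_pos = rng.normal(loc=+1.0, size=(40, 5))   # toy positive class
X_neg = rng.normal(loc=-1.0, size=(40, 5))   # toy negative class

# All positive-negative difference vectors, labelled +1, plus the mirrored
# -1 pairs so the problem is balanced and needs no intercept.
diffs = (X_pos[:, None, :] - X_neg[None, :, :]).reshape(-1, 5)
Z = np.vstack([diffs, -diffs])
y = np.hstack([np.ones(len(diffs)), -np.ones(len(diffs))])

rank_svm = LinearSVC(fit_intercept=False, C=1.0).fit(Z, y)

# Empirical AUC of the learned scorer w.x (Wilcoxon-Mann-Whitney estimate).
w = rank_svm.coef_.ravel()
auc = np.mean((X_pos @ w)[:, None] > (X_neg @ w)[None, :])
print(f"training AUC ~ {auc:.3f}")
```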


Maximizing area under ROC curve for biometric scores fusion

Between AUC based and error rate based learning

  • K. Toh
  • Computer Science
    2008 3rd IEEE Conference on Industrial Electronics and Applications
  • 2008
This paper investigates the relationship between AUC-based and error-rate-based classifiers and finds that the AUC-based classifier can be related to a total-error-rate (TER) classifier, an Equal Error Rate (EER) formulation, and a least-squares-error (LSE) estimator, each under a specific setting of the translated scaling-space framework.
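
To make the three quantities concrete, the toy sketch below computes the empirical AUC alongside the minimum total error rate (TER) and an approximate equal error rate (EER) from the same set of scores. The score distributions are invented; this is not Toh's framework itself, only the definitions it connects.

```python
import numpy as np

rng = np.random.default_rng(1)
s_pos = rng.normal(1.0, 1.0, 500)   # toy genuine/positive scores
s_neg = rng.normal(0.0, 1.0, 500)   # toy impostor/negative scores

auc = np.mean(s_pos[:, None] > s_neg[None, :])   # threshold-free WMW estimate

thresholds = np.sort(np.concatenate([s_pos, s_neg]))
far = np.array([np.mean(s_neg >= t) for t in thresholds])  # false accept rate
frr = np.array([np.mean(s_pos < t) for t in thresholds])   # false reject rate

ter = (far + frr).min()                                # best total error rate
eer = 0.5 * (far + frr)[np.argmin(np.abs(far - frr))]  # approximate EER

print(f"AUC={auc:.3f}  min TER={ter:.3f}  EER~{eer:.3f}")
```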

Diagnosis of Several Diseases by Using Combined Kernels with Support Vector Machine

This study proposes a new Support Vector Machine method for classification using combined kernel functions, a type of non-linear classifier, and shows that the new method provides a significant improvement in terms of the probability excess.
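
As a hedged sketch of the combined-kernel idea (the paper's exact combination scheme is not given here), a fixed convex combination of an RBF and a polynomial kernel can be passed to scikit-learn's SVC as a callable kernel; a convex combination of positive semidefinite kernels is itself a valid kernel.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

def combined_kernel(A, B, alpha=0.5):
    """Convex combination of two standard kernels (alpha is illustrative)."""
    return alpha * rbf_kernel(A, B) + (1 - alpha) * polynomial_kernel(A, B)

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 4))
y = (X[:, 0] * X[:, 1] > 0).astype(int)     # toy non-linearly separable labels

clf = SVC(kernel=combined_kernel).fit(X, y)
print("training accuracy:", clf.score(X, y))
```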

Improving Semantic Concept Detection Through Optimizing Ranking Function

Kernel rank is capable of training any differentiable classifier with various kernels, and the learned ranking function performs better than traditional maximum-likelihood or classification-error-minimization based algorithms in terms of AUC and average precision (AP).
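
The general trick behind differentiable ranking objectives of this kind can be sketched as follows: replace the indicator 1[s_i > s_j] inside the AUC with a sigmoid so the objective becomes smooth, then apply gradient ascent. This is a generic illustration with a toy linear scorer, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(2)
X_pos = rng.normal(+0.5, 1.0, (50, 3))
X_neg = rng.normal(-0.5, 1.0, (50, 3))
w = np.zeros(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(200):                                  # plain gradient ascent
    d = X_pos[:, None, :] - X_neg[None, :, :]         # pairwise differences
    margins = d @ w                                   # s_i - s_j per pair
    g = sigmoid(margins) * (1 - sigmoid(margins))     # derivative of sigmoid
    w += 0.1 * np.einsum("ij,ijk->k", g, d) / g.size  # ascend smoothed AUC

auc = np.mean((X_pos @ w)[:, None] > (X_neg @ w)[None, :])
print(f"smoothed-AUC training gives empirical AUC ~ {auc:.3f}")
```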

PRIE: a system for generating rulelists to maximize ROC performance

  • Tom Fawcett
  • Computer Science
    Data Mining and Knowledge Discovery
  • 2008
A method is presented for learning rules directly from ROC space when the goal is to maximize the area under the ROC curve (AUC); basic principles from rule learning and computational geometry are used to focus the search for promising rule combinations.
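
The computational-geometry ingredient here is the ROC convex hull: operating points below the hull are suboptimal for every class and cost distribution. A standard monotone-chain upper-hull computation over invented operating points might look like this.

```python
def roc_convex_hull(points):
    """Upper convex hull of (fpr, tpr) points, anchored at (0,0) and (1,1)."""
    pts = sorted(set(points) | {(0.0, 0.0), (1.0, 1.0)})
    hull = []
    for p in pts:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            cross = (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1)
            if cross > 0:      # hull[-1] lies below the chord: discard it
                hull.pop()
            else:
                break
        hull.append(p)
    return hull

ops = [(0.1, 0.3), (0.2, 0.7), (0.4, 0.75), (0.5, 0.9), (0.3, 0.4)]
print(roc_convex_hull(ops))    # -> [(0,0), (0.2,0.7), (0.5,0.9), (1,1)]
```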

Statistical methods with applications to machine learning and artificial intelligence

This thesis proposes an innovative static path-planning algorithm called m-A* within an environment full of obstacles, which has a lower worst-case order of magnitude of computational complexity and reduces the number of vertex expansions compared to the benchmark A* algorithm in the simulation study.
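
The m-A* variant itself is not specified in enough detail here to reproduce; for context, a compact version of the benchmark grid A* it is compared against (a standard implementation with a toy grid and Manhattan heuristic) is sketched below.

```python
import heapq

def a_star(grid, start, goal):
    """grid: 2D list of 0/1, 1 = obstacle. Returns path length or None."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start)]
    best_g = {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            return g
        if g > best_g.get(node, float("inf")):
            continue                          # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and not grid[nr][nc]:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

grid = [[0, 0, 0], [1, 1, 0], [0, 0, 0]]
print(a_star(grid, (0, 0), (2, 0)))   # -> 6
```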

Support Vector Machines and Affective Science

A framework for understanding the methods of SVM-based analyses is provided and the findings of seminal studies that use SVMs for classification or data reduction in the behavioral and neural study of emotion and affective disorders are summarized.

Signal Detection for QPSK Based Cognitive Radio Systems using Support Vector Machines

It is shown that combining statistical signal processing and machine learning concepts improves the spectrum sensing process, and that spectrum sensing is possible even at low Signal-to-Noise Ratio (SNR) values up to 50 dB.

Feature selection for an improved Parkinson's disease identification based on handwriting

By finding a subset of handwriting features suitable for efficiently identifying subjects with PD, it is suggested that handwriting can be a valuable marker for PD diagnosis.

References

Showing 1-10 of 40 references

Comparing naive Bayes, decision trees, and SVM with AUC and accuracy

It is proved that AUC is, in general, a better measure (defined precisely) than accuracy for evaluating performance of learning algorithms.
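
A tiny illustration of this point: two classifiers with identical accuracy at threshold 0.5 can have very different AUCs, because AUC also rewards how well examples are ranked. The scores below are hand-picked to show the gap.

```python
import numpy as np

y = np.array([1, 1, 1, 0, 0, 0])
scores_a = np.array([0.9, 0.8, 0.4, 0.6, 0.2, 0.1])    # one ranking error
scores_b = np.array([0.6, 0.55, 0.4, 0.6, 0.45, 0.1])  # many near-inversions

def acc(s):
    return np.mean((s >= 0.5) == y)

def auc(s):
    return np.mean(s[y == 1][:, None] > s[y == 0][None, :])

for name, s in [("A", scores_a), ("B", scores_b)]:
    print(name, "accuracy:", round(acc(s), 3), "AUC:", round(auc(s), 3))
# Both get accuracy 0.667, but AUC is 0.889 for A vs 0.556 for B.
```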

Learning Decision Trees Using the Area Under the ROC Curve

This paper shows how a single decision tree can represent a set of classifiers by choosing different labellings of its leaves, or equivalently, an ordering on the leaves, and proposes a novel splitting criterion which chooses the split with the highest local AUC.
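
The key observation can be sketched directly: a tree induces a ranking by ordering its leaves, e.g. by the fraction of positives in each leaf, and the tree's AUC follows from that leaf ordering. The leaf counts below are invented for illustration.

```python
# Each leaf summarized as (n_pos, n_neg) counts of training examples.
leaves = [(30, 5), (10, 10), (2, 20), (8, 40)]

# Order leaves by decreasing fraction of positives (the leaf probability
# estimate); this ordering is the ranking the tree induces.
leaves.sort(key=lambda pn: pn[0] / (pn[0] + pn[1]), reverse=True)

total_pos = sum(p for p, _ in leaves)
total_neg = sum(n for _, n in leaves)

correct = ties = 0.0
neg_below = total_neg              # negatives ranked below the current leaf
for n_pos, n_neg in leaves:
    neg_below -= n_neg
    correct += n_pos * neg_below   # positives here outrank later negatives
    ties += n_pos * n_neg          # within-leaf pairs are ties (count 1/2)

auc = (correct + 0.5 * ties) / (total_pos * total_neg)
print(f"leaf-ordering AUC = {auc:.3f}")
```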

AUC: A Better Measure than Accuracy in Comparing Learning Algorithms

It is rigorously established that, even in this setting, the area under the ROC (Receiver Operating Characteristic) curve, or simply AUC, provides a better measure than accuracy for measuring and comparing classification systems.

Optimizing F-Measure with Support Vector Machines

It is demonstrated that with the right parameter settings SVMs approximately optimize F-measure in the same way that SVMs have already been known to approximately optimize accuracy.
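
The "right parameter settings" can be illustrated on a toy imbalanced problem: sweeping the SVM's per-class weighting (here scikit-learn's class_weight, a stand-in for the parameterization the paper analyzes) trades precision against recall, and some setting approximately maximizes F1.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import f1_score

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(+1, 1, (30, 4)), rng.normal(-1, 1, (300, 4))])
y = np.array([1] * 30 + [0] * 300)      # imbalanced toy problem

for w in [1, 3, 10, 30]:                # sweep the positive-class weight
    clf = LinearSVC(class_weight={1: w}, max_iter=10_000).fit(X, y)
    print(f"pos weight {w:>2}: F1 = {f1_score(y, clf.predict(X)):.3f}")
```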

AUC Optimization vs. Error Rate Minimization

The results show that the average AUC is monotonically increasing as a function of the classification accuracy, but that the standard deviation for uneven distributions and higher error rates is noticeable, so algorithms designed to minimize the error rate may not lead to the best possible AUC values.
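
A quick Monte Carlo in the spirit of this result (sizes and error count are arbitrary): hold the number of classification errors fixed, randomize where the misranked examples land, and observe that AUC still varies substantially at constant accuracy.

```python
import numpy as np

rng = np.random.default_rng(4)
n_pos = n_neg = 20
errors = 4                          # fixed error count => fixed accuracy

aucs = []
for _ in range(2000):
    # start from a perfect ranking: positives hold the top ranks
    pos = np.arange(n_neg, n_neg + n_pos, dtype=float)
    neg = np.arange(0, n_neg, dtype=float)
    # misclassify `errors` examples by swapping random pos/neg ranks
    i = rng.choice(n_pos, errors // 2, replace=False)
    j = rng.choice(n_neg, errors // 2, replace=False)
    pos[i], neg[j] = neg[j].copy(), pos[i].copy()
    aucs.append(np.mean(pos[:, None] > neg[None, :]))

print(f"fixed accuracy; AUC mean={np.mean(aucs):.3f}, std={np.std(aucs):.3f}")
```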

Decision Tree with Better Ranking

This paper presents a novel probability estimation algorithm that improves the AUC value of decision trees by averaging probability estimates from all leaves of the tree.

Evaluation of simple performance measures for tuning SVM hyperparameters

Improving Accuracy and Cost of Two-class and Multi-class Probabilistic Classifiers Using ROC Curves

A hillclimbing approach that adjusts the weights for each class in a pre-defined order is proposed, leading to significant improvements over the naive Bayes classifier's accuracy.
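
A generic sketch of this style of search (not the paper's exact procedure; the step sizes, round count, and stand-in probabilities are illustrative): adjust per-class score multipliers one class at a time, keeping a change only if it improves accuracy.

```python
import numpy as np

def hillclimb_weights(proba, y, steps=(0.5, 0.8, 1.25, 2.0), rounds=3):
    """proba: (n, k) class-probability matrix; returns per-class multipliers."""
    w = np.ones(proba.shape[1])
    best = np.mean(np.argmax(proba * w, axis=1) == y)
    for _ in range(rounds):
        for c in range(proba.shape[1]):        # classes in a pre-defined order
            for s in steps:
                trial = w.copy()
                trial[c] *= s
                acc = np.mean(np.argmax(proba * trial, axis=1) == y)
                if acc > best:                 # keep only improving moves
                    w, best = trial, acc
    return w, best

rng = np.random.default_rng(7)
proba = rng.dirichlet(np.ones(3), size=200)    # stand-in classifier outputs
labels = rng.integers(0, 3, size=200)
w, acc = hillclimb_weights(proba, labels)
print("weights:", np.round(w, 2), "accuracy:", round(acc, 3))
```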

Optimizing Classifier Performance Via the Wilcoxon-Mann-Whitney Statistic

This work proposes an objective function that approximates the Wilcoxon-Mann-Whitney statistic, which is equivalent to AUC, applies it to real-world customer behavior prediction problems for a wireless service provider and a cable service provider, and achieves reliable and significant improvements in the ROC curve.
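
One way to make the WMW statistic trainable, in the spirit of this line of work, is to replace the pairwise step function with a smooth hinge-power penalty (margin gamma, power p) and run gradient descent; the specific hyperparameters and data below are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)
X_pos = rng.normal(+0.5, 1.0, (60, 4))
X_neg = rng.normal(-0.5, 1.0, (60, 4))
w, gamma, p, lr = np.zeros(4), 0.3, 2, 0.05

for _ in range(300):
    d = X_pos[:, None, :] - X_neg[None, :, :]   # all pos-neg feature gaps
    viol = np.minimum(d @ w - gamma, 0.0)       # pairs violating the margin
    # loss = sum (gamma - w.d)^p over violating pairs; its gradient wrt w is
    # -sum p * (gamma - w.d)^(p-1) * d, so gradient descent adds that sum back:
    g = np.einsum("ij,ijk->k", p * (-viol) ** (p - 1), d)
    w += lr * g / viol.size

auc = np.mean((X_pos @ w)[:, None] > (X_neg @ w)[None, :])
print(f"empirical AUC after surrogate training ~ {auc:.3f}")
```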

Bounds on Error Expectation for Support Vector Machines

It is proved that the value of the span is always smaller (and can be much smaller) than the diameter of the smallest sphere containing the support vectors, used in previous bounds.