Support Vector Machines for Classification in Nonstandard Situations
@article{Lin2004SupportVM,
  title={Support Vector Machines for Classification in Nonstandard Situations},
  author={Yi Lin and Yoonkyung Lee and Grace Wahba},
  journal={Machine Learning},
  year={2002},
  volume={46},
  pages={191--202}
}
Most classification algorithms are developed for the standard situation, in which the examples in the training set are assumed to come from the same distribution as the target population and the costs of misclassification into the different classes are assumed to be equal. However, these assumptions are often violated in real-world settings. For some classification methods this can be handled simply by changing the decision threshold; for others, additional effort is required…
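A minimal sketch of the nonstandard setting the abstract describes — unequal misclassification costs on an imbalanced sample — using per-class weights in scikit-learn's SVC. The dataset, weight ratio, and library are illustrative assumptions, not part of the paper:

```python
# Hypothetical sketch: cost-sensitive SVM via per-class penalty weights.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Imbalanced two-class sample: roughly 90% negatives, 10% positives.
X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=0)

# class_weight rescales the misclassification penalty C per class, so an
# error on the rare class costs 9x more than one on the common class.
clf = SVC(kernel="rbf", class_weight={0: 1.0, 1: 9.0}).fit(X, y)
print(clf.score(X, y))
```

Upweighting the rare class plays the same role as shifting the decision threshold in methods whose outputs estimate posterior probabilities.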
373 Citations
Near-Bayesian Support Vector Machines for imbalanced data classification with equal or unequal misclassification costs
- Computer Science, Neural Networks
- 2015
Statistical Properties and Adaptive Tuning of Support Vector Machines
- Computer Science, Machine Learning
- 2004
An approach to adaptively tuning the smoothing parameter(s) in the SVMs is described, based on the generalized approximate cross validation (GACV), which is an easily computable proxy of the GCKL.
On the Support Vector Machine
- Computer Science
- 2003
It is shown that the support vector machine enjoys excellent theoretical properties which explain the good performance of the SVM, and its expected misclassification rate quickly converges to that of the Bayes rule.
Error Control for Support Vector Machines
- Computer Science
- 2007
This work considers two learning frameworks for minimax classification and Neyman-Pearson classification, and shows that its approach, based on cost-sensitive support vector machines, significantly outperforms methods typically used in practice.
Support Vector Machines and the Bayes Rule in Classification
- Computer Science, Data Mining and Knowledge Discovery
- 2004
It is shown that the asymptotic targets of SVMs are classification functions directly related to the Bayes rule, which helps explain the success of SVMs in many classification studies and makes it easier to compare SVMs with traditional statistical methods.
Cost-Sensitive Universum-SVM
- Computer Science, 2012 11th International Conference on Machine Learning and Applications
- 2012
This paper extends the U-SVM to problems with different misclassification costs and presents practical conditions for the effectiveness of the cost-sensitive U-SVM.
On support vector machines under a multiple-cost scenario
- Computer Science, Adv. Data Anal. Classif.
- 2019
A novel SVM model is proposed in which misclassification costs are incorporated as performance constraints in the problem formulation; the aim is to find the maximal-margin hyperplane whose misclassification rates stay below given threshold values.
Posterior probability support vector Machines for unbalanced data
- Computer Science, IEEE Transactions on Neural Networks
- 2005
The proposed PPSVM is a natural, analytical extension of regular SVMs based on statistical learning theory, and it comes closer to the Bayes-optimal rule without requiring knowledge of the class distributions.
Development and Evaluation of Cost-Sensitive Universum-SVM
- Computer Science, IEEE Transactions on Cybernetics
- 2015
This paper extends the U-SVM formulation to problems with different misclassification costs and presents practical conditions for the effectiveness of this cost-sensitive U-SVM.
The Lq Support Vector Machine
- Computer Science
- 2009
It is shown that the new adaptive approach combines the benefits of a class of non-adaptive procedures, achieves the best performance of that class across a variety of situations, and is more robust to noise variables than the L1 and L2 penalties.
References
Showing 1-10 of 19 references
Support Vector Machines and the Bayes Rule in Classification
- Computer Science, Data Mining and Knowledge Discovery
- 2004
It is shown that the asymptotic targets of SVMs are classification functions directly related to the Bayes rule, which helps explain the success of SVMs in many classification studies and makes it easier to compare SVMs with traditional statistical methods.
Advances in Large Margin Classifiers
- Computer Science
- 2000
This book provides an overview of recent developments in large margin classifiers, examines connections with other methods, and identifies strengths and weaknesses of the method, as well as directions for future research.
A training algorithm for optimal margin classifiers
- Computer Science, COLT '92
- 1992
A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions,…
Dynamically Adapting Kernels in Support Vector Machines
- Computer Science, NIPS
- 1998
In this procedure model selection and learning are not separate, but kernels are dynamically adjusted during the learning process to find the kernel parameter which provides the best possible upper bound on the generalisation error.
Knowledge-based analysis of microarray gene expression data by using support vector machines.
- Computer Science, Proceedings of the National Academy of Sciences of the United States of America
- 2000
A method for functionally classifying genes using gene expression data from DNA microarray hybridization experiments, based on the theory of support vector machines (SVMs), is introduced and used to predict functional roles for uncharacterized yeast ORFs from their expression data.
Optimizing Classifiers for Imbalanced Training Sets
- Computer Science, NIPS
- 1998
This paper investigates the implications of these results for imbalanced datasets and develops two approaches to setting the threshold, incorporated into ThetaBoost, a boosting algorithm for dealing with unequal loss functions.
Construction and Assessment of Classification Rules
- Psychology, Technometrics
- 1999
A Unified Framework for Regularization Networks and Support Vector Machines
- Computer Science
- 1999
This work presents Regularization Networks and Support Vector Machines in a unified framework in the context of Vapnik's theory of statistical learning, which provides a general foundation for the learning problem, combining functional analysis and statistics.
Asymptotic Analysis of Penalized Likelihood and Related Estimators
- Mathematics
- 1990
A general approach to the first-order asymptotic analysis of penalized likelihood and related estimators is described. The method gives expansions for the systematic and random error. Asymptotic…
Estimation of Dependences Based on Empirical Data
- Philosophy
- 2006
Realism and Instrumentalism: Classical Statistics and VC Theory (1960-1980).- Falsifiability and Parsimony: VC Dimension and the Number of Entities (1980-2000).- Noninductive Methods of Inference:…