• Corpus ID: 52092474

# Multiclass Universum SVM

@article{Dhar2018MulticlassUS,
title={Multiclass Universum SVM},
author={Sauptik Dhar and Vladimir Cherkassky and Mohak Shah},
journal={ArXiv},
year={2018},
volume={abs/1808.08111}
}
• Published 23 August 2018
• Computer Science
• ArXiv
We introduce Universum learning for multiclass problems and propose a novel formulation for multiclass Universum SVM (MU-SVM). We also propose an analytic span bound for model selection with almost 2-4x faster computation than standard resampling techniques. We empirically demonstrate the efficacy of the proposed MU-SVM formulation on several real-world datasets, achieving > 20% improvement in test accuracies compared to multiclass SVM.
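The core idea of Universum learning, on which the abstract builds, can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' exact MU-SVM formulation: labeled samples incur a Crammer-Singer multiclass hinge loss, while Universum samples (here constructed as midpoints between samples of different classes, a common heuristic) are penalized whenever any class score strays outside an eps-insensitive zone around zero, keeping the classifier "undecided" on them.

```python
import numpy as np

def grad(W, X, y, U, eps=0.1, c_u=0.5):
    """Subgradient of a Crammer-Singer multiclass hinge loss plus an
    eps-insensitive Universum penalty that keeps Universum scores near 0.
    This is an illustrative stand-in for the MU-SVM objective, not the
    paper's exact quadratic program."""
    G = np.zeros_like(W)
    n, k = len(y), W.shape[1]
    scores = X @ W                               # (n, k) class scores
    for i in range(n):
        m = scores[i] - scores[i, y[i]] + 1.0    # margins vs. true class
        m[y[i]] = 0.0
        j = int(np.argmax(m))
        if m[j] > 0:                             # margin violated: separate
            G[:, j] += X[i]
            G[:, y[i]] -= X[i]
    su = U @ W                                   # Universum class scores
    for i in range(len(U)):
        for j in range(k):
            if abs(su[i, j]) > eps:              # score left the eps-tube
                G[:, j] += c_u * np.sign(su[i, j]) * U[i]
    return G / n

# Toy 3-class problem: well-separated Gaussian blobs.
rng = np.random.default_rng(0)
k, d, n_per = 3, 2, 30
means = np.array([[0.0, 3.0], [3.0, -2.0], [-3.0, -2.0]])
X = np.vstack([mu + 0.5 * rng.standard_normal((n_per, d)) for mu in means])
y = np.repeat(np.arange(k), n_per)
# Universum: midpoints of class-0 and class-1 samples (one common construction).
U = 0.5 * (X[rng.integers(0, n_per, 40)] + X[n_per + rng.integers(0, n_per, 40)])

W = np.zeros((d, k))
for _ in range(300):                             # subgradient descent + L2
    W -= 0.05 * (grad(W, X, y, U) + 0.01 * W)

acc = ((X @ W).argmax(axis=1) == y).mean()
```

The Universum term does not push its samples toward any class; it only discourages confident scores on them, which is the capacity-control intuition behind Universum learning.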
## 2 Citations
• Computer Science
Applied Intelligence
• 2021
This paper proposes a new method, called information entropy-based transductive support vector machine with Universum data (IEB-TUSVM), which consists of two main steps; the paper also analyzes the computational complexity of the proposed method.
• Computer Science
Advances in Intelligent Systems and Computing
• 2021
The spectrogram, the most primitive acoustic representation, is used as the input feature because it retains more raw speaker information; the local-perception and weight-sharing mechanisms of a convolutional neural network (CNN) automatically optimize the spectrogram and reduce its dimensionality, avoiding the information loss caused by hand-crafted feature computation.

## References

Showing 1-10 of 33 references

• Computer Science
ArXiv
• 2016
A span bound for MU-SVM that can be used for model selection, thereby avoiding resampling, is proposed, and empirical results demonstrate the effectiveness of MU-SVM and the proposed bound.
• Computer Science
2017 International Joint Conference on Neural Networks (IJCNN)
• 2017
This paper proposes a new Universum-SVM formulation for regression problems that incorporates a priori knowledge in the form of additional data samples that belong to the same application domain as the training samples but follow a different distribution.
• Computer Science
IEEE Transactions on Cybernetics
• 2015
This paper extends the U-SVM formulation to problems with different misclassification costs and presents practical conditions for the effectiveness of this cost-sensitive U-SVM.
• Computer Science
Object recognition supported by user interaction for service robots
• 2002
The proposed transformation is based on simplifying the original problem and employing the Kesler construction, which can be carried out using only a properly defined kernel; its performance is comparable with the one-against-all decomposition solved by the state-of-the-art sequential minimal optimization algorithm.
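The Kesler construction referenced above is a standard device for reducing a k-class linear problem to a single binary one. The sketch below shows the construction itself (names and the toy data are illustrative, not from the cited paper): each labeled sample (x, c) is expanded into k-1 vectors in R^{k*d}, and a single separator over the expanded vectors encodes all pairwise score constraints.

```python
import numpy as np

def kesler_expand(X, y, k):
    """Kesler construction: map a k-class dataset to binary form.
    Each sample (x, c) yields k-1 vectors z in R^{k*d} with +x placed in
    block c and -x in block j (j != c). A linear w in R^{k*d} satisfying
    w @ z > 0 for all z enforces w_c @ x > w_j @ x for every j != c,
    i.e. correct multiclass classification by argmax of block scores."""
    n, d = X.shape
    Z = []
    for i in range(n):
        for j in range(k):
            if j == y[i]:
                continue
            z = np.zeros(k * d)
            z[y[i] * d:(y[i] + 1) * d] = X[i]    # +x in the true-class block
            z[j * d:(j + 1) * d] = -X[i]         # -x in the competing block
            Z.append(z)
    return np.array(Z)

# Toy usage: 2 samples, 3 classes, 2 features -> 4 expanded vectors in R^6.
X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([0, 2])
Z = kesler_expand(X, y, k=3)
```

Stacking the per-class weight vectors W (shape k x d) into a single flat w makes `w @ z` exactly the score difference between the true and competing class, which is why one binary separator solves the whole multiclass problem.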
• Computer Science
IEEE Transactions on Pattern Analysis and Machine Intelligence
• 2012
UBoost is a boosting implementation of Vapnik's alternative capacity concept to the large margin approach that takes advantage of the available Universum data and controls the learned model's capacity by maximizing the number of observed contradictions.
• Computer Science
• 2014
A weighted twin support vector machine with Universum (U-WTSVM) is proposed, in which samples in different positions receive different penalties; this gives the algorithm greater flexibility and yields a more reasonable classifier in most cases.
• Computer Science
IJCAI
• 2009
This research found that not all Universum samples are helpful, and a method to pick the informative ones, i.e., in-between Universum samples, is proposed, which outperforms prior work.
• Computer Science
NIPS
• 2007
We study a pattern classification algorithm which has recently been proposed by Vapnik and coworkers. It builds on a new inductive principle which assumes that in addition to positive and negative data, a third class of data is available: the so-called Universum.
• Computer Science
SDM
• 2008
This paper proposes a graph-based method that uses Universum data to encode prior information about possible classifiers and shows that the proposed method achieves superior performance over conventional supervised and semi-supervised methods.