Corpus ID: 52092474

Multiclass Universum SVM

@article{Dhar2018MulticlassUS,
  title={Multiclass Universum SVM},
  author={Sauptik Dhar and Vladimir Cherkassky and Mohak Shah},
  journal={ArXiv},
  year={2018},
  volume={abs/1808.08111}
}
We introduce Universum learning for multiclass problems and propose a novel formulation for multiclass Universum SVM (MU-SVM). We also propose an analytic span bound for model selection with almost 2-4x faster computation times than standard resampling techniques. We empirically demonstrate the efficacy of the proposed MU-SVM formulation on several real-world datasets, achieving >20% improvement in test accuracy compared to multiclass SVM.
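For orientation, a minimal sketch of how Universum samples typically enter a multiclass SVM objective is given below. This is an illustrative form under common notation (class weight vectors w_k, trade-off parameters C and C*, insensitivity epsilon), not necessarily the exact MU-SVM formulation of the paper:

\[
\min_{w_1,\dots,w_L}\;\frac{1}{2}\sum_{k=1}^{L}\lVert w_k\rVert^{2}
\;+\; C\sum_{i=1}^{n}\max_{k\neq y_i}\big[\,1-(w_{y_i}-w_k)^{\top}x_i\,\big]_{+}
\;+\; C^{*}\sum_{j=1}^{m}\sum_{k=1}^{L}\big[\,\lvert w_k^{\top}x_j^{*}\rvert-\varepsilon\,\big]_{+},
\]

where [t]_+ = max(0, t). The first two terms are a standard (Crammer-Singer style) multiclass SVM on the labeled samples (x_i, y_i); the last term is an epsilon-insensitive penalty that keeps the Universum samples x_j^* close to the decision boundaries rather than deep inside any class.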
2 Citations

A new transductive learning method with universum data

This paper proposes a new method, the information entropy-based transductive support vector machine with Universum data (IEB-TUSVM), which consists of two main steps, and analyzes the computational complexity of the proposed method.

Research on Multiple Overlapping Speakers Number Recognition Based on X-Vector

The most primitive acoustic parameter, the spectrogram, is used as the input feature because it retains more of the original speaker information; using the local perception and weight-sharing mechanisms of a convolutional neural network (CNN), the spectrogram is automatically optimized and its dimensionality reduced, thus avoiding the loss of information caused by empirical feature calculation.

References

Showing 1-10 of 33 references

Universum Learning for Multiclass SVM

A span bound for MU-SVM that can be used for model selection, thereby avoiding resampling, is proposed, and empirical results demonstrate the effectiveness of MU-SVM and the proposed bound.

Universum learning for SVM regression

This paper proposes a new Universum-SVM formulation for regression problems that incorporates a priori knowledge in the form of additional data samples that belong to the same application domain as the training samples but follow a different distribution.

Development and Evaluation of Cost-Sensitive Universum-SVM

This paper extends the U-SVM formulation to problems with different misclassification costs and presents practical conditions for the effectiveness of this cost-sensitive U-SVM.

A nonparallel support vector machine for a classification problem with universum learning

Multi-class support vector machine

The proposed transformation is based on simplifying the original problem and employing the Kesler construction, which can be carried out using only a properly defined kernel, and its performance is comparable with the one-against-all decomposition solved by the state-of-the-art sequential minimal optimization (SMO) algorithm.
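As a rough illustration of the Kesler construction referred to above (a textbook-style sketch, not this paper's specific transformation): a K-class linear problem over x in R^d is lifted to a single binary problem in R^{Kd}. For each labeled pair (x_i, y_i) and every competing class k != y_i, one builds the vector

\[
z_{ik} = \big(0,\dots,0,\;\underbrace{x_i}_{\text{block } y_i},\;0,\dots,0,\;\underbrace{-x_i}_{\text{block } k},\;0,\dots,0\big)\in\mathbb{R}^{Kd},
\]

labeled +1. A single linear classifier v = (w_1,\dots,w_K) with v^{\top}z_{ik} > 0 for all such vectors satisfies w_{y_i}^{\top}x_i > w_k^{\top}x_i for every k != y_i, i.e. it reproduces the multiclass rule \hat{y} = \arg\max_k w_k^{\top}x, so a binary SVM (with a suitably defined kernel) can be trained in the lifted space.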

UBoost: Boosting with the Universum

UBoost is a boosting implementation of Vapnik's alternative capacity concept to the large margin approach that takes advantage of the available Universum data and controls the learned model's capacity by maximizing the number of observed contradictions.

Weighted Twin Support Vector Machine with Universum

A weighted Twin Support Vector Machine with Universum (called U-WTSVM) is proposed, in which samples in different positions are given different penalties; this improves the flexibility of the algorithm and yields a more reasonable classifier in most cases.

Selecting Informative Universum Sample for Semi-Supervised Learning

This research found that not all Universum samples are helpful and proposes a method to pick the informative ones, i.e., the in-between Universum samples, which outperforms previous works.
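A minimal code sketch of the general "in-between Universum" idea, assuming candidates are screened by how close they fall to the decision boundary of a classifier fit on the labeled data; the selection criterion and the margin threshold below are illustrative assumptions, not the paper's exact procedure.

import numpy as np
from sklearn.svm import SVC

def select_in_between_universum(X_labeled, y_labeled, X_universum, margin=0.5):
    """Keep Universum candidates whose decision values fall inside a narrow
    band around the separating boundary of a classifier fit on labeled data.
    The band width `margin` is a hypothetical tuning parameter."""
    clf = SVC(kernel="linear").fit(X_labeled, y_labeled)
    scores = clf.decision_function(X_universum)   # signed margin-like scores
    mask = np.abs(scores) <= margin               # "in-between" = near the boundary
    return X_universum[mask]

# Example with synthetic data: two Gaussian classes and random Universum candidates.
rng = np.random.default_rng(0)
X_pos = rng.normal(+2.0, 1.0, size=(50, 2))
X_neg = rng.normal(-2.0, 1.0, size=(50, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 50 + [-1] * 50)
U = rng.uniform(-4.0, 4.0, size=(200, 2))
print(select_in_between_universum(X, y, U, margin=0.5).shape)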

An Analysis of Inference with the Universum

We study a pattern classification algorithm which has recently been proposed by Vapnik and coworkers. It builds on a new inductive principle which assumes that, in addition to positive and negative training examples, a set of Universum samples belonging to neither class is available.
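For context, the binary Universum-SVM analyzed in this line of work can be written (up to notational choices, which are assumptions here) as

\[
\min_{w,b}\;\frac{1}{2}\lVert w\rVert^{2}
+ C\sum_{i=1}^{n}\big[\,1-y_i(w^{\top}x_i+b)\,\big]_{+}
+ C^{*}\sum_{j=1}^{m}\big[\,\lvert w^{\top}x_j^{*}+b\rvert-\varepsilon\,\big]_{+},
\]

where [t]_+ = max(0, t): the usual hinge loss on labeled samples plus an epsilon-insensitive loss that pulls each Universum sample x_j^* toward the decision boundary, with C^* controlling how strongly the Universum is believed to belong to neither class.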

Semi-Supervised Classification with Universum

This paper proposes a graph-based method that uses the Universum data to help encode prior information for possible classifiers and shows that the proposed method obtains superior performance over conventional supervised and semi-supervised methods.