# Large Scale Transductive SVMs

@article{Collobert2006LargeST, title={Large Scale Transductive SVMs}, author={Ronan Collobert and Fabian H. Sinz and J. Weston and L. Bottou}, journal={J. Mach. Learn. Res.}, year={2006}, volume={7}, pages={1687-1712} }

We show how the concave-convex procedure can be applied to transductive SVMs, which traditionally require solving a combinatorial search problem. This provides for the first time a highly scalable algorithm in the nonlinear case. Detailed experiments verify the utility of our approach. Software is available at http://www.kyb.tuebingen.mpg.de/bs/people/fabee/transduction.html .
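The abstract's key idea, applying the concave-convex procedure (CCCP) to the ramp-loss SVM objective, can be illustrated with a small sketch. This is a minimal one-dimensional toy with made-up data, hyperparameters, and ramp parameter `s`, not the paper's implementation (which handles unlabeled points and kernels):

```python
# Minimal CCCP sketch for a ramp-loss SVM in one dimension.
# The ramp loss R_s(z) = H_1(z) - H_s(z) is a difference of two convex
# hinges, so the objective splits into a convex part and a concave part;
# CCCP repeatedly linearises the concave part and solves the resulting
# convex subproblem (here by plain subgradient descent).
# All data and hyperparameters below are illustrative, not from the paper.

def hinge(z, a):
    """H_a(z) = max(0, a - z)."""
    return max(0.0, a - z)

def ramp(z, s=-1.0):
    """Ramp loss R_s(z) = H_1(z) - H_s(z), bounded above by 1 - s."""
    return hinge(z, 1.0) - hinge(z, s)

def objective(w, xs, ys, lam=0.1, s=-1.0):
    """Regularised ramp-loss objective for a linear model f(x) = w * x."""
    return lam * w * w + sum(ramp(y * w * x, s) for x, y in zip(xs, ys))

def cccp_ramp_svm(xs, ys, lam=0.1, s=-1.0, outer=5, inner=200, lr=0.01):
    w = 0.0
    for _ in range(outer):
        # Freeze the gradient of the concave part -H_s at the current w:
        # d(-H_s)/dz = +1 when z < s, else 0.
        betas = [1.0 if y * w * x < s else 0.0 for x, y in zip(xs, ys)]
        for _ in range(inner):
            g = 2.0 * lam * w                    # regulariser
            for x, y, b in zip(xs, ys, betas):
                if y * w * x < 1.0:              # subgradient of H_1
                    g -= y * x
                g += b * y * x                   # linearised concave term
            w -= lr * g
    return w
```

Each outer iteration solves a convex hinge-type problem in which points flagged by `betas` no longer act on the solution as support vectors; CCCP guarantees the objective does not increase across outer iterations.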

#### 459 Citations

Trading Convexity for Scalability

- 2008

Convex learning algorithms, such as Support Vector Machines (SVMs), are often seen as highly desirable because they offer strong practical properties and are amenable to theoretical analysis. …

Trading convexity for scalability

- Computer Science, Mathematics
- ICML
- 2006

It is shown how concave-convex programming can be applied to produce faster SVMs where training errors are no longer support vectors, and much faster Transductive SVMs.

Large margin vs. large volume in transductive learning

- Mathematics, Computer Science
- Machine Learning
- 2008

We consider a large volume principle for transductive learning that prioritizes the transductive equivalence classes according to the volume they occupy in hypothesis space. We approximate volume …

Robust transductive support vector machines

- Mathematics, Computer Science
- 2016 24th Signal Processing and Communication Application Conference (SIU)
- 2016

A robust transductive support vector machine (RTSVM) classifier suitable for large-scale data is proposed; it uses the robust Ramp loss instead of the Hinge loss for labeled data samples.

Efficient Convex Relaxation for Transductive Support Vector Machine

- Computer Science, Mathematics
- NIPS
- 2007

This work proposes solving the Transductive SVM via a convex relaxation that converts the NP-hard problem into a semi-definite program, and shows the promising performance of the proposed algorithm in comparison with other state-of-the-art implementations of Transductive SVMs.

Large-scale robust transductive support vector machines

- Mathematics, Computer Science
- Neurocomputing
- 2017

A robust and fast transductive support vector machine (RTSVM) classifier that can be applied to large-scale data is proposed; it uses the robust Ramp loss instead of the Hinge loss for labeled data samples.
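The Ramp loss that these robust TSVM variants substitute for the Hinge loss can be stated in a few lines; a minimal sketch, where the value s = -1 is an assumed illustrative choice:

```python
# The Ramp loss clips the Hinge loss at 1 - s, so examples with margin
# below s (typically outliers or mislabeled points) contribute a constant
# loss and stop becoming support vectors. s < 0 is an illustrative choice.

def hinge_loss(z):
    """Hinge loss H(z) = max(0, 1 - z) on the margin z = y * f(x)."""
    return max(0.0, 1.0 - z)

def ramp_loss(z, s=-1.0):
    """Ramp loss R_s(z) = min(1 - s, H(z)): equal to the hinge for
    z >= s, flat (bounded) below s."""
    return min(1.0 - s, hinge_loss(z))
```

Equivalently, R_s(z) = H_1(z) - H_s(z), a difference of two convex hinges, which is the decomposition that concave-convex (CCCP/DC) methods exploit.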

Large scale manifold transduction

- Mathematics, Computer Science
- ICML '08
- 2008

This work shows how the regularizer of Transductive Support Vector Machines can be trained by stochastic gradient descent for linear models and multi-layer architectures, and proposes a natural generalization of the TSVM loss function that takes into account neighborhood and manifold information directly.
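A rough sketch of what stochastic-gradient training of a TSVM-style objective can look like for a linear model, assuming hinge loss on labeled points and a symmetric hinge max(0, 1 - |f(x)|) on unlabeled points; the data, learning rate, and epoch count are toy values, not taken from the cited paper:

```python
# Toy SGD on a TSVM-style loss for a 1-D linear model f(x) = w * x + b:
# hinge loss on labeled points, plus a symmetric hinge max(0, 1 - |f(x)|)
# on unlabeled points that pushes the decision boundary into low-density
# regions away from the unlabeled data. All values are illustrative.
import random

def sgd_tsvm(labeled, unlabeled, lam=0.01, lr=0.05, epochs=500, seed=0):
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        # Labeled example: subgradient of max(0, 1 - y * f(x)).
        x, y = rng.choice(labeled)
        f = w * x + b
        gw, gb = 2.0 * lam * w, 0.0
        if y * f < 1.0:
            gw -= y * x
            gb -= y
        # Unlabeled example: subgradient of max(0, 1 - |f(x)|).
        u = rng.choice(unlabeled)
        fu = w * u + b
        if abs(fu) < 1.0:
            sign = 1.0 if fu >= 0 else -1.0
            gw -= sign * u
            gb -= sign
        w -= lr * gw
        b -= lr * gb
    return w, b
```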

Newton Methods for Fast Solution of Semi-supervised Linear SVMs

- 2006

In this chapter, we present a family of semi-supervised linear support vector classifiers that are designed to handle partially-labeled sparse datasets with a possibly very large number of examples and …

Large scale semi-supervised linear SVMs

- Computer Science
- SIGIR
- 2006

An implementation of the Transductive SVM (TSVM) that is significantly more efficient and scalable than currently used dual techniques for linear classification problems involving large, sparse datasets is presented, along with a variant of TSVM that involves multiple switching of labels.

A Multi-kernel Framework for Inductive Semi-supervised Learning

- Computer Science
- ESANN
- 2011

The multiple-kernel version of the Transductive SVM (a cluster-assumption-based approach) is proposed and solved via DC (Difference of Convex functions) programming.

#### References

Showing 1-10 of 47 references.

Trading convexity for scalability

- Computer Science, Mathematics
- ICML
- 2006

It is shown how concave-convex programming can be applied to produce faster SVMs where training errors are no longer support vectors, and much faster Transductive SVMs.

Learning with Local and Global Consistency

- Computer Science, Mathematics
- NIPS
- 2003

A principled approach to semi-supervised learning is to design a classifying function which is sufficiently smooth with respect to the intrinsic structure collectively revealed by known labeled and unlabeled points.

Leveraging the margin more carefully

- Computer Science
- ICML
- 2004

Two leveraging algorithms that build on boosting techniques and employ a bounded loss function of the margin are described; each decomposes a non-convex loss into a difference of two convex losses.

Semi-Supervised Classification by Low Density Separation

- Computer Science
- AISTATS
- 2005

Three semi-supervised algorithms are proposed: deriving graph-based distances that emphasize low density regions between clusters, followed by training a standard SVM, and optimizing the Transductive SVM objective function by gradient descent.

Convex Methods for Transduction

- Computer Science, Mathematics
- NIPS
- 2003

This paper presents a relaxation of the 2-class transduction problem based on semi-definite programming (SDP), resulting in a convex optimization problem that has polynomial complexity in the size of the data set.

Making large-scale support vector machine learning practical

- Computer Science
- 1998

This chapter presents algorithmic and computational results developed for SVMlight V2.0, which make large-scale SVM training more practical and give guidelines for the application of SVMs to large domains.

Maximum Margin Clustering

- Computer Science, Mathematics
- NIPS
- 2004

A new method for clustering based on finding maximum margin hyperplanes through data is proposed; it leads naturally to a semi-supervised training method for support vector machines by maximizing the margin simultaneously on labeled and unlabeled training data.

Fast Kernel Classifiers with Online and Active Learning

- Computer Science
- J. Mach. Learn. Res.
- 2005

This contribution presents an online SVM algorithm based on the premise that active example selection can yield faster training, higher accuracies, and simpler models, using only a fraction of the training example labels.

A Modified Finite Newton Method for Fast Solution of Large Scale Linear SVMs

- Computer Science, Mathematics
- J. Mach. Learn. Res.
- 2005

A fast method for solving linear SVMs with the L2 loss function, suited to large-scale data mining tasks such as text classification, is developed by modifying the finite Newton method of Mangasarian in several ways.
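The finite Newton idea can be illustrated in one dimension: with the squared (L2) hinge loss the objective is piecewise quadratic, so each Newton step restricted to the current active set is exact and the iteration terminates finitely. This is a toy sketch with made-up data, not the cited paper's implementation:

```python
# One-dimensional sketch of a finite Newton method for an L2-loss SVM:
# minimise lam * w^2 + sum(max(0, 1 - y * w * x)^2). The loss is
# piecewise quadratic, so Newton steps over the current active set
# (margin violators) jump to the exact minimum of each quadratic piece.
# Data and lam are illustrative values, not from the paper.

def finite_newton_l2svm(xs, ys, lam=0.1, iters=20):
    w = 0.0
    for _ in range(iters):
        # Active set: examples currently violating the margin.
        active = [(x, y) for x, y in zip(xs, ys) if y * w * x < 1.0]
        g = 2.0 * lam * w - 2.0 * sum(y * x * (1.0 - y * w * x)
                                      for x, y in active)
        h = 2.0 * lam + 2.0 * sum(x * x for x, _ in active)
        w -= g / h   # exact Newton step on the current quadratic piece
    return w
```

Once the active set stops changing, the Newton step lands on the minimiser of that quadratic piece, which is what makes the method "finite".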

On ψ-Learning

- Mathematics
- 2003

The concept of large margins has been recognized as an important principle in analyzing learning methodologies, including boosting, neural networks, and support vector machines (SVMs). However, this …