Normalization in Support Vector Machines

  • Arnulf B. A. Graf, Silvio Borer
This article deals with various aspects of normalization in the context of Support Vector Machines. We first consider normalization of the vectors in the input space and point out its inherent limitations. A natural extension to the feature space is then presented by the kernel function normalization. A correction of the position of the Optimal Separating Hyperplane is subsequently introduced so as to better suit these normalized kernels. Numerical experiments finally evaluate the different… 
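The kernel normalization the abstract refers to is commonly written as k'(x, y) = k(x, y) / sqrt(k(x, x) k(y, y)), which maps every point onto the unit sphere in feature space. A minimal sketch (all kernel choices and values here are illustrative, not taken from the paper):

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian RBF kernel (already normalized: k(x, x) = 1)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def poly_kernel(x, y, degree=2):
    """Polynomial kernel (not normalized in general)."""
    return (1.0 + np.dot(x, y)) ** degree

def normalize_kernel(k, x, y):
    """Cosine-normalize a kernel: k'(x, y) = k(x, y) / sqrt(k(x, x) k(y, y))."""
    return k(x, y) / np.sqrt(k(x, x) * k(y, y))

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

# The RBF kernel is already normalized on the diagonal ...
assert np.isclose(rbf_kernel(x, x), 1.0)
# ... and after cosine normalization any kernel satisfies k'(x, x) = 1,
# i.e. every mapped point lies on the unit sphere in feature space.
assert np.isclose(normalize_kernel(poly_kernel, x, x), 1.0)
```

Note that for the RBF kernel the normalization is a no-op, which is one reason normalization questions are mostly interesting for polynomial and other unnormalized kernels.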

Classification in a normalized feature space using support vector machines

This paper discusses classification with support vector machines in a normalized feature space, exploiting the fact that in this setting all points lie on the surface of a unit hypersphere. The optimal separating hyperplane is replaced by one that is symmetric in its angles, leading to an improved estimator.
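The geometric setting described above can be made concrete: after L2-normalization every vector lies on the unit hypersphere, and inner products reduce to cosines of angles. A small numerical sketch (the data here is synthetic, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))        # toy input vectors (hypothetical data)

# L2-normalize each row: every point now lies on the unit hypersphere.
X_unit = X / np.linalg.norm(X, axis=1, keepdims=True)
assert np.allclose(np.linalg.norm(X_unit, axis=1), 1.0)

# For unit vectors the inner product equals the cosine of the angle between
# them, so a linear decision function depends only on angles, not magnitudes.
cos_angle = X_unit[0] @ X_unit[1]
assert -1.0 <= cos_angle <= 1.0
```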

Improved Support Vector Machine Generalization Using Normalized Input Space

This research examines the effect of normalization across 112 classification problems using SVMs with the RBF kernel, and observes a significant improvement in classification due to normalization.
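Input-space normalization of the kind studied here is typically z-score standardization fitted on the training set only. A minimal sketch of that preprocessing step, using synthetic data (the function name and data are illustrative, not from the paper):

```python
import numpy as np

def standardize(X_train, X_test):
    """Z-score normalize features using statistics of the training set only,
    a common preprocessing step before training an RBF-kernel SVM."""
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0)
    sigma[sigma == 0] = 1.0            # guard against constant features
    return (X_train - mu) / sigma, (X_test - mu) / sigma

rng = np.random.default_rng(1)
X_train = rng.normal(loc=5.0, scale=3.0, size=(100, 4))
X_test = rng.normal(loc=5.0, scale=3.0, size=(10, 4))
X_train_n, X_test_n = standardize(X_train, X_test)

# Training features now have zero mean and unit variance, which keeps the
# RBF kernel's squared-distance computations well scaled across features.
assert np.allclose(X_train_n.mean(axis=0), 0.0, atol=1e-8)
assert np.allclose(X_train_n.std(axis=0), 1.0, atol=1e-8)
```

Fitting the statistics on the training set alone avoids leaking test-set information into the model.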

Support vector machines: A distance-based approach to multi-class classification

  • Wissam Aoudi, A. Barbar
  • Computer Science
    2016 IEEE International Multidisciplinary Conference on Engineering Technology (IMCET)
  • 2016
This paper proposes a new approach to constructing the multi-class classification function, where the structure and properties of the support vectors are exploited without altering the training procedure, and argues that the proposed distance-based method represents a more rigorous and intuitive measure than the traditional hyperplane-based method.

A generalized-space expansion of Support Vector Machines for diagnostic systems

  • I. Dimou, M. Zervakis
  • Computer Science
    Proceedings of the 10th IEEE International Conference on Information Technology and Applications in Biomedicine
  • 2010
This work applies a series of composite kernel extensions stemming from nonlinear second-level kernels to standard diagnostic problems to create a formulation that can accept arbitrary non-positive definite feature kernels.

Sentiment Classification with Support Vector Machines and Multiple Kernel Functions

This paper analyzes and compares various non-negative linear combinations of kernels applied to product reviews to determine whether a review is positive or negative, and shows that the combination kernels outperform the single kernels.

Properties of the sample estimators used for statistical normalization of feature vectors

A mathematical justification of the statistical normalization procedure is given here and the sample estimators proposed for normalization of attributes of feature vectors are proven to have desirable properties, namely they are consistent and unbiased.
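The unbiasedness and consistency claimed for the sample estimators can be checked empirically: the sample mean, and the sample variance with Bessel's correction (ddof=1), converge to the true parameters as the sample grows. A quick numerical sanity check (the distribution parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
true_mean, true_std = 2.0, 3.0

# Draw a large sample; consistency means the estimates approach the
# true parameters as n grows, and ddof=1 makes the variance unbiased.
n = 200_000
sample = rng.normal(true_mean, true_std, size=n)
m_hat = sample.mean()
s2_hat = sample.var(ddof=1)          # Bessel-corrected sample variance

assert abs(m_hat - true_mean) < 0.05
assert abs(s2_hat - true_std ** 2) < 0.2
```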

Support Vector Machine Approach and Petroleum Engineering

A brief description of the Support Vector Machine method is first given, followed by a short discussion of some important applications in petroleum engineering.

Simple Techniques Make Sense: Feature Pooling and Normalization for Image Classification

This paper discusses the feature summarization stage of the BoF model, including pooling and normalization, and presents two algorithms: generalized regular spatial pooling, for constructing a better group of spatial bins, and hierarchical feature normalization, for assigning proper weights to regional features.
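The pooling-then-normalization pipeline in BoF-style image classification can be sketched generically: max-pool local feature codes within spatial bins, concatenate, then L2-normalize the descriptor. This is a simplified illustration of the general pattern, not the paper's specific algorithms (the grid sizes and data are invented):

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical local feature codes: a 4x4 spatial grid of 8-dim activations.
codes = rng.random(size=(4, 4, 8))

# Max-pool within each 2x2 spatial bin (a tiny spatial-pyramid level) ...
bins = [codes[i:i+2, j:j+2].max(axis=(0, 1)) for i in (0, 2) for j in (0, 2)]
pooled = np.concatenate(bins)        # 4 bins x 8 dims = 32-dim descriptor

# ... then L2-normalize the pooled descriptor so images are comparable.
pooled /= np.linalg.norm(pooled)
assert np.isclose(np.linalg.norm(pooled), 1.0)
assert pooled.shape == (32,)
```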

Estimating the Support of a High-Dimensional Distribution

The algorithm is a natural extension of the support vector algorithm to the case of unlabeled data, carrying out sequential optimization over pairs of input patterns; a theoretical analysis of the statistical performance of the algorithm is also provided.

A PAC-Bayesian margin bound for linear classifiers

A bound on the generalisation error of linear classifiers in terms of a refined margin quantity on the training set is presented in a PAC-Bayesian framework and is based on geometrical arguments in the space of linear classifiers.

On the Computational Power of Winner-Take-All

  • W. Maass
  • Computer Science
    Neural Computation
  • 2000
The theoretical analysis shows that winner-take-all is a surprisingly powerful computational module in comparison with threshold gates (also referred to as McCulloch-Pitts neurons) and sigmoidal gates, and proves an optimal quadratic lower bound for computing winner-take-all in any feedforward circuit consisting of threshold gates.

Advances in Kernel Methods

  • Biology
  • 1998

Support vector learning

This book provides a comprehensive analysis of what can be done using Support Vector Machines, achieving record results in real-life pattern recognition problems, and proposes a new form of nonlinear Principal Component Analysis using Support Vector kernel techniques, which is considered the most natural and elegant way to generalize classical Principal Component Analysis.

Support Vector Machines for 3D Object Recognition

The proposed system does not require feature extraction and performs recognition on images regarded as points in a high-dimensional space without estimating pose, indicating that SVMs are well-suited for aspect-based recognition.

Neural Networks: a Comprehensive Approach

  • 1999

Support Vector Learning. R. Oldenbourg Verlag

  • 1997