Genetic Programming for Kernel-Based Learning with Co-evolving Subsets Selection

Christian Gagné, Marc Schoenauer, Michèle Sebag, and Marco Tomassini
Support Vector Machines (SVMs) are well-established Machine Learning (ML) algorithms. They rely on the fact that i) linear learning can be formalized as a well-posed optimization problem; ii) non-linear learning can be reduced to linear learning thanks to the kernel trick, which maps the initial search space onto a high-dimensional feature space. The kernel is designed by the ML expert, and it governs the efficiency of the SVM approach. In this paper, a new approach for the automatic…
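The kernel trick mentioned above can be made concrete with a small sketch (pure Python, illustrative names): a degree-2 polynomial kernel evaluated in the input space agrees with an explicit inner product in the corresponding feature space, so a learner never needs to construct the feature map directly.

```python
import math

def poly_kernel(x, y):
    """Degree-2 polynomial kernel: k(x, y) = (x . y + 1)^2."""
    return (sum(a * b for a, b in zip(x, y)) + 1.0) ** 2

def phi(x):
    """Explicit feature map for the 2-D degree-2 polynomial kernel."""
    x1, x2 = x
    r2 = math.sqrt(2.0)
    return (x1 * x1, x2 * x2, r2 * x1 * x2, r2 * x1, r2 * x2, 1.0)

x, y = (1.0, 2.0), (3.0, 0.5)
lhs = poly_kernel(x, y)                            # kernel in input space
rhs = sum(a * b for a, b in zip(phi(x), phi(y)))   # inner product in feature space
assert abs(lhs - rhs) < 1e-9
```

The same identity is what lets an SVM operate implicitly in a very high-dimensional (for some kernels, infinite-dimensional) feature space while only ever evaluating k on input-space points.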

Tuning and evolution of support vector kernels

Two solutions for optimizing kernel functions are presented: automated hyperparameter tuning of kernel functions combined with an optimization of pre- and post-processing options by Sequential Parameter Optimization (SPO) and evolving new kernel functions by Genetic Programming (GP).

Learning SVM with Complex Multiple Kernels Evolved by Genetic Programming

The numerical experiments show that SVMs involving the evolutionary complex multiple kernels perform better than the classic simple kernels, and that on the considered data sets the new multiple kernels outperform both the cLMK and eLMK linear multiple kernels.
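The linear multiple kernels used as a baseline here are convex combinations of base kernels; a minimal sketch (pure Python, hypothetical function names, not the paper's implementation):

```python
import math

def rbf_kernel(gamma):
    def k(x, y):
        return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))
    return k

def poly_kernel(degree):
    def k(x, y):
        return (sum(a * b for a, b in zip(x, y)) + 1.0) ** degree
    return k

def linear_multiple_kernel(bases, weights):
    """Weighted sum sum_i w_i * k_i(x, y); this remains a valid
    (positive semi-definite) kernel as long as every w_i >= 0."""
    def k(x, y):
        return sum(w * base(x, y) for w, base in zip(weights, bases))
    return k

# Equal-weight mix of an RBF kernel and a quadratic polynomial kernel.
mk = linear_multiple_kernel([rbf_kernel(0.5), poly_kernel(2)], [0.5, 0.5])
```

The evolutionary "complex" multiple kernels of the paper go beyond this by letting GP combine kernels non-linearly rather than only through a weighted sum.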

Kernel evolution for support vector classification

A novel approach is proposed to use genetic programming (GP) to design domain-specific and optimal kernel functions for support vector classification (SVC) which automatically adjusts the parameters.

Improving classification performance of Support Vector Machine by genetically optimising kernel shape and hyper-parameters

Numerical experiments show that the SVM algorithm involving the authors' evolutionary kernel of kernels (eKoK) performs better than well-known classic kernels whose parameters were optimised, as well as a state-of-the-art convex linear kernel combination and an evolutionary linear kernel combination.

Evolutionary Optimization of Least-Squares Support Vector Machines

Empirical studies show that this model indeed increases the generalization performance of the machine, although this improvement comes at a high computational cost, which suggests that the approach may be justified primarily in applications where prediction errors can have severe consequences, such as in medical settings.

Evolutionary combination of kernels for nonlinear feature transformation

Evolving kernels for support vector machine classification

A new algorithm, called KGP, is introduced, which finds near-optimal kernels using strongly typed genetic programming and principled kernel closure properties, and shows wide applicability.
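The closure properties referred to here guarantee that a GP tree built from valid base kernels and closure-preserving operators always evaluates to a valid kernel. A toy sketch of such a kernel-expression tree (pure Python, illustrative structure, not the actual KGP representation):

```python
import math

# Base kernels (leaves of the GP tree).
def rbf(x, y, gamma=0.5):
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def lin(x, y):
    return sum(a * b for a, b in zip(x, y))

def eval_kernel_tree(tree, x, y):
    """Evaluate a kernel expression given as nested tuples. Each internal
    node is a closure-preserving operator, so the result is still a kernel."""
    op = tree[0]
    if op == "leaf":
        return tree[1](x, y)
    if op == "+":        # the sum of two kernels is a kernel
        return eval_kernel_tree(tree[1], x, y) + eval_kernel_tree(tree[2], x, y)
    if op == "*":        # the product of two kernels is a kernel
        return eval_kernel_tree(tree[1], x, y) * eval_kernel_tree(tree[2], x, y)
    if op == "scale":    # scaling by a positive constant is a kernel
        return tree[1] * eval_kernel_tree(tree[2], x, y)
    raise ValueError("unknown operator: %r" % op)

# Example individual: 2 * rbf(x, y) + lin(x, y)^2
tree = ("+", ("scale", 2.0, ("leaf", rbf)), ("*", ("leaf", lin), ("leaf", lin)))
```

Strongly typed GP then restricts crossover and mutation so that only such well-typed trees are ever generated.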

In-depth analysis of SVM kernel learning and its components

This paper identifies all the factors that affect the final performance of support vector machines in relation to the elicitation of kernels, and studies the influence each component has on the final classification performance, providing recommendations and insights into the kernel setting for Support Vector Machines.

GEEK: Grammatical Evolution for Automatically Evolving Kernel Functions

GEEK is proposed, a Grammatical Evolution approach for automatically Evolving Kernel functions that uses a grammar composed of simple mathematical operations extracted from known kernels and is also able to optimize some of their parameters.

The Genetic Kernel Support Vector Machine: Description and Evaluation

This paper proposes a classification technique, called the Genetic Kernel SVM (GK SVM), that uses Genetic Programming to evolve a kernel for an SVM classifier.

Evolutionary tuning of multiple SVM parameters

An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.

Experiments with a New Boosting Algorithm

This paper describes experiments carried out to assess how well AdaBoost, with and without pseudo-loss, performs on real learning problems, and compares boosting to Breiman's "bagging" method when used to aggregate various classifiers.
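The boosting loop at the heart of AdaBoost can be sketched in a few lines with one-dimensional decision stumps as weak learners (pure Python, a simplified illustration rather than the paper's experimental setup):

```python
import math

def stump_predict(threshold, polarity, x):
    return polarity if x >= threshold else -polarity

def fit_stump(xs, ys, w):
    """Pick the threshold/polarity pair minimizing weighted error."""
    best = None
    for t in xs:
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                      if stump_predict(t, pol, xi) != yi)
            if best is None or err < best[0]:
                best = (err, t, pol)
    return best

def adaboost(xs, ys, rounds=5):
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, t, pol = fit_stump(xs, ys, w)
        err = max(err, 1e-10)                      # avoid log(1/0)
        alpha = 0.5 * math.log((1.0 - err) / err)  # weak learner's vote
        ensemble.append((alpha, t, pol))
        # Re-weight: misclassified patterns gain weight for the next round.
        w = [wi * math.exp(-alpha * yi * stump_predict(t, pol, xi))
             for xi, yi, wi in zip(xs, ys, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(alpha * stump_predict(t, pol, x) for alpha, t, pol in ensemble)
    return 1 if score >= 0 else -1

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [-1, -1, -1, 1, 1, 1]
ens = adaboost(xs, ys, rounds=3)
```

Bagging, by contrast, resamples the training set uniformly and aggregates by unweighted vote, which is what the paper's comparison turns on.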

Margin based feature selection - theory and algorithms

This paper introduces a margin-based feature selection criterion, applies it to measure the quality of sets of features, devises novel selection algorithms for multi-class classification problems, and provides a theoretical generalization bound.
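A Relief-style sample margin, in the spirit of this line of work, can be sketched as follows (pure Python, simplified; the paper's actual algorithms are more elaborate than this illustration):

```python
def relief_weights(X, y):
    """Relief-style scores: a feature earns weight when it separates a
    sample from its nearest miss (other class) more than from its
    nearest hit (same class)."""
    n_feat = len(X[0])
    w = [0.0] * n_feat
    for i, (xi, yi) in enumerate(zip(X, y)):
        def nearest(same_class):
            cands = [x for j, (x, yj) in enumerate(zip(X, y))
                     if j != i and (yj == yi) == same_class]
            return min(cands,
                       key=lambda x: sum((a - b) ** 2 for a, b in zip(xi, x)))
        hit, miss = nearest(True), nearest(False)
        for f in range(n_feat):
            w[f] += abs(xi[f] - miss[f]) - abs(xi[f] - hit[f])
    return w

# Feature 0 separates the classes; feature 1 is constant noise.
X = [(0.0, 5.0), (0.1, 5.0), (1.0, 5.0), (0.9, 5.0)]
y = [-1, -1, 1, 1]
scores = relief_weights(X, y)
```

Features with large positive scores contribute to the classification margin; irrelevant ones hover near zero and can be dropped.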

Kernel Nearest-Neighbor Algorithm

Experiments show that the kernel nearest-neighbor algorithm is more powerful than the conventional nearest-neighbor algorithm, and that it can compete with SVM.
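The trick behind kernel nearest-neighbor is that distances between feature-space images can be computed from kernel evaluations alone, via ||φ(x) − φ(z)||² = k(x,x) − 2k(x,z) + k(z,z). A minimal sketch (pure Python, illustrative names):

```python
def poly2(x, y):
    """Degree-2 polynomial kernel (a non-linear choice, so the induced
    feature-space metric differs from plain Euclidean distance)."""
    return (sum(a * b for a, b in zip(x, y)) + 1.0) ** 2

def kernel_dist2(k, x, z):
    """Squared distance between feature-space images:
    ||phi(x) - phi(z)||^2 = k(x, x) - 2*k(x, z) + k(z, z)."""
    return k(x, x) - 2.0 * k(x, z) + k(z, z)

def kernel_1nn(k, train, query):
    """1-NN using the feature-space distance; train is a list of (x, label)."""
    return min(train, key=lambda item: kernel_dist2(k, item[0], query))[1]

train = [((0.0, 0.0), "a"), ((2.0, 2.0), "b")]
label = kernel_1nn(poly2, train, (0.5, 0.5))
```

Note that for an RBF kernel, where k(x,x) is constant, this metric is monotone in the ordinary Euclidean distance, so the gain over plain nearest-neighbor comes from kernels like the polynomial one above.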

Dynamic Training Subset Selection for Supervised Learning in Genetic Programming

This paper describes how to reduce the number of function-tree evaluations by selecting a small subset of the training data set on which to actually carry out the GP algorithm.
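The selection step can be sketched as fitness-proportional sampling over per-pattern difficulty and age (a simplified stand-in for the paper's dynamic subset selection, with hypothetical names):

```python
import random

def select_subset(n_patterns, difficulty, age, size, rng):
    """Sample a training subset with probability proportional to
    difficulty + age, so hard and long-unseen patterns are favored.
    Requires size <= n_patterns."""
    weights = [difficulty[i] + age[i] + 1.0 for i in range(n_patterns)]
    chosen = set()
    while len(chosen) < size:
        r = rng.random() * sum(weights)   # roulette-wheel draw
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if acc >= r:
                chosen.add(i)
                break
    return sorted(chosen)

rng = random.Random(0)
subset = select_subset(100, [0.0] * 100, [0.0] * 100, 10, rng)
```

Each GP generation then evaluates function trees only on the sampled subset, updating difficulty (how often a pattern is misclassified) and age (generations since last selection) before the next draw.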

Training genetic programming on half a million patterns: an example from anomaly detection

The hierarchical RSS-DSS algorithm is introduced for dynamically filtering large datasets based on the concepts of training pattern age and difficulty, while utilizing a data structure to facilitate…

Distance Metric Learning for Large Margin Nearest Neighbor Classification

This paper shows how to learn a Mahalanobis distance metric for kNN classification from labeled examples in a globally integrated manner, and finds that metrics trained in this way lead to significant improvements in kNN classification.
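The learned metric has the form d²(x, y) = (x − y)ᵀM(x − y) with M = LᵀL, i.e. a Euclidean distance after a linear transform L. A minimal sketch of evaluating such a metric (pure Python; L is assumed to be given, whereas the paper learns it from labeled examples):

```python
def mahalanobis2(L, x, y):
    """Squared Mahalanobis distance d^2 = ||L(x - y)||^2,
    equivalently (x - y)^T M (x - y) with M = L^T L."""
    diff = [a - b for a, b in zip(x, y)]
    proj = [sum(row[j] * diff[j] for j in range(len(diff))) for row in L]
    return sum(v * v for v in proj)

# With L = identity this reduces to the squared Euclidean distance;
# a non-trivial L stretches the directions that matter for kNN labels.
d2 = mahalanobis2([[1.0, 0.0], [0.0, 1.0]], (1.0, 2.0), (4.0, 6.0))
```

Plugging this distance into a standard kNN classifier is all that changes at prediction time; the learning happens entirely in the choice of L.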

Kernel Methods for Pattern Analysis

This book provides an easy introduction for students and researchers to the growing field of kernel-based pattern analysis, demonstrating with examples how to handcraft an algorithm or a kernel for a new specific application, and covering all the necessary conceptual and mathematical tools to do so.