Genetic Programming for Kernel-Based Learning with Co-evolving Subsets Selection

Christian Gagné, Marc Schoenauer, Michèle Sebag, Marco Tomassini
Support Vector Machines (SVMs) are well-established Machine Learning (ML) algorithms. They rely on the fact that i) linear learning can be formalized as a well-posed optimization problem; ii) non-linear learning can be reduced to linear learning thanks to the kernel trick and the mapping of the initial search space onto a high-dimensional feature space. The kernel is designed by the ML expert and governs the efficiency of the SVM approach. In this paper, a new approach for the automatic…
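The kernel trick mentioned above can be made concrete with a small sketch (illustrative, not from the paper): a degree-2 polynomial kernel computes an inner product in a higher-dimensional feature space without ever constructing that space explicitly.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def poly_kernel(x, y, c=1.0):
    """Degree-2 polynomial kernel: k(x, y) = (x . y + c)^2."""
    return (dot(x, y) + c) ** 2

def phi(x, c=1.0):
    """Explicit feature map for the degree-2 polynomial kernel on 2-D input."""
    x1, x2 = x
    return [x1 * x1, x2 * x2,
            math.sqrt(2) * x1 * x2,
            math.sqrt(2 * c) * x1, math.sqrt(2 * c) * x2,
            c]

x, y = (1.0, 2.0), (0.5, -1.0)
# Kernel evaluation in input space matches the inner product in feature space.
print(abs(poly_kernel(x, y) - dot(phi(x), phi(y))) < 1e-9)  # prints True
```

The SVM only ever needs kernel evaluations, so the six-dimensional `phi` (which the kernel implicitly uses) never has to be materialized.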

Learning SVM with Complex Multiple Kernels Evolved by Genetic Programming

The numerical experiments show that SVMs involving the evolutionary complex multiple kernels perform better than classic simple kernels, and that on the considered data sets the new multiple kernels outperform both the cLMK and eLMK linear multiple kernels.

Kernel evolution for support vector classification

A novel approach is proposed that uses genetic programming (GP) to design domain-specific, optimal kernel functions for support vector classification (SVC) and automatically adjusts their parameters.

Improving classification performance of Support Vector Machine by genetically optimising kernel shape and hyper-parameters

Numerical experiments show that the SVM algorithm involving the authors' evolutionary kernel of kernels (eKoK) performs better than well-known classic kernels with optimised parameters, as well as a state-of-the-art convex linear kernel combination and an evolutionary linear kernel combination.

Evolutionary Optimization of Least-Squares Support Vector Machines

Empirical studies show that this model indeed increases the generalization performance of the machine, although this improvement comes at a high computational cost, which suggests that the approach may be justified primarily in applications where prediction errors can have severe consequences, such as in medical settings.

Evolutionary combination of kernels for nonlinear feature transformation

Evolving kernels for support vector machine classification

A new algorithm, called KGP, is introduced, which finds near-optimal kernels using strongly typed genetic programming and principled kernel closure properties, and shows wide applicability.
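The kernel closure properties that KGP exploits can be sketched as follows (a hypothetical illustration, not the KGP implementation): sums, products, and positive scalings of valid kernels are again valid kernels, so a GP tree built only from these operations always evaluates to a valid kernel.

```python
import math
import random

# Base kernels known to be positive semi-definite.
def rbf(x, y, gamma=0.5):
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def linear(x, y):
    return sum(a * b for a, b in zip(x, y))

# Closure properties: if k1, k2 are kernels, so are k1 + k2, k1 * k2, c * k1 (c > 0).
def k_sum(k1, k2):  return lambda x, y: k1(x, y) + k2(x, y)
def k_prod(k1, k2): return lambda x, y: k1(x, y) * k2(x, y)
def k_scale(c, k):  return lambda x, y: c * k(x, y)

def random_kernel(depth=2):
    """Grow a random kernel expression tree using only closed operations,
    so the result is guaranteed to be a valid kernel."""
    if depth == 0:
        return random.choice([rbf, linear])
    op = random.choice([k_sum, k_prod])
    return op(random_kernel(depth - 1), random_kernel(depth - 1))

random.seed(0)
k = random_kernel()
print(k((1.0, 2.0), (1.0, 2.0)) >= 0)                  # self-similarity is nonnegative
print(k((1.0, 2.0), (3.0, 4.0)) == k((3.0, 4.0), (1.0, 2.0)))  # symmetry
```

Restricting the GP function set to closed operations is what "principled kernel closure properties" buys: no candidate in the population can be an invalid kernel.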

Multi-optimization improves genetic programming generalization ability

This paper motivates and empirically shows that GP using Pareto multi-optimization on the training set has a remarkably higher generalization ability than canonical or standard GP, besides counteracting bloat more efficiently and maintaining higher diversity within the population.

GEEK: Grammatical Evolution for Automatically Evolving Kernel Functions

GEEK is proposed, a Grammatical Evolution approach for automatically Evolving Kernel functions that uses a grammar composed of simple mathematical operations extracted from known kernels and is also able to optimize some of their parameters.

Creation of Specific-to-Problem Kernel Functions for Function Approximation

An initial framework is studied for creating specific-to-problem kernels for application to regression models, using a modified version of a non-parametric noise estimator.

The Genetic Kernel Support Vector Machine: Description and Evaluation

This paper proposes a classification technique, called the Genetic Kernel SVM (GK SVM), that uses Genetic Programming to evolve a kernel for an SVM classifier.

Evolutionary tuning of multiple SVM parameters

An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.

Experiments with a New Boosting Algorithm

This paper describes experiments carried out to assess how well AdaBoost, with and without pseudo-loss, performs on real learning problems, and compares boosting to Breiman's "bagging" method when both are used to aggregate various classifiers.
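The re-weighting scheme those experiments evaluate can be illustrated with a minimal AdaBoost loop over 1-D decision stumps (a sketch only; the paper's pseudo-loss variant and benchmark setup are not reproduced here):

```python
import math

def stump(threshold, polarity):
    return lambda x: polarity if x > threshold else -polarity

def adaboost(xs, ys, thresholds, rounds=5):
    """Minimal AdaBoost with 1-D decision stumps as weak learners."""
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        # Pick the stump with the lowest weighted error.
        best, best_err = None, float("inf")
        for t in thresholds:
            for p in (1, -1):
                h = stump(t, p)
                err = sum(wi for wi, x, y in zip(w, xs, ys) if h(x) != y)
                if err < best_err:
                    best, best_err = h, err
        alpha = 0.5 * math.log((1 - best_err) / max(best_err, 1e-12))
        ensemble.append((alpha, best))
        # Re-weight: boost the weight of misclassified points.
        w = [wi * math.exp(-alpha * y * best(x)) for wi, x, y in zip(w, xs, ys)]
        s = sum(w)
        w = [wi / s for wi in w]
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) > 0 else -1

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [-1, -1, 1, 1, -1, 1]            # not separable by any single stump
clf = adaboost(xs, ys, thresholds=[0.5, 1.5, 2.5, 3.5, 4.5])
print(sum(clf(x) == y for x, y in zip(xs, ys)), "of", len(xs), "correct")
```

No single stump can fit this labeling, but the weighted vote over a few boosting rounds classifies all six points correctly.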

Margin based feature selection - theory and algorithms

This paper introduces a margin-based feature selection criterion, applies it to measure the quality of sets of features, devises novel selection algorithms for multi-class classification problems, and provides a theoretical generalization bound.

Kernel Nearest-Neighbor Algorithm

Experiments show that the kernel nearest-neighbor algorithm is more powerful than the conventional nearest-neighbor algorithm and can compete with SVMs.
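The idea behind kernel nearest-neighbor is that distances can be computed in the kernel-induced feature space via ||phi(x) - phi(y)||^2 = k(x,x) - 2 k(x,y) + k(y,y). A small sketch (illustrative, not the paper's implementation):

```python
import math

def rbf(x, y, gamma=1.0):
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def kernel_distance_sq(k, x, y):
    """Squared distance between phi(x) and phi(y) in the feature space
    induced by kernel k, computed without ever forming phi."""
    return k(x, x) - 2.0 * k(x, y) + k(y, y)

def kernel_1nn(k, train, query):
    """Classify query by its nearest training point under the kernel distance."""
    return min(train, key=lambda xy: kernel_distance_sq(k, xy[0], query))[1]

train = [((0.0, 0.0), "a"), ((1.0, 1.0), "b")]
print(kernel_1nn(rbf, train, (0.1, 0.2)))  # prints a
```

Swapping in a non-linear kernel changes the geometry in which "nearest" is measured, which is what lets the method outperform plain nearest-neighbor.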

Dynamic Training Subset Selection for Supervised Learning in Genetic Programming

This paper describes how to reduce the number of function-tree evaluations by selecting a small subset of the training data set on which to actually carry out the GP algorithm.
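One way to sketch such a subset-selection step (the weighting scheme and parameters below are illustrative, not the paper's exact formulas): sample training cases with probability growing with their difficulty and with the time since they were last evaluated.

```python
import random

def select_subset(dataset, difficulty, age, size, d_exp=1.0, a_exp=1.0):
    """Dynamic subset selection sketch: sample cases with weight
    difficulty^d_exp + age^a_exp, so harder and longer-unseen cases
    are more likely to be re-evaluated this generation."""
    weights = [difficulty[i] ** d_exp + age[i] ** a_exp
               for i in range(len(dataset))]
    chosen = random.choices(range(len(dataset)), weights=weights, k=size)
    # Age every case, then reset the age of the cases just selected.
    for i in range(len(dataset)):
        age[i] += 1
    for i in chosen:
        age[i] = 0
    return [dataset[i] for i in chosen]

random.seed(1)
data = list(range(100))
difficulty = [1.0] * 100
difficulty[7] = 50.0          # one case the population keeps getting wrong
age = [1] * 100
subset = select_subset(data, difficulty, age, size=10)
print(len(subset))  # prints 10
```

Each GP generation then evaluates fitness only on `subset`, cutting the number of function-tree evaluations roughly by the ratio of subset size to dataset size.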

Training genetic programming on half a million patterns: an example from anomaly detection

The hierarchical RSS-DSS algorithm is introduced for dynamically filtering large datasets based on the concepts of training pattern age and difficulty, while utilizing a data structure to facilitate…

Kernel Methods for Pattern Analysis

This book provides an easy introduction for students and researchers to the growing field of kernel-based pattern analysis, demonstrating with examples how to handcraft an algorithm or a kernel for a new specific application, and covering all the necessary conceptual and mathematical tools to do so.

Genetic programming - on the programming of computers by means of natural selection

J. Koza, Complex adaptive systems, 1993
This book discusses the evolution of architecture, primitive functions, terminals, sufficiency, and closure, and the role of representation and the lens effect in genetic programming.