• Corpus ID: 221655059

Simple Simultaneous Ensemble Learning in Genetic Programming

  • M. Virgolin
  • Published 13 September 2020
  • Computer Science
  • ArXiv
Learning ensembles by bagging can substantially improve the generalization performance of low-bias high-variance estimators, including those evolved by Genetic Programming (GP). Yet, the best way to learn ensembles in GP remains to be determined. This work attempts to fill the gap between existing GP ensemble learning algorithms, which are often either simple but expensive, or efficient but complex. We propose a new algorithm that is both simple and efficient, named Simple Simultaneous Ensemble… 
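The abstract centers on bagging: training each ensemble member on a bootstrap resample of the data and averaging the members' predictions, which reduces the variance of low-bias high-variance estimators. A minimal, library-free sketch of that idea (the `fit` callback and all names here are illustrative, not the paper's algorithm or interface):

```python
import random

def bagging_predict(train, x, fit, n_estimators=10, seed=0):
    """Minimal bagging sketch: train each estimator on a bootstrap
    sample of `train`, then average the estimators' predictions at x.

    `fit` is any function mapping a dataset to a predictor (e.g., a
    GP run returning an evolved expression); the names and signature
    are assumptions for illustration only.
    """
    rng = random.Random(seed)
    preds = []
    for _ in range(n_estimators):
        # Bootstrap: sample len(train) points with replacement.
        boot = [train[rng.randrange(len(train))] for _ in train]
        model = fit(boot)
        preds.append(model(x))
    # Aggregate by averaging (regression-style bagging).
    return sum(preds) / len(preds)
```

In the paper's setting the expensive part is that a naive approach runs one full GP evolution per bootstrap sample; the proposed algorithm instead learns the ensemble members simultaneously within a single run.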

An automated ensemble learning framework using genetic programming for image classification

This study is the first to use GP to automatically generate ensembles for image classification, using a novel program structure, a new function set, and a new terminal set developed in EGP.

Genetically Evolved Trees Representing Ensembles

In this paper, the first extensive study of GEMS, the representation language is extended to include tests that partition the data, further increasing flexibility, and several micro techniques are applied to reduce overfitting.

Evolving Diverse Ensembles Using Genetic Programming for Classification With Unbalanced Data

Experimental results on six (binary) class imbalance problems show that the evolved ensembles outperform their individual members, as well as single-predictor methods such as canonical GP, naive Bayes, and support vector machines, on highly unbalanced tasks.

Ensemble Genetic Programming

Inspired by the success of top state-of-the-art machine learning methods, a new Genetic Programming method called Ensemble GP is developed, which achieves exceptionally good generalization results on a particularly hard problem where none of the other methods succeeded.

Scalable genetic programming by gene-pool optimal mixing and input-space entropy-based building-block learning

On a set of well-known benchmark problems, GP-GOMEA outperforms standard GP while being on par with more recently introduced, state-of-the-art EAs; the work also introduces Input-space Entropy-based Building-block Learning (IEBL), a novel approach to identifying and encapsulating relevant building blocks (subroutines) into new terminals and functions.

Improving Model-Based Genetic Programming for Symbolic Regression of Small Expressions

This article shows that the non-uniform distribution of genotypes in GP populations negatively biases linkage learning (LL), proposes a method to correct this, and finds that GOMEA is a promising new approach to SR.

On Explaining Machine Learning Models by Evolving Crucial and Compact Features

Benchmarking state-of-the-art symbolic regression algorithms

This paper conceptually and experimentally compares several representative state-of-the-art symbolic regression algorithms, including GPTIPS, FFX, and EFS, applied as off-the-shelf, ready-to-use techniques in the field of SR.

Multi-objective gene-pool optimal mixing evolutionary algorithms

This work modifies the linkage learning procedure and the variation operator of GOMEAs to better suit the goal of finding the whole Pareto-optimal front rather than a single best solution, and constructs a multi-objective GOMEA (MO-GOMEA).

Popular Ensemble Methods: An Empirical Study

This work suggests that most of the gain in an ensemble's performance comes from the first few classifiers combined; however, relatively large gains can be seen with up to 25 classifiers when boosting decision trees.