Investigating the parameter space of evolutionary algorithms

@article{Sipper2018InvestigatingTP,
  title={Investigating the parameter space of evolutionary algorithms},
  author={Moshe Sipper and Weixuan Fu and Karuna Ahuja and Jason H. Moore},
  journal={BioData Mining},
  year={2018},
  volume={11}
}
Evolutionary computation (EC) has been widely applied to biological and biomedical data. The practice of EC involves tuning many parameters, such as population size, generation count, selection size, and crossover and mutation rates. Through an extensive series of experiments over multiple evolutionary algorithm implementations and 25 problems, we show that parameter space tends to be rife with viable parameters, at least for the problems studied herein. We discuss the implications of…
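The parameters named in the abstract map directly onto the knobs of a simple generational genetic algorithm. The following is a minimal sketch only, not the paper's implementation: the OneMax toy problem, the tournament-style selection, and all default values are illustrative assumptions.

```python
import random

def evolve(pop_size=100, generations=50, tournament_size=3,
           crossover_rate=0.9, mutation_rate=0.01, genome_len=20, seed=0):
    """Minimal generational GA on OneMax (maximize the number of 1-bits)."""
    rng = random.Random(seed)
    fitness = lambda g: sum(g)  # OneMax: fitness is the count of 1s
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            # Selection size: the fittest of `tournament_size` random individuals.
            p1 = max(rng.sample(pop, tournament_size), key=fitness)
            p2 = max(rng.sample(pop, tournament_size), key=fitness)
            # One-point crossover, applied with probability `crossover_rate`.
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, genome_len)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Per-gene bit-flip mutation with probability `mutation_rate`.
            child = [1 - b if rng.random() < mutation_rate else b for b in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = evolve()
```

Each keyword argument is one of the parameters the paper's experiments vary; sweeping them over a grid and recording the returned fitness is the kind of parameter-space exploration the abstract describes.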
On the analysis of hyper-parameter space for a genetic programming system with iterated F-Race
TLDR
This work builds on recent findings and explores the hyper-parameter space of a specific GP system called neat-GP, which controls model size, using three variants of the iterated F-Race algorithm, here applied to GP for the first time.
Evolutionary computation: an investigation of parameter space
TLDR
It is shown that parameter space tends to be rife with viable parameters, somewhat in contrast with common lore, through an extensive series of experiments over multiple evolutionary algorithm implementations and 25 problems.
What Can We Learn from Multi-Objective Meta-Optimization of Evolutionary Algorithms in Continuous Domains?
TLDR
It is shown that by using a multi-objective genetic algorithm to tune an EA, it is possible not only to find good parameter sets considering several objectives at once but also to derive generalizable results that can provide guidelines for designing EA-based applications.
Correction to: Investigating the parameter space of evolutionary algorithms
Following publication of the original article [1], an error was reported in one of the experiments.
Self-Adaptation of Meta-Parameters for Lamarckian-Inherited Neuromodulated Neurocontrollers in the Pursuit-Evasion Game
TLDR
It is shown that self-adaptation can be used to automatically tune and control meta-parameters during evolution, and that under some circumstances self-adaptation may lead to improved performance of the evolutionary algorithm.
Solution and Fitness Evolution (SAFE): A Study of Multiobjective Problems
TLDR
An investigation of SAFE's adaptation and application to multiobjective problems, wherein candidate objective functions explore different weightings of each objective, suggests that SAFE, and the concept of coevolving solutions and objective functions, can identify a similar set of optimal multiobjective solutions without explicitly employing a Pareto front for fitness calculation and parent selection.
Universal Learning Machine with Genetic Programming
TLDR
Experimental evidence is presented that UGP is able to improve the models produced by each of the studied machine learning algorithms in isolation, on three complex real-life problems.
pyGOURGS - global optimization of n-ary tree representable problems using uniform random global search
TLDR
This software is devised to perform uniform random global search, also known as pure random search, on n-ary tree representable problems; the challenge lies in creating a system that enumerates all possible solutions and can select randomly from this solution space.
Metaheuristics and Swarm Methods: A Discussion on Their Performance and Applications
TLDR
This chapter presents a discussion centered on several observable characteristics of nature-inspired methods and their influence on overall performance, and surveys some of the most important areas of science and technology where nature-inspired algorithms have found applications.
Genetic Algorithms for the Optimization of Diffusion Parameters in Content-Based Image Retrieval
TLDR
This work proposes to use genetic algorithms to find, for each dataset, the setting of all the diffusion parameters that is optimal with respect to retrieval performance; the proposed approach is also faster than the reference methods.

References

Showing 1–10 of 39 references
Comparing parameter tuning methods for evolutionary algorithms
  • S. Smit, A. Eiben
  • Computer Science
  • 2009 IEEE Congress on Evolutionary Computation
  • 2009
TLDR
The most important issues related to tuning EA parameters are discussed, a number of existing tuning methods are described, and a modest experimental comparison among them is presented, hopefully inspiring fellow researchers to further work.
Parameter Setting in Evolutionary Algorithms
One of the main difficulties of applying an evolutionary algorithm (or, as a matter of fact, any heuristic method) to a given problem is to decide on an appropriate set of parameter values. Typically…
Parameter Tuning of Evolutionary Algorithms: Generalist vs. Specialist
TLDR
It is demonstrated that REVAC can also tune an EA to a set of problems (a whole test suite) and obtain robust, rather than problem-tailored, parameter values, yielding an EA that is a ‘generalist’ rather than a ‘specialist’.
Evolutionary Algorithm Parameters and Methods to Tune Them
In this chapter we discuss the notion of Evolutionary Algorithm (EA) parameters and propose a distinction between EAs and EA instances, based on the type of parameters used to specify their details.
Adaptation in evolutionary computation: a survey
TLDR
This paper develops a classification of adaptation on the basis of the mechanisms used, and of the level at which adaptation operates within the evolutionary algorithm.
Logistic regression for parameter tuning on an evolutionary algorithm
TLDR
This paper proposes the utilization of logistic regression, a statistical tool, for parameter tuning of an evolutionary algorithm called ProtoG; the algorithm is applied to the traveling salesman problem.
Parameter Setting in EAs: a 30 Year Perspective
  • K. De Jong
  • Computer Science, Physics
  • Parameter Setting in Evolutionary Algorithms
  • 2007
TLDR
This chapter provides a historical overview of this issue, discussing both manual and automated approaches to parameter setting in evolutionary algorithms, and suggesting when a particular strategy might be appropriate.
Parameter Control in Evolutionary Algorithms
TLDR
A classification of different approaches based on a number of complementary features is provided, with special attention paid to setting parameters on-the-fly, which has the potential of adjusting the algorithm to the problem while solving it.
Meta-evolutionary programming
  • D. Fogel, L. Fogel, J. W. Atmar
  • Computer Science
  • [1991] Conference Record of the Twenty-Fifth Asilomar Conference on Signals, Systems & Computers
  • 1991
TLDR
The authors address incorporating a meta-level evolutionary programming that can simultaneously evolve optimal settings for these parameters while the search for the appropriate extrema is being conducted, and the results indicate the suitability of such a procedure.
Evolutionary programming made faster
TLDR
A "fast EP" (FEP) is proposed which uses a Cauchy instead of Gaussian mutation as the primary search operator and is proposed and tested empirically, showing that IFEP performs better than or as well as the better of FEP and CEP for most benchmark problems tested. Expand