Lazy parameter tuning and control: choosing all parameters randomly from a power-law distribution
@inproceedings{Antipov2021LazyPT, title={Lazy parameter tuning and control: choosing all parameters randomly from a power-law distribution}, author={Denis Antipov and Maxim Buzdalov and Benjamin Doerr}, booktitle={Proceedings of the Genetic and Evolutionary Computation Conference}, year={2021} }
Most evolutionary algorithms have multiple parameters and their values drastically affect the performance. Due to the often complicated interplay of the parameters, setting these values right for a particular problem is a challenging task. This task becomes even more complicated when the optimal parameter values change significantly during the run of the algorithm since then a dynamic parameter choice is necessary. In this work, we propose a lazy but effective solution, namely choosing all…
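To make the idea from the abstract concrete, here is a minimal Python sketch of such a lazy parameter choice, assuming a discrete power-law distribution on {1, …, n/2}; the function name power_law_sample, the exponent beta = 2.5, and the coupling of the mutation rate to λ are illustrative assumptions, not the paper's exact setup.

```python
import random

def power_law_sample(u_max: int, beta: float = 2.5) -> int:
    """Sample k from {1, ..., u_max} with Pr[k] proportional to k^(-beta)."""
    ks = range(1, u_max + 1)
    weights = [k ** (-beta) for k in ks]
    return random.choices(ks, weights=weights)[0]

# "Lazy" dynamic parameter control: rather than tuning the parameters,
# draw fresh values from the power-law distribution in every iteration.
n = 100  # problem size (illustrative)
for _ in range(10):
    lam = power_law_sample(n // 2)  # e.g. offspring population size lambda
    mutation_rate = lam / n         # e.g. a parameter coupled to lambda
    # ... run one iteration of the evolutionary algorithm with these values ...
```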
21 Citations
Larger Offspring Populations Help the $(1 + (\lambda, \lambda))$ Genetic Algorithm to Overcome the Noise
- Computer Science
- 2023
On several classic benchmark problems, this difficulty is shown not to arise; surprisingly, in many situations this algorithm is even more robust to noise than the $(1+\lambda)$ EA.
How the Move Acceptance Hyper-Heuristic Copes With Local Optima: Drastic Differences Between Jumps and Cliffs
- Computer Science, ArXiv
- 2023
It is proved that for any choice of the MAHH selection parameter $p$, the expected runtime of the MAHH on a jump function with gap size $m = o(n^{1/2})$ is at least $\Omega(n^{2m-1} / (2m-1)!)$, which renders the MAHH much slower than simple elitist evolutionary algorithms with their typical $O(n^m)$ runtime.
Tight Runtime Bounds for Static Unary Unbiased Evolutionary Algorithms on Linear Functions
- Computer Science, Mathematics, ArXiv
- 2023
It is shown that Witt's result generalizes when standard bit mutation is replaced by an arbitrary unbiased mutation operator; this includes some, but not all, of the heavy-tailed mutation operators used in fast genetic algorithms.
Theoretical and Empirical Analysis of Parameter Control Mechanisms in the (1 + (λ, λ)) Genetic Algorithm
- Computer Science, ACM Trans. Evol. Learn. Optim.
- 2022
This work presents the first proof of a bimodal parameter landscape for the runtime of an evolutionary algorithm on a multimodal problem, and shows that the self-adjusting (1 + (λ, λ)) GA performs as well as the (1 + 1) EA with the optimal mutation rate and as well as evolutionary algorithms with heavy-tailed mutation, apart from a small polynomial overhead.
The $(1+(\lambda,\lambda))$ Genetic Algorithm on the Vertex Cover Problem: Crossover Helps Leaving Plateaus
- Computer Science, 2022 IEEE Congress on Evolutionary Computation (CEC)
- 2022
This work found that the recently proposed $(1+(\lambda, \lambda))$ genetic algorithm solves certain instances of this problem, including ones that are hard for heuristic solvers, much faster than simpler mutation-only evolutionary algorithms.
Runtime Analysis for Permutation-based Evolutionary Algorithms
- Computer Science, ArXiv
- 2022
A rigorous runtime analysis of the permutation-based $(1+1)$ EA proposed by Scharnow, Tinnefeld, and Wegener (2004) on the analogues of the LeadingOnes and Jump benchmarks shows that a heavy-tailed version of the scramble operator leads to a speed-up of order $m^{\Theta(m)}$ on jump functions with jump size $m$.
On optimal static and dynamic parameter choices for fixed-target optimization
- Computer Science, GECCO
- 2022
This paper investigates static and dynamic parameter choices in fixed-target settings, using a mixture of exact theory-driven computations and experimental evaluation, and finds a few remarkably generic trends, some of which may explain a number of misconceptions found in evolutionary computation.
Fast non-elitist evolutionary algorithms with power-law ranking selection
- Computer Science, GECCO
- 2022
This work shows that a non-elitist EA with power-law ranking selection leads to fast runtimes on easy benchmark problems while maintaining the capability of escaping certain local optima where elitist EAs spend exponential time in expectation; it also derives the error threshold and shows extreme tolerance to high mutation rates.
The (1 + (λ, λ)) global SEMO algorithm
- Computer Science, GECCO
- 2022
The (1 + (λ, λ)) genetic algorithm is a recently proposed single-objective evolutionary algorithm with several interesting properties. We show that its main working principle, mutation with a high…
60 References
Fast Mutation in Crossover-Based Algorithms
- Computer Science, Algorithmica
- 2022
For a genetic algorithm optimizing the OneMax benchmark function, it is shown that a heavy-tailed mutation rate achieves a linear runtime, which is asymptotically faster than what can be obtained with any static mutation rate.
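For intuition, the following is a generic sketch of a heavy-tailed mutation rate, not the exact operator analyzed in the cited paper: standard bit mutation is run with rate α/n, where α is drawn from a power-law distribution (the function name and the exponent beta = 1.5 are illustrative assumptions).

```python
import random

def heavy_tailed_bit_mutation(x: list[int], beta: float = 1.5) -> list[int]:
    """Standard bit mutation with rate alpha/n, where alpha is power-law distributed."""
    n = len(x)
    alphas = range(1, n // 2 + 1)
    weights = [a ** (-beta) for a in alphas]
    alpha = random.choices(alphas, weights=weights)[0]  # heavy-tailed strength
    rate = alpha / n
    # Flip each bit independently with probability alpha/n.
    return [1 - bit if random.random() < rate else bit for bit in x]
```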
From black-box complexity to designing new genetic algorithms
- Computer Science, Theor. Comput. Sci.
- 2015
Does Comma Selection Help to Cope with Local Optima?
- Computer Science, Algorithmica
- 2022
One hope when using non-elitism in evolutionary computation is that the ability to abandon the current-best solution aids leaving local optima. To improve our understanding of this mechanism, we…
Optimal Static and Self-Adjusting Parameter Choices for the (1+(λ,λ)) Genetic Algorithm
- Computer Science, Algorithmica
- 2017
The (1+(λ,λ)) genetic algorithm…
A First Runtime Analysis of the NSGA-II on a Multimodal Problem
- Computer Science, PPSN
- 2022
Overall, this work shows that the NSGA-II copes with the local optima of the OneJumpZeroJump problem at least as well as the global SEMO algorithm.
Stagnation Detection meets Fast Mutation
- Computer Science, EvoCOP
- 2022
This work proposes a mutation strategy that combines ideas of both mechanisms, shows that it also obtains the best possible probability of finding a single distant solution, and demonstrates that it can outperform both the stagnation-detection approach and fast mutation.
Automatic adaptation of hypermutation rates for multimodal optimisation
- Computer Science, FOGA
- 2021
This paper performs rigorous time complexity analyses for standard multimodal benchmark functions with significant characteristics and proves that the proposed algorithm can learn to adapt the mutation rate appropriately such that both ageing and hypermutation are effective when they are most useful for escaping local optima.
On crossing fitness valleys with majority-vote crossover and estimation-of-distribution algorithms
- Computer Science, FOGA
- 2021
This paper investigates variants of the Jump function where the gap is shifted and appears in the middle of the typical search trajectory, and derives limits on the gap size allowing efficient runtimes for the EDA.
A rigorous runtime analysis of the 2-MMASib on jump functions: ant colony optimizers can cope well with local optima
- Computer Science, GECCO
- 2021
This first runtime analysis of a basic ACO algorithm on a classic multimodal benchmark shows that simple ACO algorithms can cope much better with local optima than many evolutionary algorithms, which need $\Omega(n^k)$ time.