Fast re-optimization via structural diversity

@inproceedings{Doerr2019FastRV,
  title={Fast re-optimization via structural diversity},
  author={Benjamin Doerr and Carola Doerr and Frank Neumann},
  booktitle={Proceedings of the Genetic and Evolutionary Computation Conference},
  year={2019}
}
When a problem instance is perturbed by a small modification, one would hope to find a good solution for the new instance by building on a known good solution for the previous one. Via a rigorous mathematical analysis, we show that evolutionary algorithms, despite usually being robust problem solvers, can have unexpected difficulties solving such re-optimization problems. When started with a random Hamming neighbor of the optimum, the (1+1) evolutionary algorithm takes Ω(n²) time to optimize…
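To make the setting concrete, below is a minimal Python sketch of the standard (1+1) EA (bit-wise mutation with rate 1/n, elitist selection) started from a random Hamming neighbor of the optimum, as described in the abstract. The LeadingOnes benchmark, the problem size, and the evaluation budget are placeholder choices for illustration; the paper's exact benchmark and analysis setup may differ.

```python
import random

def one_plus_one_ea(fitness, x, budget=500_000):
    """Standard (1+1) EA: flip each bit independently with prob. 1/n,
    accept the offspring if it is at least as good as the parent."""
    n = len(x)
    fx = fitness(x)
    for t in range(1, budget + 1):
        y = [b ^ (random.random() < 1.0 / n) for b in x]
        fy = fitness(y)
        if fy >= fx:
            x, fx = y, fy
        if fx == n:          # optimum reached (assumes max fitness value n)
            return t
    return None              # budget exhausted

# Re-optimization setup from the abstract: start from a random Hamming
# neighbor of the optimum, i.e. the all-ones string with one bit flipped.
n = 100
leading_ones = lambda z: next((i for i, b in enumerate(z) if b == 0), n)
start = [1] * n
start[random.randrange(n)] = 0
print(one_plus_one_ea(leading_ones, start))
```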
First Steps Towards a Runtime Analysis When Starting With a Good Solution
TLDR
It is shown that different algorithms profit to very different degrees from a better initialization, that the optimal parameterization of the algorithm can depend strongly on the quality of the initial solutions, and that self-adjusting and randomized heavy-tailed parameter choices can be profitable.
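For context on the "randomized heavy-tailed parameter choices" mentioned above, the following sketch shows one common form of such an operator: the mutation strength k is drawn from a power-law distribution, so small steps dominate while large jumps keep non-negligible probability. The exponent beta = 1.5 and the range {1, ..., n/2} are illustrative assumptions, not necessarily the cited paper's exact configuration.

```python
import random

def power_law_choice(upper, beta=1.5):
    """Sample k in {1, ..., upper} with Pr[k] proportional to k^(-beta),
    i.e. a heavy-tailed (power-law) distribution."""
    weights = [k ** (-beta) for k in range(1, upper + 1)]
    return random.choices(range(1, upper + 1), weights=weights)[0]

def heavy_tailed_mutation(x, beta=1.5):
    """Flip exactly k distinct bits, where k is drawn from the power law above."""
    n = len(x)
    k = power_law_choice(n // 2, beta)
    y = list(x)
    for i in random.sample(range(n), k):
        y[i] ^= 1
    return y

print(heavy_tailed_mutation([0] * 20))
```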
Theory of Iterative Optimization Heuristics: From Black-Box Complexity over Algorithm Design to Parameter Control
TLDR
Improved bounds for the black-box complexity of OneMax and LeadingOnes, the two best-known benchmark problems in the theory of IOHs, are derived, and it is demonstrated how insights obtained from such black-box complexity studies can inspire the design of efficient optimization techniques.
Time Complexity Analysis of Randomized Search Heuristics for the Dynamic Graph Coloring Problem
TLDR
It is shown that tailoring mutation operators to parts of the graph where changes have occurred can significantly reduce the expected reoptimization time, but that tailored algorithms cannot prevent exponential times in settings where the original algorithm is inefficient.
A tight runtime analysis for the (1 + (λ, λ)) GA on leadingones
TLDR
A rigorous runtime analysis of the (1 + (λ, λ)) evolutionary algorithm with standard parameter settings is conducted, and it is proved that for any dynamic choice of λ the bound of Θ(n²) fitness evaluations still holds.
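For readers unfamiliar with the algorithm, here is a compact sketch of the (1 + (λ, λ)) GA with the standard parameter coupling (mutation rate λ/n, crossover bias 1/λ): a mutation phase flipping the same random number ℓ ~ Bin(n, λ/n) of bits in each of λ offspring, a crossover phase biased toward the mutation winner, and elitist selection. The fitness function, λ = 10, and the evaluation-counting convention are illustrative assumptions; the cited analysis treats static and dynamic choices of λ more carefully.

```python
import random

def one_plus_ll_ga(fitness, n, lam=10, budget=200_000):
    """Sketch of the (1 + (lambda, lambda)) GA with standard parameters
    p = lambda/n (mutation rate) and c = 1/lambda (crossover bias).
    Assumes a fitness function whose maximum value is n (e.g. LeadingOnes)."""
    p, c = lam / n, 1.0 / lam
    x = [random.randint(0, 1) for _ in range(n)]
    fx, evals = fitness(x), 1
    while fx < n and evals < budget:
        # Mutation phase: draw ell ~ Bin(n, p), then flip exactly ell
        # distinct, uniformly chosen bits in each of lam offspring.
        ell = sum(random.random() < p for _ in range(n))
        mutants = []
        for _ in range(lam):
            y = list(x)
            for i in random.sample(range(n), ell):
                y[i] ^= 1
            mutants.append((fitness(y), y))
        evals += lam
        fxp, xp = max(mutants, key=lambda t: t[0])
        # Crossover phase: take each bit from the mutation winner with prob. c,
        # otherwise from the parent; keep the best of lam such offspring.
        crossovers = []
        for _ in range(lam):
            y = [b2 if random.random() < c else b1 for b1, b2 in zip(x, xp)]
            crossovers.append((fitness(y), y))
        evals += lam
        fy, y = max(crossovers, key=lambda t: t[0])
        # Elitist selection: accept the crossover winner if not worse.
        if fy >= fx:
            x, fx = y, fy
    return evals

leading_ones = lambda z: next((i for i, b in enumerate(z) if b == 0), len(z))
print(one_plus_ll_ga(leading_ones, 100))
```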
A gentle introduction to theory (for non-theoreticians)
The Dynamic Travelling Thief Problem: Benchmarks and Performance of Evolutionary Algorithms
TLDR
A number of scenarios based on the Travelling Thief Problem are defined to enable research on the effect of dynamic changes to sub-components, and the experiments show that, depending on the instance, the magnitude of the change, and the algorithms in the portfolio, it is preferable either to restart the optimisation from scratch or to continue with the previously valid solutions.
Working principles of binary differential evolution
TLDR
A first fundamental analysis of the working principles of binary differential evolution (BDE) shows that, unlike most other optimization paradigms, it is stable in the sense that neutral bit values are sampled with probability close to 1/2 for a long time, which enables BDE to optimize the most important bits very fast.

References

SHOWING 1-10 OF 54 REFERENCES
A new analysis method for evolutionary optimization of dynamic and noisy objective functions
TLDR
The results suggest that the typical way to find the optimum in such adverse settings is not via a steady approach of the optimum, but rather via an exceptionally fast approach after waiting for a rare phase of low dynamic changes or noise.
A Theory and Algorithms for Combinatorial Reoptimization
TLDR
A general framework for combinatorial reoptimization is developed, encompassing classical objective functions as well as the goal of minimizing the transition cost from one solution to the other, and distinguishing here for the first time between classes of reoptimization problems by their hardness status with respect to the objective of minimizing transition costs.
Reoptimization times of evolutionary algorithms on linear functions under dynamic uniform constraints
TLDR
This paper studies the classical (1+1) EA and population-based algorithms and shows that they recompute an optimal solution very efficiently and that a variant of the (1+(λ, λ)) GA can recompute the optimal solution more efficiently in some cases.
Maintaining 2-Approximations for the Dynamic Vertex Cover Problem Using Evolutionary Algorithms
TLDR
This paper examines a dynamic version of the classical vertex cover problem, analyses evolutionary algorithms with respect to their ability to maintain a 2-approximation, and points out that the third approach considered is very effective in maintaining 2-approximations for the dynamic vertex cover problem.
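As background for what "maintaining a 2-approximation" means here, the sketch below shows the classical matching-based construction: the endpoints of any maximal matching form a vertex cover of size at most twice the optimum. This is only the static textbook argument that such evolutionary approaches relate to, not the paper's mutation operators or its handling of dynamic edge updates.

```python
def matching_based_cover(edges):
    """Classical 2-approximation for vertex cover: greedily build a maximal
    matching and return the set of matched endpoints as the cover. Every edge
    is covered (the matching is maximal), and any optimal cover must contain
    at least one endpoint of each matched edge, giving the factor-2 guarantee."""
    matched = set()
    for u, v in edges:
        if u not in matched and v not in matched:
            matched.update((u, v))
    return matched

print(matching_based_cover([(1, 2), (2, 3), (3, 4), (4, 5)]))
```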
Runtime analysis of randomized search heuristics for the dynamic weighted vertex cover problem
TLDR
A dynamic model of the classic Weighted Vertex Cover problem is presented, and the performance of the two well-studied algorithms Randomized Local Search and (1+1) EA adapted to it is analyzed, contributing to the theoretical understanding of evolutionary computing for problems with dynamic changes.
Better Runtime Guarantees via Stochastic Domination
TLDR
This work argues that stochastic domination is a notion that should be used more frequently in this area of runtime analysis, and proves a fitness level theorem which shows that the runtime is dominated by a sum of independent geometric random variables.
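In formulas, the domination-based fitness level bound alluded to above can be sketched as follows, with $s_i$ denoting a lower bound on the probability of leaving fitness level $A_i$ towards a better level in one iteration (a rough statement, not the paper's precise theorem):

```latex
% Sketch of the domination-based fitness level bound: if each iteration started
% in level A_i reaches a strictly higher level with probability at least s_i, then
\[
  T \;\preceq\; \sum_{i=1}^{m-1} \mathrm{Geom}(s_i)
  \qquad\text{and hence}\qquad
  \mathbb{E}[T] \;\le\; \sum_{i=1}^{m-1} \frac{1}{s_i},
\]
% where the Geom(s_i) are independent geometric random variables and \preceq
% denotes stochastic domination; unlike the plain expectation bound, the
% domination statement also yields tail bounds for the runtime T.
```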
On the robustness of evolutionary algorithms to noise: refined results and an example where noise helps
TLDR
The (1+1) EA on LeadingOnes is much more sensitive to noise than previously thought and offspring populations of size λ ≥ 3.42 log n can effectively deal with much higher noise than known before.
Design and analysis of migration in parallel evolutionary algorithms
TLDR
A first rigorous runtime analysis of island models is performed, and a function is constructed for which both phases of independent evolution and communication among the islands are essential, leading to new insights into the usefulness of migration, how information is propagated in island models, and how to set parameters such as the migration interval.
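To illustrate the kind of algorithm analyzed there, here is a minimal island-model sketch: several (1+1) EA islands evolve independently, and every migration_interval generations the current best individual is offered to every island (a complete topology with elitist acceptance). The topology, migration policy, and parameter values are simplifying assumptions for illustration; the cited paper studies far more carefully how the migration interval and communication structure influence runtime.

```python
import random

def island_model(fitness, n, islands=4, migration_interval=50, generations=5_000):
    """Parallel (1+1) EA islands with periodic migration of the best individual.
    Complete migration topology; an island accepts the migrant only if it is
    strictly better than its own current individual."""
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(islands)]
    fit = [fitness(x) for x in pop]
    for gen in range(1, generations + 1):
        # Independent evolution: one (1+1) EA step per island.
        for k in range(islands):
            y = [b ^ (random.random() < 1.0 / n) for b in pop[k]]
            fy = fitness(y)
            if fy >= fit[k]:
                pop[k], fit[k] = y, fy
        # Migration phase every migration_interval generations.
        if gen % migration_interval == 0:
            best = max(range(islands), key=lambda k: fit[k])
            for k in range(islands):
                if fit[best] > fit[k]:
                    pop[k], fit[k] = list(pop[best]), fit[best]
    return max(fit)

one_max = lambda z: sum(z)
print(island_model(one_max, 60))
```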
On the Performance of Baseline Evolutionary Algorithms on the Dynamic Knapsack Problem
TLDR
The results show that the multi-objective approaches using a population that caters for dynamic changes have a clear advantage on many benchmark scenarios when the frequency of changes is not too high.
Analyzing Evolutionary Algorithms (T. Jansen, Natural Computing Series, 2013)
TLDR
The author provides an introduction to the methods used to analyze evolutionary algorithms and other randomized search heuristics from a complexity-theoretical perspective, and derives general limitations for black-box optimization, yielding lower bounds on the performance of evolutionary algorithms.