
The (1+λ) EA with mutation probability c/n, where c>0 is an arbitrary constant, is studied for the classical OneMax function. Its expected optimization time is analyzed exactly (up to lower order terms) as a function of c and λ. It turns out that 1/n is the only optimal mutation probability if λ=o(ln n…
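The algorithm analyzed above can be sketched in a few lines. This is a minimal illustration, not the paper's experimental setup; the parameter names (`n`, `lam`, `c`) and the generation cap are assumptions for the sketch.

```python
import random

def one_max(x):
    # OneMax fitness: the number of one-bits in the bit string.
    return sum(x)

def one_plus_lambda_ea(n, lam, c, max_gens=10_000, seed=0):
    """(1+lambda) EA on OneMax with mutation probability c/n.

    Each generation creates lam offspring by independent bit flips
    and keeps the best of parent and offspring (elitist selection).
    Returns the generation at which the optimum was found, or
    max_gens if it was not reached (illustrative cap, not from the paper).
    """
    rng = random.Random(seed)
    p = c / n  # standard-bit-mutation probability
    parent = [rng.randint(0, 1) for _ in range(n)]
    for gen in range(1, max_gens + 1):
        best = parent
        for _ in range(lam):
            # Flip each bit independently with probability c/n.
            child = [1 - b if rng.random() < p else b for b in parent]
            if one_max(child) >= one_max(best):
                best = child
        parent = best  # elitism: never accept a worse search point
        if one_max(parent) == n:
            return gen
    return max_gens
```

With c=1 this uses the mutation probability 1/n that the analysis identifies as optimal for small λ.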

We consider stochastic versions of OneMax and LeadingOnes and analyze the performance of evolutionary algorithms with and without populations on these problems. It is known that the (1+1) EA on OneMax performs well in the presence of very small noise, but poorly for higher noise levels. We extend these results to LeadingOnes and to many different noise…
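One common way to make OneMax stochastic is prior noise, where each bit is flipped independently with some probability before evaluation; the sketch below assumes this model (the paper studies several noise models, and this need not be the exact one analyzed).

```python
import random

def noisy_one_max(x, q, rng):
    """Evaluate OneMax under prior bit-flip noise.

    Each bit of x is flipped independently with probability q
    before the fitness (number of ones) is computed, so repeated
    evaluations of the same search point can return different values.
    """
    noisy = [1 - b if rng.random() < q else b for b in x]
    return sum(noisy)
```

At q=0 this reduces to plain OneMax; as q grows, the evaluated value becomes an increasingly unreliable signal for selection, which is what degrades the (1+1) EA at higher noise levels.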

There is empirical evidence that memetic algorithms (MAs) can outperform plain evolutionary algorithms (EAs). Recently the first runtime analyses have been presented proving the aforementioned conjecture rigorously by investigating Variable-Depth Search, VDS for short (Sudholt, 2008). Sudholt raised the question of whether there are problems where VDS performs…

We study the (1+λ) EA with mutation probability c/n, where c>0 is a constant, on the OneMax problem. Using an improved variable drift theorem, we show that upper and lower bounds on the expected runtime of the (1+λ) EA obtained from variable drift theorems differ by at most a small lower-order term if the exact drift is known. This reduces…
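For context, the standard form of the variable drift theorem (due to Johannsen; the paper uses an improved variant, so the exact statement there may differ) bounds the expected hitting time of a process $(X_t)$ on $\{0\} \cup [x_{\min}, x_{\max}]$ with drift at least $h(x)$:

```latex
% If E[X_t - X_{t+1} \mid X_t = x] \ge h(x) for a monotone
% increasing h > 0, and T is the first time X_t = 0, then
\mathbb{E}[T \mid X_0]
  \;\le\; \frac{x_{\min}}{h(x_{\min})}
  \;+\; \int_{x_{\min}}^{X_0} \frac{1}{h(z)}\,\mathrm{d}z .
```

The tightness result in the abstract says that when $h$ is the exact drift rather than a lower bound, this upper bound and the matching lower bound coincide up to a small lower-order term.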

