We consider the problem of optimizing functions corrupted with additive noise. It is known that Evolutionary Algorithms can reach a Simple Regret of $O(1/\sqrt{n})$ within logarithmic factors, where $n$ is the number of function evaluations. Here, the Simple Regret at evaluation $n$ is the difference between the value of the function at the…
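For reference, a standard formalization of this criterion (the recommendation point $\tilde{x}_n$ and the optimizer $x^*$ are notation assumed here for illustration, not taken from the abstract):

```latex
% Simple regret after n evaluations: the gap between the objective value
% at the algorithm's recommendation \tilde{x}_n and the optimal value.
% (Notation \tilde{x}_n, x^* assumed for illustration.)
\[
  SR_n \;=\; \mathbb{E}\bigl[f(\tilde{x}_n)\bigr] \;-\; f(x^*),
  \qquad x^* \in \arg\min_x f(x).
\]
```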
Noisy optimization is the optimization of objective functions corrupted by noise. A portfolio of solvers is a set of solvers equipped with an algorithm selection tool for distributing the computational power among them. Portfolios are widely and successfully used in combinatorial optimization. In this work, we study portfolios of noisy optimization solvers.
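A rough illustration of the idea follows; this is not the authors' algorithm selection tool, and the solver interface, warm-up rule, and resampling-based selection below are all assumptions of the sketch:

```python
import random

def noisy_sphere(x):
    """Toy objective: sphere function plus additive Gaussian noise."""
    return sum(xi ** 2 for xi in x) + random.gauss(0.0, 1.0)

class RandomSearchSolver:
    """Stand-in solver: random search with a shrinking step size."""
    def __init__(self, dim, step):
        self.best = [random.uniform(-1, 1) for _ in range(dim)]
        self.step = step

    def ask(self):
        return [b + random.gauss(0.0, self.step) for b in self.best]

    def tell(self, x, value, current_score):
        # Accept the candidate if its (noisy) value beats the solver's
        # current averaged score; shrink the step on success.
        if value < current_score:
            self.best = x
            self.step *= 0.95

def run_portfolio(f, solvers, budget, resamples=10):
    """Distribute an evaluation budget among several solvers: round-robin
    warm-up, then greedily favor the solver whose recommendation looks
    best when re-estimated by averaging `resamples` noisy evaluations."""
    scores = [float("inf")] * len(solvers)
    for t in range(budget):
        if t < 10 * len(solvers):
            i = t % len(solvers)  # warm-up: every solver gets some budget
        else:
            i = min(range(len(solvers)), key=lambda j: scores[j])
        x = solvers[i].ask()
        solvers[i].tell(x, f(x), scores[i])
        # Re-estimate this solver's recommendation by averaging, so that
        # selection is not fooled by a single lucky noisy value.
        scores[i] = sum(f(solvers[i].best) for _ in range(resamples)) / resamples
    best = min(range(len(solvers)), key=lambda j: scores[j])
    return solvers[best].best

print(run_portfolio(noisy_sphere,
                    [RandomSearchSolver(3, s) for s in (0.1, 0.5, 1.0)],
                    budget=300))
```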
Various papers have analyzed the noisy optimization of convex functions. This analysis has been carried out according to several criteria used to evaluate the performance of algorithms: uniform rate, simple regret, and cumulative regret. We propose an iterative optimization framework, a particular instance of which, using Hessian approximations, provably (i)…
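A minimal sketch of one such iterative scheme, assuming a Newton-style update with finite-difference estimates of the gradient and Hessian, each averaged over resampled noisy evaluations; the step rule, sampling sizes, and damping are illustrative, not the paper's provably convergent instance:

```python
import numpy as np

def noisy_newton(f, x0, iters=50, h=0.1, resamples=100):
    """Newton-style iteration on a noisy objective. Gradient and Hessian
    are estimated by central finite differences, with each point
    re-evaluated `resamples` times and averaged to reduce additive noise.
    (Illustrative sketch only.)"""
    x = np.asarray(x0, dtype=float)
    d = x.size

    def mean_eval(p):
        return np.mean([f(p) for _ in range(resamples)])

    for _ in range(iters):
        g = np.zeros(d)
        H = np.zeros((d, d))
        fx = mean_eval(x)
        for i in range(d):
            ei = np.zeros(d); ei[i] = h
            fp, fm = mean_eval(x + ei), mean_eval(x - ei)
            g[i] = (fp - fm) / (2 * h)
            H[i, i] = (fp - 2 * fx + fm) / h ** 2
            for j in range(i + 1, d):
                ej = np.zeros(d); ej[j] = h
                fpp = mean_eval(x + ei + ej)
                fpm = mean_eval(x + ei - ej)
                fmp = mean_eval(x - ei + ej)
                fmm = mean_eval(x - ei - ej)
                H[i, j] = H[j, i] = (fpp - fpm - fmp + fmm) / (4 * h ** 2)
        # Damped, regularized Newton step; fall back to the gradient
        # direction if the estimated Hessian is singular.
        try:
            step = np.linalg.solve(H + 1e-6 * np.eye(d), g)
        except np.linalg.LinAlgError:
            step = g
        x = x - 0.5 * step
    return x
```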
It is known that evolution strategies in continuous domains might not converge in the presence of noise [3,14]. It is also known that, under mild assumptions, and using an increasing number of resamplings, one can mitigate the effect of additive noise [4] and recover convergence. We show new sufficient conditions for the convergence of an evolutionary…
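A toy illustration of the resampling idea, assuming a (1+1) evolution strategy where each candidate is re-evaluated $r_k$ times at iteration $k$, with $r_k$ increasing; the schedule $r_k = k$ and the step-size rule are assumptions of the sketch:

```python
import random

def one_plus_one_es_resampled(f, x0, sigma=1.0, iters=200):
    """(1+1) evolution strategy on a noisy objective with increasing
    resamplings: at iteration k, parent and offspring are each evaluated
    k times and the averages are compared, so the noise in the comparison
    shrinks as the run progresses. (Sketch; schedule r_k = k assumed.)"""
    x = list(x0)
    for k in range(1, iters + 1):
        r = k  # increasing resampling schedule
        y = [xi + sigma * random.gauss(0.0, 1.0) for xi in x]
        fx = sum(f(x) for _ in range(r)) / r
        fy = sum(f(y) for _ in range(r)) / r
        if fy < fx:           # offspring wins the averaged comparison
            x, sigma = y, sigma * 1.5
        else:
            sigma *= 0.9      # shrink the step size on failure

    return x

def noisy_sphere(x):
    return sum(xi ** 2 for xi in x) + random.gauss(0.0, 1.0)

print(one_plus_one_es_resampled(noisy_sphere, [2.0, -1.0]))
```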
The performance measure of an algorithm is a crucial part of its analysis. The performance can be determined by studying the convergence rate of the algorithm in question. It is necessary to study some (hopefully convergent) sequence that measures how close the approximate optimum is to the true optimum. The concept of Regret is…
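One standard way to turn such a sequence into a single convergence rate, assuming the simple-regret notation $SR_n$ from above: take the slope of the log-regret against the log of the number of evaluations.

```latex
% Asymptotic convergence rate as the log-log slope of simple regret;
% e.g. SR_n = O(1/\sqrt{n}) corresponds to a slope of -1/2.
\[
  \text{rate} \;=\; \lim_{n \to \infty} \frac{\log SR_n}{\log n}.
\]
```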
The optimization of capacities in large-scale power systems is a stochastic problem, because the need for storage and connections (i.e., exchange capacities) varies greatly from one week to another (e.g., power generation is subject to the vagaries of wind) and from one winter to another (e.g., water inflows due to snow melting). It is usually tackled through…
We study mathematically and experimentally the convergence rate of differential evolution and particle swarm optimization for simple unimodal functions. Due to parallelization concerns, the focus is on lower bounds on the runtime, i.e., upper bounds on the speed-up, as a function of the population size. Two cases are particularly relevant: a population size…
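A small experiment in this spirit, assuming a textbook DE/rand/1/bin variant on the sphere function; all parameter values are assumptions. It reports the evaluation count to reach a target precision for several population sizes; dividing by the population size gives parallel runtime in generations, from which an empirical speed-up relative to a baseline population can be read off.

```python
import random

def de_evals_to_target(dim=5, pop=20, target=1e-6, F=0.5, CR=0.9,
                       max_evals=200000):
    """Runtime (evaluation count) of DE/rand/1/bin on the sphere function
    until the best value drops below `target`. (Textbook DE; parameter
    values are assumptions for the experiment.)"""
    sphere = lambda x: sum(xi ** 2 for xi in x)
    pts = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    vals = [sphere(p) for p in pts]
    evals = pop
    while min(vals) > target and evals < max_evals:
        for i in range(pop):
            # Mutation: three distinct individuals, none equal to i.
            a, b, c = random.sample([j for j in range(pop) if j != i], 3)
            jrand = random.randrange(dim)  # force at least one mutated gene
            trial = [pts[a][d] + F * (pts[b][d] - pts[c][d])
                     if (random.random() < CR or d == jrand) else pts[i][d]
                     for d in range(dim)]
            v = sphere(trial)
            evals += 1
            if v <= vals[i]:  # greedy selection
                pts[i], vals[i] = trial, v
    return evals

for lam in (8, 16, 32, 64):
    runs = [de_evals_to_target(pop=lam) for _ in range(5)]
    print(lam, sum(runs) / len(runs))  # evaluations; divide by lam for generations
```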
Much work has been devoted to the computational complexity of games. However, such complexity measures are not necessarily relevant for estimating how hard games are in human terms. Therefore, human-centered measures have been proposed, e.g., the depth. This paper discusses the depth of various games and extends it to a continuous measure. We provide new depth results and present tool…
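One common formalization of depth, included here as background; the winning-probability threshold $p_0$ is a parameter of the definition, not a value taken from this abstract:

```latex
% Depth of a game G: the length of the longest chain of players in which
% each player beats its predecessor with probability at least p_0.
\[
  \operatorname{depth}(G) \;=\; \max \Bigl\{\, k \;:\; \exists\,
  \pi_1, \dots, \pi_k \text{ such that }
  \Pr[\pi_{i+1} \text{ beats } \pi_i] \ge p_0 \;\; \forall\, i < k \,\Bigr\}.
\]
```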
In an optimization framework, some criteria might be more relevant than others; the internal computational cost of the optimization algorithm might be negligible or not; the quality of intermediate search points might be important or not. For this reason, measuring the performance of an algorithm is a delicate task. In addition, the usual criteria are often…