Corpus ID: 235377013

EXPObench: Benchmarking Surrogate-based Optimisation Algorithms on Expensive Black-box Functions

Authors: Laurens Bliek, Arthur Guijt, Rickard Karlsson, Sicco Verwer, Mathijs de Weerdt
Surrogate algorithms such as Bayesian optimisation are specifically designed for black-box optimisation problems with expensive objectives, such as hyperparameter tuning or simulation-based optimisation. In the literature, these algorithms are usually evaluated on synthetic benchmarks that are well established but have no expensive objective, and on only one or two real-life applications that vary wildly between papers. There is a clear lack of standardisation when it comes to benchmarking…
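The core idea described above can be illustrated with a minimal, hypothetical sketch (not the EXPObench or Bayesian-optimisation code itself): fit a cheap surrogate, here a simple quadratic via least squares, to the points evaluated so far, and spend the expensive evaluations only at the surrogate's minimiser. The objective, bounds, and budget below are made-up illustration values.

```python
# Minimal sketch of a surrogate-based optimisation loop (hypothetical
# example): a cheap quadratic surrogate stands in for the expensive
# black-box objective, and the expensive function is only queried at
# the surrogate's minimiser. Uses only the standard library.
import random

def expensive_objective(x):
    # Stand-in for a costly black-box function (e.g. a simulation).
    return (x - 0.3) ** 2 + 0.1

def fit_quadratic(xs, ys):
    # Least-squares fit of y ~ a*x^2 + b*x + c via the normal equations.
    n = len(xs)
    s = [sum(x ** k for x in xs) for k in range(5)]            # power sums
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    # Augmented 3x3 system for the coefficients [a, b, c].
    M = [[s[4], s[3], s[2], t[2]],
         [s[3], s[2], s[1], t[1]],
         [s[2], s[1], n,    t[0]]]
    # Gaussian elimination with partial pivoting.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            M[r] = [a - f * b for a, b in zip(M[r], M[i])]
    coef = [0.0, 0.0, 0.0]
    for i in reversed(range(3)):
        coef[i] = (M[i][3] - sum(M[i][j] * coef[j]
                                 for j in range(i + 1, 3))) / M[i][i]
    return coef  # a, b, c

def surrogate_optimise(budget=10, lo=-1.0, hi=1.0, seed=0):
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(3)]   # small initial design
    ys = [expensive_objective(x) for x in xs]
    for _ in range(budget - len(xs)):
        a, b, _c = fit_quadratic(xs, ys)
        # Minimise the surrogate in closed form; fall back to a random
        # point if the surrogate is flat or concave.
        x_next = -b / (2 * a) if a > 1e-9 else rng.uniform(lo, hi)
        x_next = min(max(x_next, lo), hi)
        xs.append(x_next)
        ys.append(expensive_objective(x_next))
    y_best, x_best = min(zip(ys, xs))
    return x_best, y_best

x_best, y_best = surrogate_optimise()
```

Real surrogate methods such as Bayesian optimisation replace the quadratic with a probabilistic model (e.g. a Gaussian process) and pick the next point via an acquisition function that balances exploration and exploitation; the loop structure, however, is the same.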


HPOBench: A Collection of Reproducible Multi-Fidelity Benchmark Problems for HPO
HPOBench is proposed, which includes 7 existing and 5 new benchmark families, with in total more than 100 multi-fidelity benchmark problems, and provides surrogate and tabular benchmarks for computationally affordable yet statistically sound evaluations.


Identifying Properties of Real-World Optimisation Problems through a Questionnaire
This work investigates the properties of real-world problems through a questionnaire, to enable the design of future benchmark problems that more closely resemble those found in the real world.

RBFOpt: an open-source library for black-box optimization with costly function evaluations
The two main methodological contributions of this paper are an approach to exploit a noisy but less expensive oracle to accelerate convergence to the optimum of the exact oracle, and the introduction of an automatic model selection phase during the optimization process.

The SOS Platform: Designing, Tuning and Statistically Benchmarking Optimisation Algorithms
It is argued that the vast majority of these algorithms are simply a reformulation of the same methods, and that metaheuristics for optimisation should be treated as stochastic processes, with less emphasis on the inspiring metaphor behind them.

Making a case for (Hyper-)parameter tuning as benchmark problems
A preliminary landscape analysis of two hyper-parameter selection problems is performed, indicating that some parameter tuning problems might not be well represented by the BBOB functions.

Comparison of parallel surrogate-assisted optimization approaches
The experiments, carried out on the electrostatic precipitator problem, indicate that a hybrid approach works best: one that proposes candidate solutions based on different surrogate-model-based infill criteria and evolutionary operators.

DACBench: A Benchmark Library for Dynamic Algorithm Configuration
DACBench is proposed, a benchmark library that seeks to collect and standardize existing DAC benchmarks from different AI domains, as well as provide a template for new ones, to show the potential, broad applicability and challenges of DAC.

Empirical review of standard benchmark functions using evolutionary global optimization
We have employed a recent implementation of genetic algorithms to study a range of standard benchmark functions for global optimization. It turns out that some of them are not very useful as…

A Surrogate Modeling and Adaptive Sampling Toolbox for Computer Based Design
This paper presents a mature, flexible, and adaptive machine learning toolkit for regression modeling and active learning, to tackle issues of computational cost and model accuracy.

IOHprofiler: A Benchmarking and Profiling Tool for Iterative Optimization Heuristics
IOHprofiler is a new tool for analyzing and comparing iterative optimization heuristics that outputs a statistical evaluation of the algorithms' performance by means of the distribution of fixed-target running times and fixed-budget function values.

Model-based methods for continuous and discrete global optimization