Corpus ID: 216552997

MATE: A Model-based Algorithm Tuning Engine

@article{Yafrani2020MATEAM,
  title={MATE: A Model-based Algorithm Tuning Engine},
  author={Mohamed El Yafrani and Marcella Scoczynski Ribeiro Martins and Inkyung Sung and Markus Wagner and Carola Doerr and Peter Nielsen},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.12750}
}
In this paper, we introduce a Model-based Algorithm Tuning Engine, namely MATE, where the parameters of an algorithm are represented as expressions of the features of a target optimisation problem. In contrast to most static (feature-independent) algorithm tuning engines such as irace and SPOT, our approach aims to derive the best parameter configuration of a given algorithm for a specific problem, exploiting the relationships between the algorithm parameters and the features of the problem… 
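
To make the contrast with static tuning concrete, the following minimal Python sketch (illustrative only; the function names and the specific expression are not from the paper) shows the difference between a feature-independent configuration and one where a parameter is an expression of a problem feature such as the instance size n:

    # Illustrative sketch: MATE searches for symbolic expressions over problem
    # features; here two candidate "configurations" are hard-coded to show the idea.

    def static_config(_features):
        # Feature-independent tuning (irace/SPOT style): one value for every instance.
        return {"mutation_rate": 0.01}

    def feature_based_config(features):
        # Feature-dependent tuning (MATE style): the parameter is an expression
        # of instance features, e.g. mutation_rate = 1 / n.
        n = features["problem_size"]
        return {"mutation_rate": 1.0 / n}

    if __name__ == "__main__":
        for n in (50, 200, 1000):
            feats = {"problem_size": n}
            print(n, static_config(feats), feature_based_config(feats))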

Citations

Improved regression models for algorithm configuration

TLDR
A simple yet effective linear model is proposed, which approximates linear relations between instance size and optimal parameter values, together with piecewise and log-log linear models for capturing nonlinear relations.
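
As a hedged illustration of the log-log linear case (synthetic data, not taken from the cited paper), such a model fits log p* = a log n + b on pairs of instance size and tuned parameter value, then back-transforms to predict p*(n):

    # Fit a log-log linear model between instance size and optimal parameter value.
    # The data points below are invented purely for demonstration.
    import numpy as np

    sizes = np.array([50, 100, 200, 400, 800])                    # instance sizes n
    best_params = np.array([0.032, 0.016, 0.008, 0.004, 0.002])   # tuned parameter values

    a, b = np.polyfit(np.log(sizes), np.log(best_params), 1)      # slope a, intercept b

    def predict(n):
        # Back-transform: p*(n) = exp(b) * n**a
        return np.exp(b) * n ** a

    print(f"fitted exponent a = {a:.2f}")
    print(f"predicted optimal parameter for n = 1600: {predict(1600):.4f}")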

A Survey of Methods for Automated Algorithm Configuration

TLDR
A review of existing AC literature through the lens of taxonomies, which outlines relevant design choices of configuration approaches, contrasts methods and problem variants against each other, and provides a look at future research directions in the field of AC.

References

SHOWING 1-10 OF 35 REFERENCES

Sequential Model-Based Optimization for General Algorithm Configuration

TLDR
This paper extends the explicit regression models paradigm for the first time to general algorithm configuration problems, allowing many categorical parameters and optimization for sets of instances, and yields state-of-the-art performance.

Algorithm runtime prediction: Methods & evaluation

Feature Based Algorithm Configuration: A Case Study with Differential Evolution

TLDR
This paper empirically investigates the relationship between continuous problem features and the best parameter configuration of a given stochastic algorithm over a benchmark of test functions, namely the original version of Differential Evolution on the BBOB test bed.

Performance Prediction and Automated Tuning of Randomized and Parametric Algorithms

TLDR
It is demonstrated for the first time how information about an algorithm's parameter settings can be incorporated into a model, and how such models can be used to automatically adjust the algorithm's parameters on a per-instance basis in order to optimize its performance.
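
A hedged sketch of that idea (the data, the single feature, and the regressor below are all invented; the cited work builds empirical performance models of its own): learn a mapping from (instance feature, parameter setting) to performance, then choose the best predicted setting separately for each instance.

    # Per-instance configuration via a learned performance model (illustrative only).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(500, 2))      # columns: [instance feature, parameter]
    y = (X[:, 1] - 0.5 * X[:, 0]) ** 2            # synthetic runtime: best parameter = 0.5 * feature

    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

    def tune_for_instance(feature, candidates=np.linspace(0.0, 1.0, 101)):
        # Predict the performance of every candidate parameter on this instance
        # and return the one with the lowest predicted runtime.
        grid = np.column_stack([np.full_like(candidates, feature), candidates])
        return candidates[np.argmin(model.predict(grid))]

    print(tune_for_instance(0.8))                 # should be close to 0.4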

A Gender-Based Genetic Algorithm for the Automatic Configuration of Algorithms

TLDR
A robust, inherently parallel genetic algorithm is proposed for the problem of configuring solvers automatically, and a gender separation is introduced to cope with the high cost of evaluating the fitness of individuals.

Practical Bayesian Optimization of Machine Learning Algorithms

TLDR
This work describes new algorithms that take into account the variable cost of learning-algorithm experiments and that can leverage the presence of multiple cores for parallel experimentation, and shows that these proposed algorithms improve on previous automatic procedures and can reach or surpass human expert-level optimization for many algorithms.

Optimal Fixed and Adaptive Mutation Rates for the LeadingOnes Problem

We reconsider a classical problem, namely how the (1+1) evolutionary algorithm optimizes the LEADINGONES function. We prove that if a mutation probability of p is used and the problem size is n, then…
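
For reference, the commonly cited closed-form result from this line of work is that the expected runtime of the (1+1) EA with static mutation probability p on LeadingOnes of size n is

    E[T] = \frac{1}{2p^2}\left((1-p)^{1-n} - (1-p)\right),

which is minimized at roughly p ≈ 1.59/n, giving an expected runtime of about 0.77 n^2.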

Per instance algorithm configuration of CMA-ES with limited budget

TLDR
This paper presents a case study in the continuous black-box optimization domain, using features proposed in the literature to outperform the default setting of CMA-ES with as few as 30 or 50 times the problem dimension additional function evaluations for feature computation.

Towards Landscape-Aware Automatic Algorithm Configuration: Preliminary Experiments on Neutral and Rugged Landscapes

TLDR
Fitness landscape analysis can open a whole set of new research opportunities for increasing the effectiveness of existing automatic algorithm configuration methods, and the results show that a landscape-aware approach is a viable alternative for handling the heterogeneity of (black-box) combinatorial optimization problems.