COCO: a platform for comparing continuous optimizers in a black-box setting

@article{hansen2021coco,
  title={COCO: a platform for comparing continuous optimizers in a black-box setting},
  author={N. Hansen and A. Auger and Olaf Mersmann and T. Tusar and D. Brockhoff},
  journal={Optimization Methods and Software},
  year={2021},
  pages={114--144}
}
  • Published 2021
  • Computer Science, Mathematics
  • We introduce COCO, an open-source platform for Comparing Continuous Optimizers in a black-box setting. COCO aims to automate, to the greatest possible extent, the tedious and repetitive task of benchmarking numerical optimization algorithms. The platform and the underlying methodology make it possible to benchmark, in the same framework, deterministic and stochastic solvers for both single- and multiobjective optimization. We present the rationale behind the (decade-long) development of the platform as a…
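To illustrate the kind of black-box experiment the platform automates, here is a minimal, self-contained sketch (not COCO's own API) of a solver that sees only function values, run at increasing evaluation budgets; the recorded best-so-far values are the raw data from which benchmarking runtime profiles are built:

```python
import random

def sphere(x):
    """A simple black-box objective: f(x) = sum(x_i^2), optimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def random_search(f, dim, budget, lower=-5.0, upper=5.0, seed=1):
    """Evaluate `budget` uniform random points; return the best value found.

    This mimics the black-box setting: the solver only observes
    function values, never the analytic form of `f`.
    """
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(budget):
        x = [rng.uniform(lower, upper) for _ in range(dim)]
        best = min(best, f(x))
    return best

# A toy "benchmark": record the best value reached at growing budgets.
for budget in (10, 100, 1000):
    print(budget, random_search(sphere, dim=2, budget=budget))
```

With a fixed seed, larger budgets extend the same sampling sequence, so the best-so-far value is non-increasing in the budget, which is the quantity performance profiles summarize.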
    151 Citations

    Mixed-integer benchmark problems for single- and bi-objective optimization (10 citations)
    Benchmarking for Metaheuristic Black-Box Optimization: Perspectives and Open Challenges. R. Sala, Ralf Müller. 2020 IEEE Congress on Evolutionary Computation (CEC), 2020
    Benchmarking discrete optimization heuristics with IOHprofiler (20 citations)
    Benchmarking the Pure Random Search on the Bi-objective BBOB-2016 Testbed (2 citations)
    The SOS Platform: Designing, Tuning and Statistically Benchmarking Optimisation Algorithms (5 citations)
    Benchmarking Numerical Multiobjective Optimizers Revisited (31 citations)
    jMetal: A Java framework for multi-objective optimization (881 citations)
    Benchmarking RM-MEDA on the Bi-objective BBOB-2016 Test Suite (2 citations)
    PAVER 2.0: an open source environment for automated performance analysis of benchmarking data (14 citations)