Corpus ID: 248666069

Consensus-based optimization methods converge globally

@inproceedings{Fornasier2021ConsensusbasedOM,
  title={Consensus-based optimization methods converge globally},
  author={Massimo Fornasier and Timo Klock and Konstantin Riedl},
  year={2021}
}
In this paper we study consensus-based optimization (CBO), which is a multi-agent metaheuristic derivative-free optimization method that can globally minimize nonconvex nonsmooth functions and is amenable to theoretical analysis. Based on an experimentally supported intuition that, on average, CBO performs a gradient descent of the squared Euclidean distance to the global minimizer, we devise a novel technique for proving the convergence to the global minimizer in mean-field law for a rich class… 
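To make the dynamics concrete, the following is a minimal sketch of a standard isotropic CBO iteration (an Euler-Maruyama discretization of the usual CBO SDE), not the authors' exact scheme; all parameter values (`alpha`, `lam`, `sigma`, `dt`) are illustrative choices, and the consensus point is the Gibbs-weighted particle average:

```python
import numpy as np

def cbo_step(X, f, alpha=30.0, lam=1.0, sigma=0.5, dt=0.01, rng=None):
    """One Euler-Maruyama step of isotropic CBO for an (N, d) particle array."""
    rng = np.random.default_rng() if rng is None else rng
    fx = f(X)                                    # objective value of each particle
    w = np.exp(-alpha * (fx - fx.min()))         # numerically stabilized Gibbs weights
    v = (w[:, None] * X).sum(axis=0) / w.sum()   # weighted average: the consensus point
    diff = X - v
    # diffusion scaled by the distance to consensus, so the noise
    # vanishes as the particles collapse onto the consensus point
    noise = np.linalg.norm(diff, axis=1, keepdims=True) * rng.standard_normal(X.shape)
    return X - lam * diff * dt + sigma * np.sqrt(dt) * noise

# usage: minimize the shifted sphere f(x) = |x - (1, 1)|^2
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 2))
sphere = lambda X: ((X - 1.0) ** 2).sum(axis=1)
for _ in range(2000):
    X = cbo_step(X, sphere, rng=rng)
```

On this toy problem the particle ensemble contracts toward the global minimizer, consistent with the gradient-descent intuition described above: on average the drift term pulls each particle down the squared distance to the consensus point.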


On the Global Convergence of Particle Swarm Optimization Methods

A rigorous convergence analysis for the renowned particle swarm optimization method is provided, using tools from stochastic calculus and the analysis of partial differential equations to establish convergence to a global minimizer of a possibly nonconvex and nonsmooth objective function.

Zero-Inertia Limit: from Particle Swarm Optimization to Consensus Based Optimization

This paper is devoted to providing a rigorous derivation of CBO from PSO through the limit of zero inertia, and a quantified convergence rate is obtained as well.

Ensemble-based gradient inference for particle methods in optimization and sampling

An approach based on function evaluations and Bayesian inference is proposed to extract higher-order differential information of objective functions from a given ensemble of particles, improving established ensemble-based numerical methods for optimization and sampling such as consensus-based optimization and Langevin-based samplers.

Leveraging Memory Effects and Gradient Information in Consensus-Based Optimization: On Global Convergence in Mean-Field Law

This paper rigorously proves that the underlying dynamics of consensus-based optimization converges to a global minimizer of the objective function in mean-field law for a vast class of functions under minimal assumptions on the initialization of the method.

On the mean-field limit for the consensus-based optimization

This paper is concerned with the large particle limit for the consensus-based optimization (CBO), which was postulated in the pioneering works by Carrillo, Pinnau, Totzeck and many others. In order…

References

SHOWING 1-10 OF 55 REFERENCES

Convergence of Anisotropic Consensus-Based Optimization in Mean-Field Law

By adapting a recently established proof technique, it is shown that anisotropic CBO converges globally with a dimension-independent rate for a rich class of objective functions under minimal assumptions on the initialization of the method.

Convergence of a first-order consensus-based global optimization algorithm

This paper provides a convergence analysis for the first-order CBO method in [J. A. Carrillo, S. Jin, L. Li and Y. Zhu, A consensus-based global optimization method for high dimensional machine learning problems] without resorting to the corresponding mean-field model.

Consensus-based Optimization on the Sphere II: Convergence to Global Minimizers and Machine Learning

The proof of convergence of the numerical scheme to global minimizers, under well-preparation conditions on the initial datum, is presented; it combines previous mean-field limit results with a novel asymptotic analysis and classical convergence results for numerical methods for SDEs.

On the Global Convergence of Particle Swarm Optimization Methods

A rigorous convergence analysis for the renowned particle swarm optimization method is provided, using tools from stochastic calculus and the analysis of partial differential equations to establish convergence to a global minimizer of a possibly nonconvex and nonsmooth objective function.

Consensus-based global optimization with personal best.

A variant of the consensus-based global optimization (CBO) method is proposed that uses personal best information in order to compute the global minimum of a nonconvex, locally Lipschitz continuous function.

A consensus-based model for global optimization and its mean-field limit

We introduce a novel first-order stochastic swarm intelligence (SI) model in the spirit of consensus formation models, namely a consensus-based optimization (CBO) algorithm, which may be used for the…

Convergence and error estimates for time-discrete consensus-based optimization algorithms

A simple and elementary convergence and error analysis is provided for a general time-discrete consensus-based optimization algorithm, which includes modifications of the three discrete algorithms in Carrillo et al. (2020).

A Stochastic Consensus Method for Nonconvex Optimization on the Stiefel Manifold

A consensus-based algorithm for nonconvex optimization on the Stiefel manifold is proposed that is gradient-free, thereby applicable to a wide range of problems.

A consensus-based global optimization method for high dimensional machine learning problems

This work improves the recently introduced consensus-based optimization method of [R. Pinnau, C. Totzeck, O. Tse, S. Martin] by replacing the isotropic geometric Brownian motion with component-wise noise, thus removing the dimensionality dependence of the drift rate and making the method more competitive for high-dimensional optimization problems.
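The component-wise (anisotropic) diffusion described in this entry can be sketched as follows; this is an illustrative implementation under assumed parameter values, not the paper's exact algorithm. The only change from isotropic CBO is that each coordinate's noise scales with that coordinate's own distance to the consensus point, rather than with the full Euclidean distance:

```python
import numpy as np

def anisotropic_cbo_step(X, f, alpha=30.0, lam=1.0, sigma=0.7, dt=0.01, rng=None):
    """One step of CBO with component-wise (anisotropic) diffusion."""
    rng = np.random.default_rng() if rng is None else rng
    fx = f(X)
    w = np.exp(-alpha * (fx - fx.min()))         # stabilized Gibbs weights
    v = (w[:, None] * X).sum(axis=0) / w.sum()   # consensus point
    diff = X - v
    # component-wise noise: coordinate k is perturbed in proportion to
    # |X_k - v_k|, which is what makes the convergence rate dimension-independent
    noise = np.abs(diff) * rng.standard_normal(X.shape)
    return X - lam * diff * dt + sigma * np.sqrt(dt) * noise

# usage: a 20-dimensional sphere function, where isotropic noise would be weaker
rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(100, 20))
sphere = lambda X: (X ** 2).sum(axis=1)
for _ in range(4000):
    X = anisotropic_cbo_step(X, sphere, rng=rng)
```

In the isotropic variant the noise amplitude grows with the full distance `|X - v|`, which tends to grow with the dimension d; the per-coordinate scaling above avoids that, matching the dimensionality claim in the snippet.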

Consensus-based optimization on hypersurfaces: Well-posedness and mean-field limit

The well-posedness of the model is studied and its mean-field approximation in the large-particle limit is derived rigorously; the analysis shows that as soon as consensus is reached, the stochastic component vanishes.
...