# Escaping the Local Minima via Simulated Annealing: Optimization of Approximately Convex Functions

```bibtex
@inproceedings{Belloni2015EscapingTL,
  title     = {Escaping the Local Minima via Simulated Annealing: Optimization of Approximately Convex Functions},
  author    = {Alexandre Belloni and Tengyuan Liang and Hariharan Narayanan and Alexander Rakhlin},
  booktitle = {COLT},
  year      = {2015}
}
```

We consider the problem of optimizing an approximately convex function over a bounded convex set in $\mathbb{R}^n$ using only function evaluations. The problem is reduced to sampling from an \emph{approximately} log-concave distribution using the Hit-and-Run method, which is shown to have the same $\mathcal{O}^*$ complexity as sampling from log-concave distributions. In addition to extending the analysis for log-concave distributions to approximately log-concave distributions, the implementation of…
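To make the approach concrete, here is a minimal sketch of Hit-and-Run sampling from $\propto e^{-f/T}$ combined with a cooling schedule. It is an illustration only, not the paper's algorithm: the convex body is simplified to a Euclidean ball, the 1-D step along each chord is sampled on a discrete grid rather than exactly, and the geometric cooling schedule is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def hit_and_run_step(x, f, temperature, radius=1.0, grid=200):
    """One Hit-and-Run step targeting exp(-f(y)/temperature) over the
    Euclidean ball of the given radius (a stand-in for a general convex body)."""
    d = rng.normal(size=x.shape)
    d /= np.linalg.norm(d)                      # uniform random direction
    # Chord of the ball through x along d: solve |x + t d|^2 = radius^2 for t.
    b = x @ d
    disc = b * b - (x @ x - radius ** 2)
    t_lo, t_hi = -b - np.sqrt(disc), -b + np.sqrt(disc)
    # Sample t from the 1-D restriction of the target, discretized on a grid.
    ts = np.linspace(t_lo, t_hi, grid)
    logw = np.array([-f(x + t * d) / temperature for t in ts])
    w = np.exp(logw - logw.max())               # stabilized weights
    t = rng.choice(ts, p=w / w.sum())
    return x + t * d

def anneal(f, dim, n_stages=30, steps_per_stage=50):
    """Simulated annealing with Hit-and-Run as the inner sampler;
    the geometric cooling schedule here is illustrative."""
    x, temperature = np.zeros(dim), 1.0
    for _ in range(n_stages):
        for _ in range(steps_per_stage):
            x = hit_and_run_step(x, f, temperature)
        temperature *= 0.8
    return x

# Approximately convex objective: a quadratic plus a small bounded perturbation.
f = lambda x: float(x @ x + 0.01 * np.sin(50 * x[0]))
x_min = anneal(f, dim=5)
```

As the temperature drops, the distribution $\propto e^{-f/T}$ concentrates near the minimizer, so the final iterate is an approximate minimizer of the perturbed objective.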

## 68 Citations

Convex Optimization with Unbounded Nonconvex Oracles using Simulated Annealing

- Computer Science, COLT
- 2018

This paper studies the more general case, in which the noise has magnitude $\alpha F(x) + \beta$ for some $\alpha, \beta > 0$, and presents a polynomial-time algorithm that finds an approximate minimizer of $F$ under this noise model.

Convex Optimization with Nonconvex Oracles

- Computer Science, ArXiv
- 2017

This result allows for unbounded noise and generalizes the results of Applegate and Kannan and of Zhang, Liang, and Charikar, who proved similar results in the bounded-noise case, as well as that of Belloni et al., who assume that the noise grows in a very specific manner.

Statistical Query Algorithms for Stochastic Convex Optimization

- Computer Science, ArXiv
- 2015

It is shown that well-known and popular methods, including first-order iterative methods and polynomial-time methods, can be implemented using only statistical queries, and nearly matching upper and lower bounds on the estimation (sample) complexity are derived, including for linear optimization in the most general setting.

Algorithms for Stochastic Convex Optimization

- Computer Science
- 2015

It is shown that well-known and popular methods, including first-order iterative methods and polynomial-time methods, can be implemented using only statistical queries, and nearly matching upper and lower bounds on the estimation (sample) complexity are derived, including for linear optimization in the most general setting.

Statistical Query Algorithms for Mean Vector Estimation and Stochastic Convex Optimization

- Computer Science, SODA
- 2017

This work studies the complexity of stochastic convex optimization given only statistical query access to the objective function, and derives nearly matching upper and lower bounds on the estimation (sample) complexity, including for linear optimization in the most general setting.

Generalization of ERM in Stochastic Convex Optimization: The Dimension Strikes Back

- Computer Science, Mathematics, NIPS
- 2016

The question of how many samples are necessary for ERM to succeed, and the closely related question of uniform convergence of $F_S$ to $F$ over $\cal K$, are considered; the lower bound applies even if the functions in the support of $D$ are smooth and efficiently computable, and even if an $\ell_1$ regularization term is added.

Submodular Function Minimization with Noisy Evaluation Oracle

- Computer Science, Mathematics, NeurIPS
- 2019

An algorithm with an additive error bound is provided, along with a worst-case analysis including a lower bound of $\Omega(n/\sqrt{T})$; together these imply that the algorithm achieves an optimal error bound up to a constant.

Algorithms and matching lower bounds for approximately-convex optimization

- Computer Science, Mathematics, NIPS
- 2016

This paper significantly improves the known lower bound on $\Delta$ as a function of $\epsilon$, gives an algorithm matching this lower bound for a natural class of convex bodies, and proves an information-theoretic lower bound showing that any algorithm that outputs such an $x$ must use a super-polynomial number of evaluations of $f$.

Maximization of Approximately Submodular Functions

- Mathematics, NIPS
- 2016

The query complexity of maximizing $F$, a function that is approximately submodular, under a cardinality constraint is characterized as a function of the error level $\epsilon > 0$, and both lower and upper bounds are provided.

## References

Showing 1–10 of 31 references

Simulated Annealing for Convex Optimization

- Computer Science, Mathematics, Math. Oper. Res.
- 2006

One of the advantages of simulated annealing, in addition to avoiding poor local minima, is that on these problems it converges faster to the minima it finds; it is concluded that, under certain general conditions, the Boltzmann-Gibbs distributions are optimal for these convex problems.

Sampling and integration of near log-concave functions

- Computer Science, STOC '91
- 1991

This work provides the first polynomial-time algorithm to generate samples from a given log-concave distribution, proving a general isoperimetric inequality for convex sets and using it together with recent developments in the theory of rapidly mixing Markov chains.

On Zeroth-Order Stochastic Convex Optimization via Random Walks

- Computer Science, Mathematics, ArXiv
- 2014

The randomized approach circumvents the problem of gradient estimation, and appears to be less sensitive to noisy function evaluations than noiseless zeroth-order methods.

A simple randomised algorithm for convex optimisation

- Mathematics, Computer Science, Math. Program.
- 2014

Under smoothness conditions on the function and the feasible set, the algorithm computes a near-optimal point in a number of operations which is bounded by a polynomial function of all relevant input parameters and the reciprocal of the desired precision, with high probability.

The geometry of logconcave functions and sampling algorithms

- Computer Science, Mathematics
- 2007

These results are applied to analyze two efficient algorithms for sampling from a logconcave distribution in n dimensions, with no assumptions on the local smoothness of the density function.

On the Complexity of Bandit and Derivative-Free Stochastic Convex Optimization

- Computer Science, Mathematics, COLT
- 2013

The attainable error/regret in the bandit and derivative-free settings, as a function of the dimension d and the available number of queries T is investigated, and a precise characterization of the attainable performance for strongly-convex and smooth functions is provided.

On the Computational Complexity of MCMC-based Estimators in Large Samples

- Mathematics
- 2007

In this paper we examine the implications of the statistical large sample theory for the computational complexity of Bayesian and quasi-Bayesian estimation carried out using Metropolis random walks.…

Random Walks in a Convex Body and an Improved Volume Algorithm

- Mathematics, Random Struct. Algorithms
- 1993

A randomized algorithm using $O(n^7 \log^2 n)$ separation calls to approximate the volume of a convex body with a fixed relative error is given, and the analysis of the mixing rate is extended from finite to arbitrary Markov chains.

The local stability of convexity, affinity and of the Jensen equation

- Mathematics
- 1999

Let $C_D, A_D, J_D$ denote the smallest constants involved in the stability of convexity, affinity, and the Jensen equation for functions defined on a convex subset $D$ of $\mathbb{R}^n$.…

Solving convex programs by random walks

- Mathematics, STOC '02
- 2002

A simple new algorithm for convex optimization based on sampling by a random walk is presented; it also solves a natural generalization of the problem.