• Corpus ID: 15625

Escaping the Local Minima via Simulated Annealing: Optimization of Approximately Convex Functions

@inproceedings{Belloni2015EscapingTL,
  title={Escaping the Local Minima via Simulated Annealing: Optimization of Approximately Convex Functions},
  author={Alexandre Belloni and Tengyuan Liang and Hariharan Narayanan and Alexander Rakhlin},
  booktitle={COLT},
  year={2015}
}
We consider the problem of optimizing an approximately convex function over a bounded convex set in $\mathbb{R}^n$ using only function evaluations. The problem is reduced to sampling from an \emph{approximately} log-concave distribution using the Hit-and-Run method, which is shown to have the same $\mathcal{O}^*$ complexity as sampling from log-concave distributions. In addition to extending the analysis for log-concave distributions to approximately log-concave distributions, the implementation of…
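To make the reduction concrete, here is a minimal, purely illustrative sketch (not the paper's implementation): the feasible set is taken to be the Euclidean unit ball, the objective f is available only through evaluations, and a Hit-and-Run step samples along a random chord from the approximately log-concave density $\exp(-f(x)/T)$ while the temperature $T$ is annealed. The function names, the discretized chord sampler, and the cooling schedule below are our own simplifications, not the paper's algorithm.

```python
import numpy as np

def hit_and_run_step(x, log_density, radius=1.0, rng=None, n_grid=200):
    """One Hit-and-Run step inside the Euclidean ball of the given radius:
    draw a uniformly random direction, intersect the line through x with the
    ball to get a chord, and sample the next point on the chord with
    probability proportional to exp(log_density)."""
    rng = np.random.default_rng() if rng is None else rng
    d = rng.standard_normal(x.shape)
    d /= np.linalg.norm(d)

    # Chord endpoints: roots of ||x + t*d||^2 = radius^2 in t (d is a unit vector).
    b = x @ d
    c = x @ x - radius ** 2
    disc = np.sqrt(max(b ** 2 - c, 0.0))
    t_lo, t_hi = -b - disc, -b + disc

    # Crude 1-D sampler: discretize the chord and sample from the
    # (approximately log-concave) conditional density along it.
    ts = np.linspace(t_lo, t_hi, n_grid)
    logs = np.array([log_density(x + t * d) for t in ts])
    weights = np.exp(logs - logs.max())
    t = rng.choice(ts, p=weights / weights.sum())
    return x + t * d

def anneal_minimize(f, dim, n_stages=20, steps_per_stage=50, t_init=1.0, cooling=0.7, seed=0):
    """Simulated-annealing-style minimization of a (possibly only approximately
    convex) f over the unit ball: at each stage, run Hit-and-Run on the density
    proportional to exp(-f(x)/T), then lower the temperature T."""
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    best_x, best_f = x.copy(), f(x)
    temp = t_init
    for _ in range(n_stages):
        log_density = lambda y, T=temp: -f(y) / T
        for _ in range(steps_per_stage):
            x = hit_and_run_step(x, log_density, rng=rng)
            fx = f(x)
            if fx < best_f:
                best_x, best_f = x.copy(), fx
        temp *= cooling  # geometric cooling schedule (an arbitrary illustrative choice)
    return best_x, best_f

if __name__ == "__main__":
    # A convex quadratic plus a small oscillating perturbation: approximately convex.
    f = lambda x: float(np.sum((x - 0.3) ** 2) + 0.01 * np.sin(25 * x[0]))
    x_hat, f_hat = anneal_minimize(f, dim=5)
    print(x_hat, f_hat)
```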
Convex Optimization with Unbounded Nonconvex Oracles using Simulated Annealing
TLDR
This paper studies the more general case when the noise has magnitude $\alpha F(x) + \beta$ for some $\alpha, \beta > 0$ and presents a polynomial-time algorithm that finds an approximate minimizer of $F$ for this noise model.
Convex Optimization with Nonconvex Oracles
TLDR
This result allows for unbounded noise and generalizes the results of Applegate and Kannan, and of Zhang, Liang, and Charikar, who proved similar results for the bounded-noise case, as well as that of Belloni et al., who assume that the noise grows in a very specific manner.
Statistical Query Algorithms for Stochastic Convex Optimization
TLDR
It is shown that well-known and popular methods, including first-order iterative methods and polynomial-time methods, can be implemented using only statistical queries, and nearly matching upper and lower bounds on the estimation (sample) complexity are derived, including for linear optimization in the most general setting.
Algorithms for Stochastic Convex Optimization
TLDR
It is shown that well-known and popular methods, including first-order iterative methods and polynomial-time methods, can be implemented using only statistical queries, and nearly matching upper and lower bounds on the estimation (sample) complexity are derived, including for linear optimization in the most general setting.
Statistical Query Algorithms for Mean Vector Estimation and Stochastic Convex Optimization
TLDR
This work studies the complexity of stochastic convex optimization given only statistical query access to the objective function, and derives nearly matching upper and lower bounds on the estimation (sample) complexity, including for linear optimization in the most general setting.
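The three statistical-query entries above all concern the same idea: first-order methods can be run when every interaction with the data is a statistical query, i.e. an estimate of an expectation up to a tolerance $\tau$, rather than access to individual samples. A rough illustrative sketch follows; the function names, the simulated oracle, and the least-squares example are ours, not from those papers.

```python
import numpy as np

def sq_oracle(phi, data, tau, rng):
    """Statistical-query oracle: returns E_z[phi(z)] over the data set,
    up to an (adversarial) tolerance tau, simulated here by bounded noise."""
    return float(np.mean([phi(z) for z in data])) + rng.uniform(-tau, tau)

def sq_gradient_descent(data, dim, steps=200, lr=0.1, tau=1e-3, seed=0):
    """Least-squares fit by gradient descent in which every gradient
    coordinate is obtained through a statistical query instead of from
    individual samples."""
    rng = np.random.default_rng(seed)
    w = np.zeros(dim)
    for _ in range(steps):
        # Each coordinate of E[(w.x - y) * x_j] is one statistical query.
        grad = np.array([
            sq_oracle(lambda z, j=j: (w @ z[0] - z[1]) * z[0][j], data, tau, rng)
            for j in range(dim)
        ])
        w -= lr * grad
    return w

# Example usage: noisy linear data; recover the weight vector via SQ queries only.
rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0, 0.5])
data = [(x, float(x @ w_true + 0.01 * rng.standard_normal()))
        for x in rng.standard_normal((100, 3))]
print(sq_gradient_descent(data, dim=3))
```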
Generalization of ERM in Stochastic Convex Optimization: The Dimension Strikes Back
TLDR
The question of how many samples are necessary for ERM to succeed, and the closely related question of uniform convergence of $F_S$ to $F$ over $\mathcal{K}$, are considered; the lower bound applies even if the functions in the support of $D$ are smooth and efficiently computable, and even if an $\ell_1$ regularization term is added.
Submodular Function Minimization with Noisy Evaluation Oracle
  • Shinji Ito
  • Computer Science, Mathematics
    NeurIPS
  • 2019
TLDR
An algorithm with an additive error bound is provided, together with a worst-case analysis including a lower bound of $\Omega(n/\sqrt{T})$; together these imply that the algorithm achieves an optimal error bound up to a constant factor.
Algorithms and matching lower bounds for approximately-convex optimization
TLDR
This paper significantly improves the known lower bound on $\Delta$ as a function of $\epsilon$, gives an algorithm matching this lower bound for a natural class of convex bodies, and proves an information-theoretic lower bound showing that any algorithm that outputs such an $x$ must use a super-polynomial number of evaluations of $f$.
Maximization of Approximately Submodular Functions
TLDR
The query complexity of maximizing a function $F$ that is approximately submodular, under a cardinality constraint, is characterized as a function of the error level $\varepsilon > 0$, and both lower and upper bounds are provided.
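For context, the classic algorithm in this setting is the greedy heuristic run against a value oracle; when the oracle is only approximately submodular, the familiar $(1 - 1/e)$ guarantee degrades with the error level. Below is a minimal sketch with a hypothetical noisy coverage oracle; the paper's actual algorithms and bounds are not reproduced here.

```python
import numpy as np

def greedy_max(oracle, ground_set, k):
    """Greedy maximization of a set function under the cardinality
    constraint |S| <= k, using only value-oracle queries."""
    selected = set()
    for _ in range(k):
        base = oracle(selected)
        # Choose the element with the largest (possibly noisy) marginal gain.
        gains = {e: oracle(selected | {e}) - base
                 for e in ground_set - selected}
        best = max(gains, key=gains.get)
        if gains[best] <= 0:
            break
        selected.add(best)
    return selected

# Example: a coverage function (submodular) corrupted by small multiplicative noise.
rng = np.random.default_rng(0)
subsets = {i: set(map(int, rng.integers(0, 30, size=5))) for i in range(20)}

def noisy_coverage(S, eps=0.05):
    covered = set().union(*(subsets[i] for i in S)) if S else set()
    return len(covered) * (1 + rng.uniform(-eps, eps))

print(greedy_max(noisy_coverage, set(subsets), k=5))
```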

References

SHOWING 1-10 OF 31 REFERENCES
Simulated Annealing for Convex Optimization
TLDR
One of the advantages of simulated annealing, in addition to avoiding poor local minima, is that on these problems it converges faster to the minima it finds; it is concluded that, under certain general conditions, the Boltzmann-Gibbs distributions are optimal for these convex problems.
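For reference, the Boltzmann-Gibbs family referred to here is the temperature-indexed family of densities $\pi_T(x) \propto \exp(-f(x)/T)$ supported on the feasible set $K$; simulated annealing samples from $\pi_{T_1}, \pi_{T_2}, \dots$ with $T_1 > T_2 > \dots \to 0$, so that the mass of $\pi_T$ concentrates around the minimizers of $f$ as the temperature decreases. (The notation is ours, chosen to match the sketch after the abstract above, and is not taken from the cited paper.)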
Sampling and integration of near log-concave functions
TLDR
This work provides the first polynomial-time algorithm to generate samples from a given log-concave distribution; it proves a general isoperimetric inequality for convex sets and uses this together with recent developments in the theory of rapidly mixing Markov chains.
On Zeroth-Order Stochastic Convex Optimization via Random Walks
TLDR
The randomized approach circumvents the problem of gradient estimation, and appears to be less sensitive to noisy function evaluations than noiseless zeroth-order methods.
A simple randomised algorithm for convex optimisation
TLDR
Under smoothness conditions on the function and the feasible set, the algorithm computes a near-optimal point in a number of operations which is bounded by a polynomial function of all relevant input parameters and the reciprocal of the desired precision, with high probability.
The geometry of logconcave functions and sampling algorithms
TLDR
These results are applied to analyze two efficient algorithms for sampling from a logconcave distribution in n dimensions, with no assumptions on the local smoothness of the density function.
On the Complexity of Bandit and Derivative-Free Stochastic Convex Optimization
  • O. Shamir
  • Computer Science, Mathematics
    COLT
  • 2013
TLDR
The attainable error/regret in the bandit and derivative-free settings, as a function of the dimension d and the available number of queries T, is investigated, and a precise characterization of the attainable performance for strongly-convex and smooth functions is provided.
On the Computational Complexity of MCMC-based Estimators in Large Samples
In this paper we examine the implications of statistical large-sample theory for the computational complexity of Bayesian and quasi-Bayesian estimation carried out using Metropolis random walks.
Random Walks in a Convex Body and an Improved Volume Algorithm
TLDR
A randomized algorithm using $O(n^7 \log^2 n)$ separation calls to approximate the volume of a convex body with a fixed relative error is given, and the analysis of the mixing rate of Markov chains is extended from finite to arbitrary Markov chains.
The local stability of convexity, affinity and of the Jensen equation
Summary. Let $C_D, A_D, J_D$ denote the smallest constants involved in the stability of convexity, affinity, and the Jensen equation for functions defined on a convex subset $D$ of $\mathbb{R}^n$.
Solving convex programs by random walks
TLDR
A simple new algorithm for convex optimization based on sampling by a random walk is presented; it also solves a natural generalization of the problem.