Derivative-Free Optimization of High-Dimensional Non-Convex Functions by Sequential Random Embeddings

Hong Qian, Yi-Qi Hu, Yang Yu
Derivative-free optimization methods are suitable for sophisticated optimization problems, but are hard to scale to high dimensionality (e.g., larger than 1,000). Previously, the random embedding technique has been shown to be successful for solving high-dimensional problems with low effective dimensions. However, it is unrealistic to assume a low effective dimension in many applications. This paper turns to study high-dimensional problems with low optimal ε-effective dimensions, which allow all…
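The random embedding idea named in the abstract can be sketched as follows: instead of searching the full D-dimensional space, draw a random matrix A and optimize over a low-dimensional variable y, evaluating f(x + Ay); repeating with fresh embeddings gives the "sequential" variant. This is a minimal illustration, not the authors' algorithm — the test function, the inner random-search solver, the embedding scale, and the omission of the paper's scaling coefficient on the previous solution are all simplifying assumptions.

```python
import numpy as np

def random_search(g, dim, iters=2000, sigma=1.0, rng=None):
    """Minimal derivative-free inner solver: greedy Gaussian random search."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_y = np.zeros(dim)
    best_v = g(best_y)
    for _ in range(iters):
        cand = best_y + sigma * rng.standard_normal(dim)
        v = g(cand)
        if v < best_v:
            best_y, best_v = cand, v
    return best_y, best_v

def sequential_random_embeddings(f, D, d=10, rounds=5, rng=None):
    """Sketch of sequential random embeddings: each round draws a fresh
    random D x d matrix A and minimizes f(x_prev + A @ y) over the
    low-dimensional variable y with a derivative-free method, keeping
    the best high-dimensional point found so far."""
    if rng is None:
        rng = np.random.default_rng(1)
    x = np.zeros(D)       # current high-dimensional solution
    best = f(x)
    for _ in range(rounds):
        A = rng.standard_normal((D, d)) / np.sqrt(d)  # random embedding
        g = lambda y, A=A, x=x: f(x + A @ y)          # restricted objective
        y_best, v = random_search(g, d, rng=rng)
        if v < best:
            x, best = x + A @ y_best, v
    return x, best
```

As a usage example, minimizing a 1,000-dimensional function that (hypothetically) depends only on its first five coordinates, `f(x) = sum((x[:5] - 2)**2)`, each round only ever searches a 10-dimensional subspace, yet the sequence of embeddings steadily improves the full-dimensional solution.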


Publications referenced by this paper.
Showing 1-10 of 32 references

  • Ronan Collobert, Fabian Sinz, Jason Weston, Léon Bottou. Trading convexity for scalability. In Proceedings of the 23rd International Conference on Machine Learning, pages 201–208, Pittsburgh, Pennsylvania, 2006.

  • Hong Qian, Yang Yu. Scaling simultaneous optimistic optimization for high-dimensional non-convex functions. In Proceedings of the 30th AAAI Conference on Artificial Intelligence, Phoenix, AZ, 2016.

  • Abram L. Friesen, Pedro M. Domingos. Recursive decomposition for nonconvex optimization. In Proceedings of the 24th International Joint Conference on Artificial Intelligence, pages 253–259, 2015.

  • Kirthevasan Kandasamy, Jeff Schneider, Barnabás Póczos. High dimensional Bayesian optimisation and bandits via additive models. In Proceedings of the 32nd International Conference on Machine Learning, pages 295–304, Lille, France, 2015.
