A linear time natural evolution strategy for non-separable functions

@inproceedings{Sun2013ALT,
  title={A linear time natural evolution strategy for non-separable functions},
  author={Yi Sun and Faustino J. Gomez and Tom Schaul and Juergen Schmidhuber},
  booktitle={Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation},
  year={2013}
}
We present a novel Natural Evolution Strategy (NES) variant, the Rank-One NES (R1-NES), which uses a low-rank approximation of the search distribution covariance matrix. The algorithm allows computation of the natural gradient with cost linear in the dimensionality of the parameter space, and excels in solving high-dimensional non-separable problems. 
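The low-rank covariance model behind the linear-time claim can be illustrated with a short sketch. Assuming, as a simplification of the paper's parameterization, a search distribution N(m, σ²(I + uuᵀ)) with a single direction vector u, a sample can be drawn in O(d) without ever forming the d×d covariance matrix. Function name and shapes here are illustrative, not taken from the paper:

```python
import numpy as np

def sample_rank_one(mean, sigma, u, rng):
    """Draw one sample from N(mean, sigma^2 * (I + u u^T)) in O(d) time.

    Hypothetical helper: for z ~ N(0, I) and s ~ N(0, 1), the vector
    z + u * s has covariance I + u u^T, so the full d x d matrix is
    never constructed.
    """
    z = rng.standard_normal(mean.shape[0])  # isotropic component, O(d)
    s = rng.standard_normal()               # coefficient along direction u
    return mean + sigma * (z + u * s)
```

The same structure keeps the natural-gradient computation linear: all updates touch only the d-dimensional vectors m and u plus the scalar σ.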
Fast Moving Natural Evolution Strategy for High-Dimensional Problems
TLDR
The proposed method, CR-FM-NES, extends a recently proposed state-of-the-art NES, the Fast Moving Natural Evolution Strategy, to high-dimensional problems, building on the idea of using a restricted representation of the covariance matrix instead of a full covariance matrix.
A Simple Yet Efficient Evolution Strategy for Large-Scale Black-Box Optimization
TLDR
This paper develops an evolution strategy using a sparse plus low-rank covariance model for large-scale optimization, derives a rank-one evolution strategy using a single principal search direction, and investigates the effect of the Hessian on algorithm performance.
Comparison-based natural gradient optimization in high dimension
TLDR
A novel natural-gradient-based stochastic search algorithm, VD-CMA, for the optimization of high-dimensional numerical functions, which outperforms CMA-ES not only in internal complexity but also in the number of function calls as dimension increases.
Large-Scale Evolution Strategy Based on Search Direction Adaptation
TLDR
SDA-ES models the covariance matrix with an identity matrix plus multiple search directions, updates the search directions with a heuristic similar to principal component analysis, and generalizes the traditional 1/5th success rule to adapt the mutation strength.
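The 1/5th success rule mentioned above is the classic Rechenberg step-size heuristic; a minimal sketch, with illustrative constants rather than SDA-ES's generalized variant:

```python
def adapt_step_size(sigma, success_rate, factor=1.22, target=0.2):
    """Classic 1/5th success rule: grow the mutation strength when more than
    ~20% of offspring improve on the parent, shrink it otherwise."""
    if success_rate > target:
        return sigma * factor   # search is too cautious: take bigger steps
    if success_rate < target:
        return sigma / factor   # search is too bold: take smaller steps
    return sigma                # exactly at the target rate: leave unchanged
```

The rationale is that a success rate near 1/5 roughly balances progress per step against the probability of making progress at all.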
A More Efficient Rank-one Covariance Matrix Update for Evolution Strategies
TLDR
An algorithm from numerical mathematics for rank-one updates of Cholesky factors is adopted, resulting in a quadratic time covariance matrix update scheme with minimal memory requirements and numerical stability and runtime improvements.
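The rank-one Cholesky update referenced here is a standard routine from numerical mathematics; the following is a sketch of the classic O(n²) algorithm in plain NumPy, not the paper's exact code:

```python
import numpy as np

def chol_rank_one_update(L, v):
    """Given lower-triangular L with A = L L^T, return the Cholesky factor
    of A + v v^T in O(n^2) time via the standard Givens-style update."""
    L = L.copy()
    v = v.copy()
    n = L.shape[0]
    for k in range(n):
        r = np.hypot(L[k, k], v[k])           # new diagonal entry
        c, s = r / L[k, k], v[k] / L[k, k]    # rotation coefficients
        L[k, k] = r
        if k + 1 < n:
            # Rotate the remainder of column k and the workspace vector.
            L[k + 1:, k] = (L[k + 1:, k] + s * v[k + 1:]) / c
            v[k + 1:] = c * v[k + 1:] - s * L[k + 1:, k]
    return L
```

Because the update works directly on the factor, a full O(n³) refactorization is avoided after each rank-one change to the covariance.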
Tractable structured natural gradient descent using local parameterizations
TLDR
This work generalizes the exponential natural evolutionary strategy, recovers existing Newton-like algorithms, yields new structured second-order algorithms, and gives new algorithms to learn covariances of Gaussian and Wishart-based distributions.
Fast Covariance Matrix Adaptation for Large-Scale Black-Box Optimization
TLDR
This paper proposes a fast variant of CMA-ES (Fast CMA-ES) to handle large-scale black-box optimization problems, and illustrates that the covariance matrix of the underlying distribution can be considered an ensemble of simple models constructed from two vectors.
Challenges in High-dimensional Reinforcement Learning with Evolution Strategies
TLDR
It is shown that combining ESs that offer reduced internal algorithm cost with uncertainty-handling techniques yields promising methods for this class of problems, and principled limitations of the approach are revealed.
Limited-Memory Matrix Adaptation for Large Scale Black-box Optimization
TLDR
The Limited-Memory Matrix Adaptation Evolution Strategy (LM-MA-ES) is presented, demonstrating state-of-the-art performance on a set of established large-scale benchmarks and exploring the algorithm on the problem of generating adversarial inputs for a (non-smooth) random forest classifier, demonstrating a surprising vulnerability of the classifier.

References

Exponential natural evolution strategies
TLDR
The new algorithm, exponential NES (xNES), is significantly simpler than its predecessors and is more principled than CMA-ES, as all the update rules needed for covariance matrix adaptation are derived from a single principle.
High dimensions and heavy tails for natural evolution strategies
TLDR
This work applies SNES to problems of previously unattainable dimensionality, recovering lowest-energy structures on the Lennard-Jones atom clusters, and obtaining state-of-the-art results on neuro-evolution benchmarks.
Stochastic search using the natural gradient
TLDR
The Natural Gradient is used to update the distribution's parameters in the direction of higher expected fitness, by efficiently calculating the inverse of the exact Fisher information matrix whereas previous methods had to use approximations.
Natural Evolution Strategies
TLDR
NES is presented, a novel algorithm for performing real-valued "black box" function optimization: optimizing an unknown objective function where algorithm-selected function measurements constitute the only information accessible to the method.
Bidirectional Relation between CMA Evolution Strategies and Natural Evolution Strategies
TLDR
This paper derives the explicit form of the natural gradient of the expected fitness, transforms it into the forms corresponding to the mean vector and the covariance matrix of the mutation distribution, and shows that the natural evolution strategy can be viewed as a variant of covariance matrix adaptation evolution strategies using the Cholesky update.
A Simple Modification in CMA-ES Achieving Linear Time and Space Complexity
TLDR
This paper proposes a simple modification of the Covariance Matrix Adaptation Evolution Strategy that reduces the internal time and space complexity from quadratic to linear; the resulting algorithm, sep-CMA-ES, samples each coordinate independently.
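The linear-time structure that sep-CMA-ES exploits, a diagonal covariance, can be sketched as follows; names and shapes are illustrative, not the paper's code:

```python
import numpy as np

def sample_separable(mean, sigma, diag_var, rng):
    """O(d) sampling from N(mean, sigma^2 * diag(diag_var)): with a diagonal
    covariance model, every coordinate is drawn independently, so both the
    sampling and the adaptation state are linear in the dimension d."""
    return mean + sigma * np.sqrt(diag_var) * rng.standard_normal(mean.shape[0])
```

The trade-off, which motivates low-rank models like R1-NES, is that a diagonal model cannot represent correlations between coordinates and so struggles on non-separable problems.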
Natural Gradient Works Efficiently in Learning (S. Amari, Neural Computation, 1998)
TLDR
The dynamical behavior of natural gradient online learning is analyzed and is proved to be Fisher efficient, implying that it has asymptotically the same performance as the optimal batch estimation of parameters.
Completely Derandomized Self-Adaptation in Evolution Strategies
TLDR
This paper puts forward two useful methods for self-adaptation of the mutation distribution, derandomization and cumulation, and reveals local and global search properties of the evolution strategy with and without covariance matrix adaptation.
Three dimensional evolutionary aerodynamic design optimization with CMA-ES
TLDR
Evolutionary optimization methods are applied to a demanding, industrially relevant engineering domain, the three-dimensional optimization of gas turbine stator blades, successfully optimizing the aerodynamic design.
A Method for Handling Uncertainty in Evolutionary Optimization With an Application to Feedback Control of Combustion
TLDR
It is demonstrated that online optimization of the controllers with the proposed methodology enhances their performance in an automated fashion, even under highly unsteady operating conditions, and also compensates for uncertainties in the model-building and design process.