Corpus ID: 15101846

A Frequency-Domain Encoding for Neuroevolution

@article{Koutnk2012AFE,
  title={A Frequency-Domain Encoding for Neuroevolution},
  author={Jan Koutn{\'i}k and J{\"u}rgen Schmidhuber and Faustino J. Gomez},
  journal={ArXiv},
  year={2012},
  volume={abs/1212.6521}
}
Neuroevolution has yet to scale up to complex reinforcement learning tasks that require large networks. Networks with many inputs (e.g. raw video) imply a very high dimensional search space if encoded directly. Indirect methods use a more compact genotype representation that is transformed into networks of potentially arbitrary size. In this paper, we present an indirect method where networks are encoded by a set of Fourier coefficients which are transformed into network weight matrices via an… 
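
The decoding step can be pictured with a short numpy sketch (not the authors' exact scheme): a handful of frequency-domain coefficients is placed in the low-frequency corner of a coefficient matrix and expanded into a full weight matrix by an inverse 2-D DCT. The coefficient ordering, scaling, and matrix shape below are illustrative assumptions.

```python
# Minimal sketch of the core idea: a small genome of frequency-domain
# coefficients is mapped to a full weight matrix via an inverse 2-D DCT.
# The anti-diagonal ordering used here is an assumption for illustration.
import numpy as np
from scipy.fft import idctn

def decode_weights(genome, shape):
    """Map a flat vector of DCT coefficients to a weight matrix of `shape`."""
    coeffs = np.zeros(shape)
    # Fill coefficients along anti-diagonals starting at the low-frequency
    # corner, so a short genome controls the smooth, large-scale structure.
    idx = [(i, j) for s in range(sum(shape))
           for i in range(shape[0]) for j in range(shape[1]) if i + j == s]
    for g, (i, j) in zip(genome, idx):
        coeffs[i, j] = g
    return idctn(coeffs, norm="ortho")

# A 6-coefficient genome expands into a 16x8 weight matrix.
rng = np.random.default_rng(0)
W = decode_weights(rng.standard_normal(6), (16, 8))
print(W.shape)  # (16, 8)
```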

Evolving large-scale neural networks for vision-based reinforcement learning

This paper scales up the authors' compressed network encoding, in which network weight matrices are represented indirectly as a set of Fourier-type coefficients, to tasks that require very large networks due to the high dimensionality of their input space.

Evolving large-scale neural networks for vision-based TORCS

To the authors' knowledge this is the first attempt to tackle TORCS using vision, and to successfully evolve neural network controllers of this size.

Training and Generating Neural Networks in Compressed Weight Space

The goal is to open a discussion on this topic, starting with recurrent neural networks for character-level language modelling whose weight matrices are encoded by the discrete cosine transform, and using a recurrent neural network to parameterise the compressed weights.

References

SHOWING 1-10 OF 19 REFERENCES

Evolving neural networks in compressed weight space

A new indirect encoding scheme for neural networks in which the weight matrices are represented in the frequency domain by sets of Fourier coefficients, which can dramatically reduce the search space dimensionality such that solutions can be found in significantly fewer evaluations.

Accelerated Neural Evolution through Cooperatively Coevolved Synapses

This paper compares a neuroevolution method called Cooperative Synapse Neuroevolution (CoSyNE), that uses cooperative coevolution at the level of individual synaptic weights, to a broad range of reinforcement learning algorithms on very difficult versions of the pole balancing problem that involve large state spaces and hidden state.
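
As a rough illustration of coevolving networks at the level of individual synaptic weights, the sketch below keeps one candidate value per weight in each row, evaluates rows as complete networks, and permutes each weight's subpopulation between generations. Selection, mutation strength, and the permutation scheme are simplified assumptions, not the CoSyNE reference algorithm.

```python
# Hedged sketch of cooperative coevolution at the level of single weights,
# in the spirit of CoSyNE; details are simplified assumptions.
import numpy as np

def cosyne_step(pop, fitness_fn, rng):
    """pop has shape (m, n): m candidate values for each of n network weights."""
    m, n = pop.shape
    scores = np.array([fitness_fn(pop[i]) for i in range(m)])  # row i = one network
    order = np.argsort(-scores)                 # best networks first
    elite = pop[order[: m // 2]]
    # Refill the population by mutating elite rows.
    children = elite + 0.1 * rng.standard_normal(elite.shape)
    new_pop = np.vstack([elite, children])[:m]
    # Permute each weight's subpopulation (column) independently, so the next
    # generation recombines weights taken from different networks.
    for j in range(n):
        new_pop[:, j] = rng.permutation(new_pop[:, j])
    return new_pop

rng = np.random.default_rng(1)
pop = rng.standard_normal((20, 8))              # 20 candidates, 8 weights
target = rng.standard_normal(8)
for _ in range(50):
    pop = cosyne_step(pop, lambda w: -np.sum((w - target) ** 2), rng)
```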

Evolving Neural Networks through Augmenting Topologies

A method is presented, NeuroEvolution of Augmenting Topologies (NEAT), which outperforms the best fixed-topology method on a challenging benchmark reinforcement learning task and shows how it is possible for evolution to both optimize and complexify solutions simultaneously.

Generating large-scale neural networks through discovering geometric regularities

A method, called Hypercube-based Neuroevolution of Augmenting Topologies (HyperNEAT), which evolves a novel generative encoding called connective Compositional Pattern Producing Networks (connective CPPNs) to discover geometric regularities in the task domain, allowing the solution to both generalize and scale without loss of function to an ANN of over eight million connections.
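
The substrate-querying idea behind HyperNEAT can be sketched as follows: a pattern-producing function (here a fixed toy function standing in for an evolved CPPN) is evaluated at the coordinates of each source/target neuron pair, and its output becomes the connection weight, so geometric regularities in the neuron layout reappear as regularities in the weight matrix. The function and threshold below are illustrative assumptions.

```python
# Hedged illustration of querying a pattern network over substrate coordinates.
import numpy as np

def toy_cppn(x1, y1, x2, y2):
    # Stand-in for an evolved CPPN: symmetric in x, periodic in y (assumption).
    return np.tanh(np.cos(3 * (y2 - y1)) - abs(x2 - x1))

def build_weights(src_coords, dst_coords, cppn, threshold=0.2):
    W = np.zeros((len(dst_coords), len(src_coords)))
    for j, (x1, y1) in enumerate(src_coords):
        for i, (x2, y2) in enumerate(dst_coords):
            w = cppn(x1, y1, x2, y2)
            if abs(w) > threshold:      # prune weak connections
                W[i, j] = w
    return W

# 2-D grid of neuron positions in [-1, 1], used for both layers.
grid = [(x, y) for x in np.linspace(-1, 1, 4) for y in np.linspace(-1, 1, 4)]
W = build_weights(grid, grid, toy_cppn)
print(W.shape)  # (16, 16)
```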

Searching for Minimal Neural Networks in Fourier Space

A more natural and often more practical NEL whose instructions are frequency-domain coefficients; this weight-matrix encoding greatly accelerates the search, and some of the resulting solutions turn out to be unexpectedly simple.

A common genetic encoding for both direct and indirect encodings of networks

CGE has useful properties that make it suitable for evolving neural networks, and several important properties of the encoding are proven, such as its closure under mutation operators, its completeness in representing any phenotype network, and the existence of an algorithm that can evaluate any given phenotype without running into an infinite loop.

Kernel representations for evolving continuous functions

Kernel methods offer a number of beneficial properties for parameterizing continuous functions, such as smoothness and locality, which make them attractive as a basis for mutation operators; this work shows how evolutionary computation can profit from these properties.

High dimensions and heavy tails for natural evolution strategies

This work applies SNES to problems of previously unattainable dimensionality, recovering lowest-energy structures of Lennard-Jones atom clusters and obtaining state-of-the-art results on neuroevolution benchmarks.
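
A minimal sketch of the separable natural evolution strategy (SNES) update is given below; the learning rates and rank-based utilities follow commonly cited defaults, but the details should be read as assumptions rather than a reference implementation.

```python
# Hedged sketch of SNES: a diagonal-Gaussian search distribution updated by
# natural-gradient steps with rank-based fitness shaping.
import numpy as np

def snes(f, dim, iters=200, popsize=20, seed=0):
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim)
    eta_mu, eta_sigma = 1.0, (3 + np.log(dim)) / (5 * np.sqrt(dim))
    # Rank-based fitness shaping: fixed utilities, best sample first.
    ranks = np.arange(1, popsize + 1)
    util = np.maximum(0.0, np.log(popsize / 2 + 1) - np.log(ranks))
    util = util / util.sum() - 1.0 / popsize
    for _ in range(iters):
        s = rng.standard_normal((popsize, dim))        # standard-normal samples
        z = mu + sigma * s                             # candidate solutions
        order = np.argsort([-f(zi) for zi in z])       # maximise f
        s = s[order]
        mu += eta_mu * sigma * (util @ s)              # update the mean
        sigma *= np.exp(0.5 * eta_sigma * (util @ (s**2 - 1)))  # update per-dim scale
    return mu

best = snes(lambda x: -np.sum(x**2), dim=10)
print(np.round(best, 3))
```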

Evolving a Single Scalable Controller for an Octopus Arm with a Variable Number of Segments

This paper demonstrates how an indirectly encoded neurocontroller for a simulated octopus arm leverages regularities and domain geometry to capture underlying motion principles and sidestep the curse of dimensionality.

Designing Neural Networks Using Genetic Algorithms with Graph Generation System

A graph-grammatical encoding is proposed that encodes a graph-generation grammar in the chromosome, so that more regular connectivity patterns are generated with a shorter chromosome length.
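
The matrix-rewriting idea can be sketched as follows: the chromosome encodes rules that rewrite each symbol into a 2x2 block of symbols, and repeated expansion of a start symbol yields a binary connectivity matrix. The specific rules below are illustrative only, not taken from the paper.

```python
# Hedged sketch of a graph-generation grammar: rewrite rules expand symbols
# into 2x2 blocks until a binary connectivity matrix remains.
import numpy as np

rules = {
    "S": [["A", "B"], ["B", "A"]],
    "A": [["1", "0"], ["0", "1"]],
    "B": [["0", "0"], ["1", "1"]],
    "0": [["0", "0"], ["0", "0"]],   # terminals expand to constant blocks
    "1": [["1", "1"], ["1", "1"]],
}

def expand(symbol, depth):
    """Recursively rewrite `symbol` into a 2^depth x 2^depth matrix of 0/1."""
    if depth == 0:
        return np.array([[int(symbol)]])
    block = rules[symbol]
    rows = [np.hstack([expand(block[i][j], depth - 1) for j in range(2)])
            for i in range(2)]
    return np.vstack(rows)

# Three rewriting steps produce an 8x8 connectivity matrix from one symbol.
print(expand("S", 3))
```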