On the Performance of Indirect Encoding Across the Continuum of Regularity

@article{Clune2011OnTP,
  title={On the Performance of Indirect Encoding Across the Continuum of Regularity},
  author={Jeff Clune and Kenneth O. Stanley and Robert T. Pennock and Charles Ofria},
  journal={IEEE Transactions on Evolutionary Computation},
  year={2011},
  volume={15},
  pages={346--367}
}
This paper investigates how an evolutionary algorithm with an indirect encoding exploits the property of phenotypic regularity, an important design principle found in natural organisms and engineered designs. We present the first comprehensive study showing that such phenotypic regularity enables an indirect encoding to outperform direct encoding controls as problem regularity increases. Such an ability to produce regular solutions that can exploit the regularity of problems is an important… 
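The contrast between direct and indirect encodings that the abstract describes can be illustrated with a minimal sketch (not from the paper; the genome structures and the cosine generator here are illustrative assumptions). A direct encoding stores one gene per phenotypic value, while an indirect encoding expands a small genome through a generative rule, so regular phenotypes fall out of the reuse of genetic information:

```python
import math

PHENOTYPE_SIZE = 8

# Direct encoding: the genome IS the phenotype, one gene per value.
direct_genome = [0.5, -0.5, 0.5, -0.5, 0.5, -0.5, 0.5, -0.5]

def direct_phenotype(genome):
    return list(genome)

# Indirect encoding (illustrative): a compact genome of two parameters
# (amplitude, frequency) is expanded by a periodic generative function.
indirect_genome = (0.5, 1.0)

def indirect_phenotype(genome, size=PHENOTYPE_SIZE):
    amplitude, frequency = genome
    # The generator reuses the same genetic information at every position,
    # which is how an indirect encoding exploits problem regularity.
    return [amplitude * math.cos(math.pi * frequency * i) for i in range(size)]

print(direct_phenotype(direct_genome))
print(indirect_phenotype(indirect_genome))
```

Here two genes reproduce a regular eight-value phenotype that the direct encoding needs eight genes to represent; an irregular target, by contrast, would force the indirect genome to grow or would leave residual error, which is the trade-off across the regularity continuum the paper studies.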

Evolving artificial neural networks with generative encodings inspired by developmental biology

The general conclusion that can be drawn from this work is that generative encodings can produce some of the properties seen in complex, natural organisms, and will likely be an important part of the long-term goal of synthetically evolving phenotypes that approach the capability, intelligence, and complexity of their natural rivals.

Improving HybrID: How to best combine indirect and direct encoding in evolutionary algorithms

Two new methods to improve HybrID are tested; they eliminate the need to manually specify when to switch from indirect to direct encoding, and suggest a path forward for automatically and simultaneously combining the best traits of indirect and direct encodings.

Evolving neural networks that are both modular and regular: HyperNEAT plus the connection cost technique

It is shown that adding the connection cost technique to HyperNEAT produces neural networks that are significantly more modular, regular, and higher performing than HyperNEAT without a connection cost, even when compared to a variant of HyperNEAT that was specifically designed to encourage modularity.

A novel generative encoding for evolving modular, regular and scalable networks

DSE significantly outperforms HyperNEAT on a pattern recognition problem, suggesting that its potential lies not just in the properties of the networks it produces, but also in its ability to compete with leading encodings at solving challenging problems.

Does Aligning Phenotypic and Genotypic Modularity Improve the Evolution of Neural Networks?

Results suggest that encouraging modularity in both the genotype and the phenotype is an important step towards solving large-scale multi-modal problems, but also indicate that more research is required before structurally organized networks can be evolved to solve tasks that require multiple, different neural modules.

On the Relationships between Generative Encodings, Regularity, and Learning Abilities when Evolving Plastic Artificial Neural Networks

The results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks, and reveal the consequence of the bias of developmental encodings towards regular structures.

Extrapolation of regularity using indirect encodings

  • B. E. Eskridge
  • 2011 IEEE Congress of Evolutionary Computation (CEC)
  • 2011
Results show that an indirect encoding is able to extrapolate performance from one area of a problem's state space to a new area in which it has no experience, with little to no loss of performance, depending on the regularities of the problem's state space.

Critical factors in the performance of hyperNEAT

The hypothesis that fracture in the problem space, known to be challenging for regular NEAT, is even more so for HyperNEAT is examined, suggesting that quite complex networks are needed to cope with fracture and that HyperNEAT can have difficulty discovering them.

Comparing the Evolvability of Generative Encoding Schemes

A novel approach to measuring the evolvability provided by an encoding is reported, which characterizes both the quality of the mutations and the quantity of phenotypic variation, and indicates the number of generations required by amputated individuals to recover an effective gait.
...

References

SHOWING 1-10 OF 59 REFERENCES

HybrID: A Hybridization of Indirect and Direct Encodings for Evolutionary Computation

A new algorithm is proposed, a Hybridized Indirect and Direct encoding (HybrID), which discovers the regularity of a problem with an indirect encoding and accounts for irregularities via a direct encoding, suggesting that hybridizing indirect and direct encodings can be an effective way to improve the performance of evolutionary algorithms.

How a Generative Encoding Fares as Problem-Regularity Decreases

As the regularity of the problem decreases, the performance of the generative representation degrades to, and then underperforms, the direct encoding, yet tends to be consistent for different types of problem regularity.

Investigating whether hyperNEAT produces modular neural networks

The first documented case of HyperNEAT producing a modular phenotype is presented, but the inability to encourage modularity on harder problems where modularity would have been beneficial suggests that more work is needed to increase the likelihood that HyperNEAT and similar algorithms produce modular ANNs in response to challenging, decomposable problems.

Acquiring evolvability through adaptive representations

Three neural network representations, a direct encoding, a complexifying encoding, and an implicit encoding capable of adapting the genotype-phenotype mapping, are compared on Nothello, a complex game-playing domain from the AAAI General Game Playing Competition.

A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks

The main conclusion is that the ability to explore the space of regular connectivity patterns opens up a new class of complex high-dimensional tasks to neuroevolution.

Compositional pattern producing networks: A novel abstraction of development

Results produced with CPPNs through interactive evolution of two-dimensional images show that such an encoding can nevertheless produce structural motifs often attributed to more conventional developmental abstractions, suggesting that local interaction may not be essential to the desirable properties of natural encoding in the way that is usually assumed.

Generating large-scale neural networks through discovering geometric regularities

A method, called Hypercube-based Neuroevolution of Augmenting Topologies (HyperNEAT), which evolves a novel generative encoding called connective Compositional Pattern Producing Networks (connective CPPNs) to discover geometric regularities in the task domain, allowing the solution to both generalize and scale without loss of function to an ANN of over eight million connections.

Evolving coordinated quadruped gaits with the HyperNEAT generative encoding

It is demonstrated that HyperNEAT, a new and promising generative encoding for evolving neural networks, can evolve quadruped gaits without an engineer manually decomposing the problem.

Autonomous Evolution of Topographic Regularities in Artificial Neural Networks

This letter shows that when geometry is introduced to evolved ANNs through the hypercube-based neuroevolution of augmenting topologies algorithm, they begin to acquire characteristics that indeed are reminiscent of biological brains.

Indirectly Encoding Neural Plasticity as a Pattern of Local Rules

This paper aims to show that learning rules can be effectively indirectly encoded by extending the Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) method to evolve large-scale adaptive ANNs, which is a major goal for neuroevolution.
...