On the Performance of Indirect Encoding Across the Continuum of Regularity
@article{Clune2011OnTP,
  title   = {On the Performance of Indirect Encoding Across the Continuum of Regularity},
  author  = {Jeff Clune and Kenneth O. Stanley and Robert T. Pennock and Charles Ofria},
  journal = {IEEE Transactions on Evolutionary Computation},
  year    = {2011},
  volume  = {15},
  pages   = {346-367}
}
This paper investigates how an evolutionary algorithm with an indirect encoding exploits the property of phenotypic regularity, an important design principle found in natural organisms and engineered designs. We present the first comprehensive study showing that such phenotypic regularity enables an indirect encoding to outperform direct encoding controls as problem regularity increases. Such an ability to produce regular solutions that can exploit the regularity of problems is an important…
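To make the contrast concrete, the following is a minimal, hypothetical Python sketch (not taken from the paper, and not its experimental setup) of why an indirect, generative encoding can exploit a regular problem better than a direct encoding: the direct genome needs one gene per phenotype value, while the indirect genome reuses a short motif, so a single mutation improves many phenotype values at once. The toy target, function names, and hill climber are illustrative assumptions.

import random

TARGET = [0.5, -0.5, 0.5, -0.5] * 8   # a highly regular 32-value phenotype

def fitness(phenotype):
    # Higher is better: negative squared error against the regular target.
    return -sum((p - t) ** 2 for p, t in zip(phenotype, TARGET))

def decode_direct(genome):
    # Direct encoding: the genome is the phenotype, one gene per value.
    return genome

def decode_indirect(genome):
    # Indirect encoding: tile a 4-gene motif across the whole phenotype,
    # so one mutation changes many phenotype values at once.
    return [genome[i % len(genome)] for i in range(len(TARGET))]

def evolve(decode, genome_len, generations=300):
    # A (1+1) hill climber, standing in for the paper's evolutionary algorithms.
    genome = [random.uniform(-1, 1) for _ in range(genome_len)]
    best = fitness(decode(genome))
    for _ in range(generations):
        child = [g + random.gauss(0, 0.1) for g in genome]
        score = fitness(decode(child))
        if score > best:
            genome, best = child, score
    return best

random.seed(0)
print("direct encoding  :", evolve(decode_direct, len(TARGET)))
print("indirect encoding:", evolve(decode_indirect, 4))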
155 Citations
Evolving artificial neural networks with generative encodings inspired by developmental biology
- Computer Science
- 2010
The general conclusion that can be drawn from this work is that generative encodings can produce some of the properties seen in complex, natural organisms, and will likely be an important part of the long-term goal of synthetically evolving phenotypes that approach the capability, intelligence, and complexity of their natural rivals.
Improving HybrID: How to best combine indirect and direct encoding in evolutionary algorithms
- Computer Science, PLoS ONE
- 2017
Two new methods to improve HybrID are tested, eliminating the need to manually specify when to switch from indirect to direct encoding and suggesting a path forward for automatically and simultaneously combining the best traits of indirect and direct encodings.
Evolving neural networks that are both modular and regular: HyperNEAT plus the connection cost technique
- Computer Science, GECCO
- 2014
It is shown that adding the connection cost technique to HyperNEAT produces neural networks that are significantly more modular, regular, and higher performing than those from HyperNEAT without a connection cost, even when compared to a variant of HyperNEAT that was specifically designed to encourage modularity.
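As a rough illustration of the connection cost idea mentioned above, the sketch below (hypothetical Python, not the paper's implementation) treats total connection cost as a second objective alongside task performance and keeps only Pareto-nondominated networks; the function names and toy data are assumptions for illustration only.

def dominates(a, b):
    # a and b are (performance, -connection_cost) pairs; higher is better in both.
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(networks, performance, connection_cost):
    # Keep every network that no other network beats in both objectives.
    scored = [(net, (performance(net), -connection_cost(net))) for net in networks]
    return [net for net, s in scored if not any(dominates(t, s) for _, t in scored)]

# Toy networks described only by a name, a task score, and a connection count.
nets = [("dense", 0.90, 120), ("sparse", 0.88, 30),
        ("tiny", 0.60, 5), ("bloated", 0.85, 150)]
front = pareto_front(nets, performance=lambda n: n[1], connection_cost=lambda n: n[2])
print([name for name, _, _ in front])   # "bloated" is dominated by "dense" and dropped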
A novel generative encoding for evolving modular, regular and scalable networks
- Computer Science, GECCO '11
- 2011
DSE significantly outperforms HyperNEAT on a pattern recognition problem, suggesting that its potential lies not just in the properties of the networks it produces, but also in its ability to compete with leading encodings at solving challenging problems.
Does Aligning Phenotypic and Genotypic Modularity Improve the Evolution of Neural Networks?
- Biology, GECCO
- 2016
Results suggest that encouraging modularity in both the genotype and the phenotype is an important step towards solving large-scale multi-modal problems, but also indicate that more research is required before structurally organized networks can be evolved to solve tasks that require multiple, different neural modules.
On the Relationships between Generative Encodings, Regularity, and Learning Abilities when Evolving Plastic Artificial Neural Networks
- Computer Science, PLoS ONE
- 2013
The results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks, and reveal the consequence of the bias of developmental encodings towards regular structures.
Extrapolation of regularity using indirect encodings
- Psychology, 2011 IEEE Congress on Evolutionary Computation (CEC)
- 2011
Results show that an indirect encoding is able to extrapolate from one area of a problem's state space to a new area in which it has no experience, with little to no loss of performance, depending on the regularities of the problem's state space.
Critical factors in the performance of hyperNEAT
- Computer Science, GECCO '13
- 2013
The hypothesis that fracture in the problem space, known to be challenging for regular NEAT, is even more so for HyperNEAT is examined, suggesting that quite complex networks are needed to cope with fracture and that HyperNEAT can have difficulty discovering them.
Comparing the Evolvability of Generative Encoding Schemes
- Biology, ALIFE
- 2014
A novel approach to measuring the evolvability provided by an encoding is reported, characterizing both the quality of mutations and the quantity of phenotypic variation, and indicating the number of generations amputated individuals require to recover an effective gait.
An Enhanced Hypercube-Based Encoding for Evolving the Placement, Density, and Connectivity of Neurons
- Biology, Artificial Life
- 2012
ES-HyperNEAT significantly expands the scope of neural structures that evolution can discover by automatically deducing the node geometry from implicit information in the pattern of weights encoded by HyperNEAT, thereby avoiding the need to evolve explicit placement.
References
Showing 1-10 of 59 references
HybrID: A Hybridization of Indirect and Direct Encodings for Evolutionary Computation
- Computer Science, ECAL
- 2009
A new algorithm is proposed, a Hybridized Indirect and Direct encoding (HybrID), which discovers the regularity of a problem with an indirect encoding and accounts for irregularities via a direct encoding, suggesting that hybridizing indirect and direct encodings can be an effective way to improve the performance of evolutionary algorithms.
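A minimal sketch of the two-phase idea described above, assuming a simple (1+1) hill climber in Python rather than the actual HyperNEAT-based HybrID implementation: the run first mutates a compact indirect genome and decodes it into the phenotype, then switches at a fixed generation to mutating the phenotype's values directly. decode_indirect, fitness, and the mutation rates are illustrative assumptions.

import random

def evolve_hybrid(decode_indirect, fitness, genome_len,
                  total_gens=500, switch_gen=250):
    # Phase 1: mutate the compact indirect genome and decode it (captures regularity).
    # Phase 2: after switch_gen, mutate the decoded phenotype's values directly
    #          (accounts for the problem's irregularities).
    genome = [random.uniform(-1, 1) for _ in range(genome_len)]
    phenotype = decode_indirect(genome)
    best = fitness(phenotype)
    for gen in range(total_gens):
        if gen < switch_gen:
            child_genome = [g + random.gauss(0, 0.1) for g in genome]
            child = decode_indirect(child_genome)
            if fitness(child) > best:
                genome, phenotype, best = child_genome, child, fitness(child)
        else:
            child = [w + random.gauss(0, 0.02) for w in phenotype]
            if fitness(child) > best:
                phenotype, best = child, fitness(child)
    return phenotype

With the toy decode_indirect and fitness functions from the sketch after the abstract, evolve_hybrid(decode_indirect, fitness, 4) runs both phases against the same regular target.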
How a Generative Encoding Fares as Problem-Regularity Decreases
- Computer Science, PPSN
- 2008
As the regularity of the problem decreases, the performance of the generative representation degrades to the level of the direct encoding and then falls below it, yet tends to be consistent across different types of problem regularity.
Investigating whether hyperNEAT produces modular neural networks
- Computer Science, GECCO '10
- 2010
The first documented case of HyperNEAT producing a modular phenotype is presented, but the inability to encourage modularity on harder problems where modularity would have been beneficial suggests that more work is needed to increase the likelihood that HyperNEAT and similar algorithms produce modular ANNs in response to challenging, decomposable problems.
Acquiring evolvability through adaptive representations
- Biology, Computer Science, GECCO '07
- 2007
Three neural network representations (a direct encoding, a complexifying encoding, and an implicit encoding capable of adapting the genotype-phenotype mapping) are compared on Nothello, a complex game-playing domain from the AAAI General Game Playing Competition.
A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks
- Biology, Computer Science, Artificial Life
- 2009
The main conclusion is that the ability to explore the space of regular connectivity patterns opens up a new class of complex high-dimensional tasks to neuroevolution.
Generating large-scale neural networks through discovering geometric regularities
- Computer Science, Biology, GECCO '07
- 2007
A method called Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) evolves a novel generative encoding, connective Compositional Pattern Producing Networks (connective CPPNs), to discover geometric regularities in the task domain, allowing solutions to both generalize and scale without loss of function to an ANN of over eight million connections.
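The weight-query step that HyperNEAT's connective CPPN performs can be sketched as follows (hypothetical Python; the real CPPN is an evolved network of composed functions, not the fixed formula used here): each potential connection between two substrate nodes is assigned the CPPN's output for the pair of node coordinates, so regularities in the function become regularities in the connectivity pattern.

import math

def cppn(x1, y1, x2, y2):
    # Stand-in for an evolved connective CPPN: a fixed function of the two node
    # positions. Because it depends only on geometry (distance and horizontal
    # position), the connectivity pattern it produces is regular across the substrate.
    d = math.hypot(x2 - x1, y2 - y1)
    return math.sin(3.0 * d) * math.cos(2.0 * (x1 + x2))

def query_substrate(nodes, threshold=0.2):
    # Ask the CPPN for a weight for every ordered pair of substrate nodes and
    # keep only connections whose magnitude clears the expression threshold.
    weights = {}
    for (x1, y1) in nodes:
        for (x2, y2) in nodes:
            w = cppn(x1, y1, x2, y2)
            if abs(w) > threshold:
                weights[((x1, y1), (x2, y2))] = w
    return weights

# A 3x3 grid of substrate nodes placed in [-1, 1] x [-1, 1].
grid = [(x - 1.0, y - 1.0) for x in range(3) for y in range(3)]
connections = query_substrate(grid)
print(len(connections), "connections expressed")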
Evolving coordinated quadruped gaits with the HyperNEAT generative encoding
- Computer Science, 2009 IEEE Congress on Evolutionary Computation
- 2009
It is demonstrated that HyperNEAT, a new and promising generative encoding for evolving neural networks, can evolve quadruped gaits without an engineer manually decomposing the problem.
Autonomous Evolution of Topographic Regularities in Artificial Neural Networks
- Biology, Neural Computation
- 2010
This letter shows that when geometry is introduced to evolved ANNs through the hypercube-based neuroevolution of augmenting topologies algorithm, they begin to acquire characteristics that indeed are reminiscent of biological brains.
Indirectly Encoding Neural Plasticity as a Pattern of Local Rules
- Computer Science, Biology, SAB
- 2010
This paper aims to show that learning rules can be effectively indirectly encoded by extending the Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) method to evolve large-scale adaptive ANNs, which is a major goal for neuroevolution.
The sensitivity of HyperNEAT to different geometric representations of a problem
- Mathematics, GECCO
- 2009
The results suggest that HyperNEAT practitioners can obtain good results even if they do not know how to geometrically represent a problem, and that further improvements are possible with a well-chosen geometric representation.