Investigating whether hyperNEAT produces modular neural networks

@inproceedings{Clune2010InvestigatingWH,
  title={Investigating whether hyperNEAT produces modular neural networks},
  author={Jeff Clune and Benjamin E. Beckmann and Philip K. McKinley and Charles Ofria},
  booktitle={GECCO '10},
  year={2010}
}
HyperNEAT represents a class of neuroevolutionary algorithms that captures some of the power of natural development with a computationally efficient high-level abstraction of development. This class of algorithms is intended to provide many of the desirable properties produced in biological phenotypes by natural developmental processes, such as regularity, modularity and hierarchy. While it has been previously shown that HyperNEAT produces regular artificial neural network (ANN) phenotypes, in… 
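HyperNEAT's indirect encoding can be sketched as follows: an evolved CPPN is queried with the geometric coordinates of each pair of substrate nodes, and a connection is expressed only when the output magnitude exceeds a threshold. This is a minimal illustrative sketch, not the authors' implementation; the `cppn` function below is a hand-written stand-in for an evolved network.

```python
import math

def cppn(x1, y1, x2, y2):
    """Hypothetical stand-in for an evolved CPPN: composes simple
    functions of the two nodes' coordinates into a weight value."""
    return math.sin(3 * (x1 - x2)) * math.exp(-(y1 - y2) ** 2)

def build_substrate(coords, threshold=0.2):
    """Query the CPPN for every ordered node pair; express a connection
    only when the output magnitude exceeds the threshold."""
    weights = {}
    for (x1, y1) in coords:
        for (x2, y2) in coords:
            w = cppn(x1, y1, x2, y2)
            if abs(w) > threshold:
                weights[((x1, y1), (x2, y2))] = w
    return weights

# A small 3x3 grid of substrate nodes in [0, 1] x [0, 1]
grid = [(x / 2, y / 2) for x in range(3) for y in range(3)]
net = build_substrate(grid)
```

Because the same CPPN is reused for every coordinate pair, geometric regularities in its output translate directly into regular connectivity patterns, which is the property the abstract refers to.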


Evolving neural networks that are both modular and regular: HyperNEAT plus the connection cost technique

It is shown that adding the connection cost technique to HyperNEAT produces neural networks that are significantly more modular, regular, and higher-performing than HyperNEAT without a connection cost, even when compared to a variant of HyperNEAT that was specifically designed to encourage modularity.
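The connection cost technique adds a second, minimized objective alongside task performance. As a rough sketch (not the authors' implementation), assuming each node has a 2-D position, the cost can be taken as the summed squared length of expressed connections, with selection based on Pareto dominance over (performance, cost):

```python
def connection_cost(connections, positions):
    """Sum of squared Euclidean lengths of expressed connections;
    one common formulation of the cost objective."""
    cost = 0.0
    for src, dst in connections:
        (x1, y1), (x2, y2) = positions[src], positions[dst]
        cost += (x1 - x2) ** 2 + (y1 - y2) ** 2
    return cost

def dominates(a, b):
    """Pareto dominance for (performance, cost) pairs: a dominates b
    if it is no worse in both objectives (performance maximized,
    cost minimized) and strictly better in at least one."""
    perf_a, cost_a = a
    perf_b, cost_b = b
    return (perf_a >= perf_b and cost_a <= cost_b
            and (perf_a > perf_b or cost_a < cost_b))
```

Penalizing long connections geometrically favors networks whose connectivity clusters into spatially local groups, which is why the pressure tends to produce modular structure.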

The modularity in freeform evolving neural networks

This paper validates whether network modularity can emerge, and whether evolutionary performance can be improved, by varying the environment or the evolution process under a more freeform artificial evolution; an artificial tracer method is employed to quantify modularity.

Critical factors in the performance of hyperNEAT

This paper examines the hypothesis that fracture in the problem space, known to be challenging for regular NEAT, is even more so for HyperNEAT, suggesting that quite complex networks are needed to cope with fracture and that HyperNEAT can have difficulty discovering them.

Modularity in NEAT Reinforcement Learning Networks

It was shown that the ideal level of network modularity in the explored parameter space is highly dependent on other network variables, dispelling theories that modularity has a straightforward relationship to network performance and demonstrating that rewarding modularity directly did not improve fitness.
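Studies like those above typically quantify network modularity with a partition-based measure such as Newman's Q; the exact metric varies by paper. A minimal illustrative implementation for an undirected, unweighted graph with a given node-to-community partition:

```python
def modularity_q(edges, partition):
    """Newman's modularity Q for an undirected, unweighted graph.
    edges: list of (u, v) pairs; partition: node -> community id.
    Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j)."""
    m = len(edges)
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    q = 0.0
    # Sum over edges: the A_ij * delta term equals (within-community edges) / m
    for u, v in edges:
        if partition[u] == partition[v]:
            q += 1.0 / m
    # Subtract the expected within-community fraction under random wiring
    nodes = list(degree)
    for i in nodes:
        for j in nodes:
            if partition[i] == partition[j]:
                q -= degree[i] * degree[j] / (4.0 * m * m)
    return q
```

For example, two disjoint triangles split into their natural two communities give Q = 0.5, while lumping all six nodes into one community gives Q = 0.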

Evolving scalable and modular adaptive networks with Developmental Symbolic Encoding

A novel developmental encoding for networks is proposed, featuring scalability, modularity, regularity, and hierarchy; it can represent structural regularities of networks and build them from encapsulated and possibly reused subnetworks.

Constraining connectivity to encourage modularity in HyperNEAT

This paper investigates how altering the traditional approach to determining whether connections are expressed in HyperNEAT influences modularity, and provides an important clue to how an indirect encoding of network structure can be encouraged to evolve modularity.

Towards Evolving More Brain-Like Artificial Neural Networks

The combined approach, adaptive ES-HyperNEAT, unifies for the first time in neuroevolution the abilities to indirectly encode connectivity through geometry, generate patterns of heterogeneous plasticity, and simultaneously encode the density and placement of nodes in space.

On the Performance of Indirect Encoding Across the Continuum of Regularity

This paper presents the first comprehensive study showing that phenotypic regularity enables an indirect encoding to outperform direct encoding controls as problem regularity increases, and suggests a path forward that combines indirect encodings with a separate process of refinement.

Guiding Neuroevolution with Structural Objectives

This work proposes two new structural objectives and tests their ability to guide evolving neural networks on two problems which can benefit from decomposition into subtasks and finds that both methods perform well on a problem with a very clear and decomposable structure.
...

References

Showing 1–10 of 28 references

A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks

The main conclusion is that the ability to explore the space of regular connectivity patterns opens up a new class of complex high-dimensional tasks to neuroevolution.

Evolving Neural Networks through Augmenting Topologies

A method is presented, NeuroEvolution of Augmenting Topologies (NEAT), which outperforms the best fixed-topology method on a challenging benchmark reinforcement learning task and shows how it is possible for evolution to both optimize and complexify solutions simultaneously.

Automatic Definition of Modular Neural Networks

F. Gruau, Adaptive Behavior, 1994

An artificial developmental system is presented that is a computationally efficient technique for the automatic generation of complex artificial neural networks (ANNs), along with simulation results showing that the same problem cannot be solved if the mechanism for the automatic definition of subnetworks is suppressed.

A Taxonomy for Artificial Embryogeny

This taxonomy provides a unified context for long-term research in AE, so that implementation decisions can be compared and contrasted along known dimensions in the design space of embryogenic systems, and allows predicting how the settings of various AE parameters affect the capacity to efficiently evolve complex phenotypes.

Compositional pattern producing networks: A novel abstraction of development

Results produced with CPPNs through interactive evolution of two-dimensional images show that such an encoding can nevertheless produce structural motifs often attributed to more conventional developmental abstractions, suggesting that local interaction may not be essential to the desirable properties of natural encoding in the way that is usually assumed.

Evolving coordinated quadruped gaits with the HyperNEAT generative encoding

It is demonstrated that HyperNEAT, a new and promising generative encoding for evolving neural networks, can evolve quadruped gaits without an engineer manually decomposing the problem.

The sensitivity of HyperNEAT to different geometric representations of a problem

The results suggest that HyperNEAT practitioners can obtain good results even if they do not know how to geometrically represent a problem, and that further improvements are possible with a well-chosen geometric representation.

Spontaneous evolution of modularity and network motifs.

N. Kashtan and U. Alon, Proceedings of the National Academy of Sciences, 2005

Light is shed on the evolutionary forces that promote structural simplicity in biological networks, and ways to improve the evolutionary design of engineered systems are offered.

Facilitated Variation: How Evolution Learns from Past Environments To Generalize to New Environments

This work addresses the question of how facilitated variation (FV) spontaneously emerges by means of computer simulations of two well-studied model systems, logic circuits and RNA secondary structure, and finds that the evolution of FV is enhanced in environments that change from time to time in a systematic way.