A novel generative encoding for evolving modular, regular and scalable networks

@inproceedings{Suchorzewski2011ANG,
  title={A novel generative encoding for evolving modular, regular and scalable networks},
  author={Marcin Suchorzewski and Jeff Clune},
  booktitle={GECCO '11},
  year={2011}
}
In this paper we introduce the Developmental Symbolic Encoding (DSE), a new generative encoding for evolving networks (e.g., neural or Boolean). DSE combines elements of two powerful generative encodings, Cellular Encoding and HyperNEAT, in order to evolve networks that are modular, regular, scale-free, and scalable. Generating networks with these properties is important because they can enhance performance and evolvability. We test DSE's ability to generate scale-free and modular networks by…
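The core idea of a developmental (generative) encoding can be illustrated with a toy sketch. This is not the paper's actual DSE grammar — the operations `GROW`, `DUP`, and `BRIDGE` below are invented for illustration — but it shows the key property: a short symbolic genotype is executed to grow a much larger graph, and because operations like duplication reuse existing structure, the resulting phenotype is modular and regular.

```python
def develop(genotype):
    """Execute a list of symbolic ops to grow a graph (nodes, edges).

    Toy analogue of a developmental encoding: the genotype is compact,
    but duplication ops produce repeated (regular, modular) structure.
    """
    nodes = [0]          # start from a single node
    edges = set()
    nxt = 1              # next fresh node id
    for op in genotype:
        if op == "GROW":
            # Append a new node wired to the most recently added node.
            edges.add((nodes[-1], nxt))
            nodes.append(nxt)
            nxt += 1
        elif op == "DUP":
            # Duplicate the entire current graph as a second, disconnected
            # module -- structural reuse yields regularity "for free".
            offset = nxt
            nodes += [n + offset for n in nodes]
            edges |= {(a + offset, b + offset) for (a, b) in edges}
            nxt = max(nodes) + 1
        elif op == "BRIDGE":
            # Add a single link between the two halves (hierarchy).
            edges.add((nodes[0], nodes[len(nodes) // 2]))
    return nodes, edges

# A 4-symbol genotype develops into a 6-node, 5-edge phenotype:
# two identical 3-node chains joined by one bridge edge.
nodes, edges = develop(["GROW", "GROW", "DUP", "BRIDGE"])
```

Note how the phenotype's size and regularity come from the developmental process rather than being spelled out gene-by-gene, which is the property the abstract attributes to generative encodings in general.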

Evolving neural networks that are both modular and regular: HyperNEAT plus the connection cost technique

It is shown that adding the connection cost technique to HyperNEAT produces neural networks that are significantly more modular, regular, and higher performing than HyperNEAT without a connection cost, even when compared to a variant of HyperNEAT that was specifically designed to encourage modularity.

Critical factors in the performance of HyperNEAT

The hypothesis that fracture in the problem space, known to be challenging for regular NEAT, is even more so for HyperNEAT is examined; the results suggest that quite complex networks are needed to cope with fracture and that HyperNEAT can have difficulty discovering them.

Evolving programs to build artificial neural networks

This chapter evolves a pair of programs that build the network: one runs inside neurons and allows them to move, change, die, or replicate; the other runs inside dendrites and allows them to change length and weight, be removed, or replicate.

Modularity in NEAT Reinforcement Learning Networks

It was shown that the ideal level of network modularity in the explored parameter space is highly dependent on other network variables, dispelling theories that modularity has a straightforward relationship to network performance and demonstrating that rewarding modularity directly did not improve fitness.

IMPROBED: Multiple Problem-Solving Brain via Evolved Developmental Programs

  • J. Miller
  • Biology, Computer Science
    Artificial Life
  • 2022
A simple neural model, called IMPROBED, is described, in which two neural programs construct an artificial brain that can simultaneously solve multiple computational problems.

Evolving Developmental Programs That Build Neural Networks for Solving Multiple Problems

This model appears to be the first attempt to solve multiple standard classification problems using a developmental approach and can generate artificial neural networks that perform reasonably well on all three benchmark problems simultaneously.

Evolving multimodal behavior through modular multiobjective neuroevolution

This dissertation expands on existing neuroevolution methods, specifically NEAT (Neuro-Evolution of Augmenting Topologies [7]), to make the discovery of multiple modes of behavior possible and proposes four extensions: (1) multiobjective evolution, (2) sensors that are split up according to context, (3) modular neural network structures, and (4) fitness-based shaping.

Chapter 1: Artificial Neurogenesis: An Introduction and Selective Review

The hypothesis that adaptive growth is a means of producing brain-like machines is explored and a strong synergy, sometimes interchangeability, between developmental and epigenetic processes is shown—a topic that has remained largely under-explored in the literature.

References

Showing 1–10 of 21 references

Evolving scalable and modular adaptive networks with Developmental Symbolic Encoding

A novel developmental encoding for networks, featuring scalability, modularity, regularity, and hierarchy, is proposed; it can represent structural regularities of networks and build them from encapsulated and possibly reused subnetworks.

On the Performance of Indirect Encoding Across the Continuum of Regularity

This paper presents the first comprehensive study showing that phenotypic regularity enables an indirect encoding to outperform direct encoding controls as problem regularity increases, and suggests a path forward that combines indirect encodings with a separate process of refinement.

Spontaneous evolution of modularity and network motifs.

  • N. Kashtan, U. Alon
  • Biology
    Proceedings of the National Academy of Sciences of the United States of America
  • 2005
Light is shed on the evolutionary forces that promote structural simplicity in biological networks and ways to improve the evolutionary design of engineered systems are offered.

A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks

The main conclusion is that the ability to explore the space of regular connectivity patterns opens up a new class of complex high-dimensional tasks to neuroevolution.
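The hypercube-based encoding referenced here (HyperNEAT) can be sketched in miniature: connection weights for a grid "substrate" of neurons are produced by a single function of the endpoints' coordinates. In real HyperNEAT that function is an evolved CPPN; in the sketch below a fixed symmetric function stands in for it, so any regularity of the function (here, left/right symmetry) appears directly in the weight pattern.

```python
import math

def cppn(x1, y1, x2, y2):
    """Stand-in for an evolved CPPN: maps endpoint coordinates to a weight.

    cos(x1 * x2) is even in x, so the generated connectivity pattern
    inherits left/right mirror symmetry across the substrate.
    """
    return math.cos(x1 * x2) * math.exp(-((y2 - y1) ** 2))

def build_weights(n, threshold=0.2):
    """Query the function for every pair of grid points in [-1, 1]^2.

    Connections whose |weight| falls below the threshold are not
    expressed, as in HyperNEAT's standard expression rule.
    """
    coords = [i / (n - 1) * 2 - 1 for i in range(n)]
    weights = {}
    for x1 in coords:
        for y1 in coords:
            for x2 in coords:
                for y2 in coords:
                    w = cppn(x1, y1, x2, y2)
                    if abs(w) > threshold:
                        weights[(x1, y1), (x2, y2)] = w
    return weights

# A 3x3 substrate; the symmetry of the generating function shows up in
# the phenotype, which is the sense in which the encoding explores the
# space of *regular* connectivity patterns.
w = build_weights(3)
```

Because the genotype is a function rather than a weight list, the same genotype can be re-queried on a denser grid, which is one way indirect encodings scale to large networks.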

Investigating whether HyperNEAT produces modular neural networks

The first documented case of HyperNEAT producing a modular phenotype is presented, but the inability to encourage modularity on harder problems where modularity would have been beneficial suggests that more work is needed to increase the likelihood that HyperNEAT and similar algorithms produce modular ANNs in response to challenging, decomposable problems.

Constraining connectivity to encourage modularity in HyperNEAT

This paper investigates how altering the traditional approach to determining whether connections are expressed in HyperNEAT influences modularity, and provides an important clue to how an indirect encoding of network structure can be encouraged to evolve modularity.

A Taxonomy for Artificial Embryogeny

This taxonomy provides a unified context for long-term research in AE, so that implementation decisions can be compared and contrasted along known dimensions in the design space of embryogenic systems, and allows predicting how the settings of various AE parameters affect the capacity to efficiently evolve complex phenotypes.

Creating High-Level Components with a Generative Representation for Body-Brain Evolution

Applying GENRE to the task of evolving robots for locomotion and comparing it against a non-generative (direct) representation shows that the generative representation system rapidly produces robots with significantly greater fitness.

Evolving Neural Networks for Visual Processing

Positive results were achieved for non-trivial tasks, and some important characteristics of HyperNEAT not previously reported were discovered: a bias towards generating weight patterns aligned with the topographical arrangement of neurons within the layers of the substrate network; the ability to evolve solutions more quickly with larger substrate networks; and a possible difficulty co-evolving weight patterns for substrate networks containing more than one hidden layer.

Generative representations for the automated design of modular physical robots

This work demonstrates an automatic design system that produces complex robots by exploiting the principles of regularity, modularity, hierarchy, and reuse, and demonstrates for the first time the evolution and construction of modular, three-dimensional, physically locomoting robots.