Corpus ID: 52945639

Positional Cartesian Genetic Programming

@article{Wilson2018PositionalCG,
  title={Positional Cartesian Genetic Programming},
  author={Dennis G. Wilson and Julian Francis Miller and Sylvain Cussat-Blanc and Herv{\'e} Luga},
  journal={ArXiv},
  year={2018},
  volume={abs/1810.04119}
}
Cartesian Genetic Programming (CGP) has many modifications across a variety of implementations, such as recursive connections and node weights. Alternative genetic operators have also been proposed for CGP, but have not been fully studied. In this work, we present a new form of genetic programming based on a floating point representation. In this new form of CGP, called Positional CGP, node positions are evolved. This allows for the evaluation of many different genetic operators while allowing… 
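The abstract describes node positions as evolved floating-point values but does not spell out the encoding, so the sketch below is a minimal, hypothetical Python illustration of such a genome. The node tuple layout, the negative positions used for program inputs, and the way connection genes are scaled onto earlier positions are assumptions for illustration only, not the paper's exact scheme.

import random

# Hypothetical sketch of a positional genome (illustrative, not the paper's exact scheme):
# every node is (position, function gene, connection gene 1, connection gene 2),
# and node positions themselves are evolved floating-point values.
FUNCTIONS = [
    lambda a, b: a + b,
    lambda a, b: a - b,
    lambda a, b: a * b,
    lambda a, b: a if abs(b) < 1e-9 else a / b,  # protected division
]

def random_genome(n_nodes):
    return [(random.random(),                   # evolved node position in [0, 1)
             random.randrange(len(FUNCTIONS)),  # function gene
             random.random(),                   # connection gene 1
             random.random())                   # connection gene 2
            for _ in range(n_nodes)]

def evaluate(genome, inputs):
    # Program inputs sit at negative positions so every node can reach them.
    values = [(-(i + 1), x) for i, x in enumerate(inputs)]
    for pos, f_idx, c1, c2 in sorted(genome):   # feed-forward order by position
        def resolve(c):
            # A connection gene is scaled onto the span of strictly earlier
            # positions and resolved to the nearest node or input there,
            # which keeps the decoded graph acyclic.
            earlier = [(p, v) for p, v in values if p < pos]
            lo = min(p for p, _ in earlier)
            target = lo + c * (pos - lo)
            return min(earlier, key=lambda pv: abs(pv[0] - target))[1]
        values.append((pos, FUNCTIONS[f_idx](resolve(c1), resolve(c2))))
    return values[-1][1]                        # highest-position node is the output

genome = random_genome(10)
print(evaluate(genome, [0.5, -1.0]))

In this sketch, connections always resolve to strictly earlier positions, so the decoded graph remains acyclic no matter how positions mutate; this is the property that lets position-based operators such as crossover act on the genome without any repair step.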

Citations

Unimodal optimization using a genetic-programming-based method with periodic boundary conditions
This article describes a new genetic-programming-based optimization method using a multi-gene approach along with a niching strategy and periodic domain constraints.
A New Deterministic Technique for Symbolic Regression
A new deterministic method for Symbolic Regression that finds mathematical expressions from a dataset in very low computational time and, unlike techniques such as GSGP, returns expressions that the user can easily analyse.

References

Showing 1–10 of 24 references
A Survey of Self Modifying Cartesian Genetic Programming
The results of using SMCGP on a variety of different problems are discussed, and it is demonstrated how it is able to solve tasks that require scalability and plasticity.
Redundancy and computational efficiency in Cartesian genetic programming
The results demonstrate the role of mutation and genotype length in the evolvability of the graph-based Cartesian genetic programming system and find that the most evolvable representations occur when the genotype is extremely large and over 95% of the genes are inactive.
Recurrent Cartesian Genetic Programming
Recurrent Cartesian Genetic Programming (RCGP) is formally introduced and it is found that RCGP significantly outperforms CGP on two partially observable tasks: artificial ant and sunspot prediction.
A Comparative Study on Crossover in Cartesian Genetic Programming
Results show that a crossover operator can outperform the standard (1 + λ) strategy on a limited number of tasks, but the question of finding a universal crossover operator in CGP remains open.
A new crossover technique for Cartesian genetic programming
It is shown that with the new crossover technique, convergence is faster than when using mutation alone in the Cartesian Genetic Programming method.
Differentiable Genetic Programming
On several problems of increasing complexity, differentiable Cartesian Genetic Programming is found to recover the exact form of the symbolic expression as well as its constant values.
Genetic Programming and Autoconstructive Evolution with the Push Programming Language
This article describes Push and illustrates some of the opportunities it presents for evolutionary computation; two evolutionary computation systems, PushGP and Pushpop, are described in detail.
Gene Regulatory Network Evolution Through Augmenting Topologies
This algorithm, inspired by the successful use of the NeuroEvolution of Augmenting Topologies algorithm in evolving neural networks and compositional pattern-producing networks, is based on a specific initialization method, a crossover operator based on gene alignment, and speciation based on GRN structures.
Evolution of Graph-Like Programs with Parallel Distributed Genetic Programming
The paper presents the representations, the operators and the interpreters used in PDGP, and describes experiments in which PDGP has been compared to standard GP.
Evolving Neural Networks through Augmenting Topologies
A method, NeuroEvolution of Augmenting Topologies (NEAT), is presented that outperforms the best fixed-topology method on a challenging benchmark reinforcement learning task and shows how evolution can both optimize and complexify solutions simultaneously.