Growing cell structures--A self-organizing network for unsupervised and supervised learning

@article{Fritzke1994GrowingCS,
  title={Growing cell structures--A self-organizing network for unsupervised and supervised learning},
  author={Bernd Fritzke},
  journal={Neural Networks},
  year={1994},
  volume={7},
  pages={1441--1460}
}
  • Bernd Fritzke
  • Published 1 November 1994
  • Computer Science
  • Neural Networks

Citations

The Evolving Tree—A Novel Self-Organizing Network for Data Analysis

TLDR
A new variant of the Self-Organizing Map called the Evolving Tree is proposed, which combines the advantages of the SOM with a hierarchical tree structure, in particular to avoid the time-consuming search for the best-matching unit in large maps.
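
As a rough illustration of the tree-structured search idea, the sketch below is a minimal, assumption-laden Python mock-up, not the paper's algorithm: it descends a tree of prototype vectors, picking the nearest child at each level, so lookup cost scales with tree depth rather than with the total number of units as in a flat SOM.

```python
# Hypothetical sketch of tree-structured best-matching-unit (BMU) search.
# Class and function names are illustrative assumptions.
import numpy as np

class TreeNode:
    def __init__(self, prototype, children=None):
        self.prototype = np.asarray(prototype, dtype=float)
        self.children = children or []  # an empty list marks a leaf unit

def find_bmu(root, x):
    """Greedy descent: follow the nearest child prototype until a leaf is hit."""
    node = root
    while node.children:
        node = min(node.children, key=lambda c: np.linalg.norm(x - c.prototype))
    return node

# Usage: a root with two leaf children; the input lands in the right leaf.
root = TreeNode([0.5, 0.5], [TreeNode([0.0, 0.0]), TreeNode([1.0, 1.0])])
print(find_bmu(root, np.array([0.9, 0.8])).prototype)  # -> [1. 1.]
```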

Self-Organizing Networks for Nonparametric Regression

TLDR
This paper describes SOM and CTM methods in the general framework of adaptive methods for regression that effectively combine iterative computation and local regularization to achieve robust performance and modeling flexibility.

Dynamic Cell Structures

TLDR
Simulations on a selection of CMU-Benchmarks indicate that the DCS idea applied to the Growing Cell Structure algorithm leads to an efficient and elegant algorithm that can beat conventional models on similar tasks.

Optimal method for growth in dynamic self organizing learning systems

TLDR
Current methods for growing the number of nodes in the Dynamic Cell Structures neural network are described, and a new algorithm is provided that overcomes the observed flaw and enables these learning systems to grow and operate in an optimal and robust manner.

Dynamic Cell Structure Learns Perfectly Topology Preserving Map

TLDR
Simulations on a selection of CMU-Benchmarks indicate that the DCS idea applied to the Growing Cell Structure algorithm leads to an efficient and elegant algorithm that can beat conventional models on similar tasks.

Soft Competitive Learning and Growing Self-Organizing Neural Networks for Pattern Classification

TLDR
An introduction to the KSOM and the neural gas network is given, and some GSONNs without fixed dimensionality, such as the growing neural gas and the author's model, the twin growing neural gas, are discussed together with their application to pattern classification.

A survey of some classic self-organizing maps with incremental learning

TLDR
Competitive learning is introduced, the SOM topology and learning mechanism are illustrated, and some self-organizing maps with incremental learning (SOMIL), such as self-organizing surfaces, evolving self-organizing maps, incremental grid growing, and the growing hierarchical self-organizing map, are outlined, together with a review of new developments in SOMIL.
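
To make the shared mechanism concrete, here is a minimal sketch of one competitive-learning step of a basic SOM, assuming a 1-D grid of units, a Gaussian neighborhood, and illustrative constants; the surveyed variants differ mainly in how the topology and the unit count evolve.

```python
# Minimal SOM competitive-learning step; grid size, learning rate, and
# neighborhood width are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.random((10, 2))   # 10 units on a 1-D grid, 2-D inputs
eta, sigma = 0.1, 1.5           # learning rate and neighborhood width

def som_step(x):
    b = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
    d = np.arange(len(weights)) - b                     # grid distance to the BMU
    h = np.exp(-(d ** 2) / (2 * sigma ** 2))            # Gaussian neighborhood
    weights[:] += eta * h[:, None] * (x - weights)      # pull units toward input

for _ in range(1000):
    som_step(rng.random(2))
```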

A robust energy artificial neuron based incremental self-organizing neural network with a dynamic structure

TLDR
This paper proposes a robust energy artificial neuron based incremental self-organizing neural network with a dynamic structure (REISOD) that can adjust the scale of the network automatically to adapt to the size of the data set and learn new data incrementally while preserving previously learned results.
...

References

Showing 1-10 of 37 references

Kohonen Feature Maps and Growing Cell Structures - a Performance Comparison

TLDR
A performance comparison of two self-organizing networks, the Kohonen Feature Map and the recently proposed Growing Cell Structures, shows that the Growing Cell Structures exhibit significantly better performance by every criterion.

Variants of self-organizing maps

TLDR
Two innovations are discussed: dynamic weighting of the input signals at each input of each cell, which improves the ordering when very different input signals are used, and definition of neighborhoods in the learning algorithm by the minimum spanning tree, which provides a far better and faster approximation of prominently structured density functions.
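
The minimum-spanning-tree neighborhood mentioned above can be sketched in a few lines. The version below is an illustrative assumption (SciPy's MST routine over pairwise unit distances, rebuilt on whatever schedule an implementation chooses), not the paper's exact procedure.

```python
# Grid-free neighborhoods from a minimum spanning tree over the unit weights.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

weights = np.random.default_rng(2).random((8, 2))  # 8 units, 2-D weight vectors

def mst_neighbors(weights):
    """Adjacency lists of the minimum spanning tree over unit distances."""
    mst = minimum_spanning_tree(squareform(pdist(weights))).toarray()
    adj = (mst + mst.T) > 0            # symmetrize the directed tree edges
    return [np.flatnonzero(row) for row in adj]

neighbors = mst_neighbors(weights)
print(neighbors[0])  # neighborhood of unit 0, defined by the tree, not a grid
```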

A Resource-Allocating Network for Function Interpolation

TLDR
A network that allocates a new computational unit whenever an unusual pattern is presented to it, learning much faster than backpropagation networks while using a comparable number of synapses.
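
A hedged sketch of that allocation rule: a unit is added only when the input is both far from all existing units and poorly predicted; otherwise existing parameters get a small update. The thresholds, RBF form, and update step below are illustrative assumptions rather than the paper's exact settings.

```python
# Resource-allocating sketch: grow an RBF model on novel, poorly-fit inputs.
import numpy as np

centers, heights = [], []              # RBF centers and output weights
delta, epsilon, width = 0.5, 0.1, 0.3  # distance/error thresholds, RBF width

def predict(x):
    return sum(h * np.exp(-np.sum((x - c) ** 2) / width ** 2)
               for c, h in zip(centers, heights))

def observe(x, y):
    err = y - predict(x)
    dist = min((np.linalg.norm(x - c) for c in centers), default=np.inf)
    if dist > delta and abs(err) > epsilon:
        centers.append(np.array(x, dtype=float))  # allocate a new unit here
        heights.append(err)                       # it absorbs the residual error
    elif centers:
        # otherwise nudge the nearest unit's height by a small gradient step
        i = int(np.argmin([np.linalg.norm(x - c) for c in centers]))
        heights[i] += 0.05 * err

for x, y in [(0.0, 0.0), (1.0, 1.0), (0.5, 0.8)]:
    observe(np.array([x]), y)
print(len(centers), "units allocated")
```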

How patterned neural connections can be set up by self-organization

TLDR
Without needing to make any elaborate assumptions about its structure or about the operations its elements are to carry out, it is shown that the mappings are set up in a system-to-system rather than a cell-to-cell fashion.

The Cascade-Correlation Learning Architecture

TLDR
The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.

Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks

TLDR
A theory is reported that shows the equivalence between regularization and a class of three-layer networks called regularization networks or hyper basis functions.

Metamorphosis Networks: An Alternative to Constructive Models

TLDR
The architecture investigated is composed of RBF units on a lattice, which imposes flexible constraints on the parameters of the network; virtues of this approach include variable subset selection, robust parameter selection, multiresolution processing, and interpolation of sparse training data.

Improving the Learning Speed in Topological Maps of Patterns

TLDR
A method of improving the learning speed, by starting the map with very few units and increasing that number progressively until the map reaches its final size, which dramatically reduces the time needed for the “unfolding” phase and also yields some improvements in the asymptotic convergence phase.
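
The progressive-growth idea pairs naturally with Fritzke's error-driven insertion. The sketch below is a simplified, assumption-laden variant: Growing Cell Structures inserts between the highest-error unit and its farthest direct topological neighbor, but lacking a topology here the farthest unit overall is used, and all constants are illustrative.

```python
# Start with very few units; alternate brief training with unit insertion.
import numpy as np

rng = np.random.default_rng(1)
units = [rng.random(2) for _ in range(2)]    # begin with only two units
errors = [0.0, 0.0]                          # accumulated quantization error

def train_epoch(data, eta=0.2):
    for x in data:
        b = int(np.argmin([np.linalg.norm(x - u) for u in units]))
        errors[b] += np.linalg.norm(x - units[b]) ** 2   # charge the winner
        units[b] += eta * (x - units[b])                 # move it toward x

def grow():
    q = int(np.argmax(errors))                           # highest-error unit
    f = int(np.argmax([np.linalg.norm(units[q] - u) for u in units]))
    units.append((units[q] + units[f]) / 2)              # insert between them
    errors[q] = errors[f] = 0.0
    errors.append(0.0)

data = rng.random((200, 2))
for _ in range(5):
    train_epoch(data)
    grow()
print(len(units), "units after growth")
```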

Adding Learned Expectation Into the Learning Procedure of Self-Organizing Maps

  • L. Xu
  • Computer Science
    Int. J. Neural Syst.
  • 1990
The self-organizing topological map is generalized by adding a learned expectation to its learning procedure, in order to improve its stability in nonstationary environments with unexpected inputs.

Competitive Hebbian Learning Rule Forms Perfectly Topology Preserving Maps

The problem of forming perfectly topology preserving maps of feature manifolds is studied. First, through introducing “masked Voronoi polyhedra” as a geometrical construct for determining ...