# Growing cell structures--A self-organizing network for unsupervised and supervised learning

```bibtex
@article{Fritzke1994GrowingCS,
  title={Growing cell structures--A self-organizing network for unsupervised and supervised learning},
  author={Bernd Fritzke},
  journal={Neural Networks},
  year={1994},
  volume={7},
  pages={1441-1460}
}
```

## 1,377 Citations

### Growing Hierarchical Tree SOM: An unsupervised neural network with dynamic topology

- Computer Science, Neural Networks
- 2006

### The Evolving Tree—A Novel Self-Organizing Network for Data Analysis

- Computer Science, Neural Processing Letters
- 2004

A new variant of the Self-Organizing Map, called the Evolving Tree, is proposed; it aims to retain the advantages of the SOM while addressing the time-consuming search for the best-matching unit in large maps.

### Self-Organizing Networks for Nonparametric Regression

- Computer Science
- 1994

This paper describes SOM and CTM methods in the general framework of adaptive methods for regression that effectively combine iterative computation and local regularization to achieve robust performance and modeling flexibility.

### Dynamic Cell Structures

- Computer Science, NIPS
- 1994

Simulations on a selection of CMU-Benchmarks indicate that the DCS idea applied to the Growing Cell Structure algorithm leads to an efficient and elegant algorithm that can beat conventional models on similar tasks.

### A parallel growing architecture for self-organizing maps with unsupervised learning

- Computer Science, Neurocomputing
- 2005

### Optimal method for growth in dynamic self organizing learning systems

- Computer Science, The 2010 International Joint Conference on Neural Networks (IJCNN)
- 2010

Current methods for growing the number of nodes in the Dynamic Cell Structures neural network are described, and a new algorithm is provided that overcomes an observed flaw and enables these learning systems to grow and operate in an optimal and robust manner.

### Dynamic Cell Structure Learns Perfectly Topology Preserving Map

- Computer Science, Neural Computation
- 1995

Simulations on a selection of CMU-Benchmarks indicate that the DCS idea applied to the growing cell structure algorithm leads to an efficient and elegant algorithm that can beat conventional models on similar tasks.

### Soft Competitive Learning and Growing Self-Organizing Neural Networks for Pattern Classification

- Computer Science, 2006 Eighth International Symposium on Symbolic and Numeric Algorithms for Scientific Computing
- 2006

An introduction to the Kohonen SOM and the neural gas network, some growing self-organizing neural networks (GSONN) without fixed dimensionality such as growing neural gas, and the author's model, the twin growing neural gas, together with its application to pattern classification, are discussed.

### A survey of some classic self-organizing maps with incremental learning

- Computer Science, 2010 2nd International Conference on Signal Processing Systems
- 2010

Competitive learning is introduced, the SOM topology and learning mechanism are illustrated, new developments in SOMIL are reviewed, and several self-organizing maps with incremental learning (SOMIL), such as self-organizing surfaces, evolving self-organizing maps, incremental grid growing and the growing hierarchical self-organizing map, are outlined.

### A robust energy artificial neuron based incremental self-organizing neural network with a dynamic structure

- Computer Science
- 2017

This paper proposes a robust energy artificial neuron based incremental self-organizing neural network with a dynamic structure (REISOD) that can automatically adjust the scale of the network to match the size of the data set and learn new data incrementally while preserving previously learnt results.

## References

Showing 1-10 of 37 references

### Kohonen Feature Maps and Growing Cell Structures - a Performance Comparison

- Business, NIPS
- 1992

A performance comparison of two self-organizing networks, the Kohonen Feature Map and the recently proposed Growing Cell Structures, shows that the Growing Cell Structures exhibit significantly better performance by every criterion.

### Variants of self-organizing maps

- Computer Science, International 1989 Joint Conference on Neural Networks
- 1989

Two innovations are discussed: dynamic weighting of the input signals at each input of each cell, which improves the ordering when very different input signals are used, and definition of neighborhoods in the learning algorithm by the minimum spanning tree, which provides a far better and faster approximation of prominently structured density functions.

### A Resource-Allocating Network for Function Interpolation

- Computer Science, Neural Computation
- 1991

A network is described that allocates a new computational unit whenever an unusual pattern is presented to it; the network learns much faster than backpropagation networks and uses a comparable number of synapses.

### How patterned neural connections can be set up by self-organization

- Biology, Proceedings of the Royal Society of London. Series B. Biological Sciences
- 1976

Without needing to make any elaborate assumptions about its structure or about the operations its elements are to carry out, it is shown that the mappings are set up in a system-to-system rather than a cell-to-cell fashion.

### The Cascade-Correlation Learning Architecture

- Computer Science, NIPS
- 1989

The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.

### Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks

- Computer Science, Science
- 1990

A theory is reported that shows the equivalence between regularization and a class of three-layer networks called regularization networks or hyper basis functions.

### Metamorphosis Networks: An Alternative to Constructive Models

- Computer Science, NIPS
- 1992

The architecture investigated is composed of RBF units on a lattice, which imposes flexible constraints on the parameters of the network; virtues of this approach include variable subset selection, robust parameter selection, multiresolution processing, and interpolation of sparse training data.

### Improving the Learning Speed in Topological Maps of Patterns

- Computer Science
- 1990

A method of improving the learning speed is proposed: the map is started with very few units, and that number is increased progressively until the map reaches its final size. This dramatically reduces the time needed for the "unfolding" phase and also yields some improvements in the asymptotic convergence phase.

### Adding Learned Expectation Into the Learning Procedure of Self-Organizing Maps

- Computer Science, Int. J. Neural Syst.
- 1990

The self-organizing topological map is generalized by adding a learned expectation to its learning procedure, in order to improve its stability in nonstationary environments with unexpected inputs…

### Competitive Hebbian Learning Rule Forms Perfectly Topology Preserving Maps

- Mathematics
- 1993

The problem of forming perfectly topology preserving maps of feature manifolds is studied. First, through introducing “masked Voronoi polyhedra” as a geometrical construct for determining…