Incremental grid growing: encoding high-dimensional structure into a two-dimensional feature map

@inproceedings{Blackmore1993IncrementalGG,
  title={Incremental grid growing: encoding high-dimensional structure into a two-dimensional feature map},
  author={Justine Blackmore and Risto Miikkulainen},
  booktitle={IEEE International Conference on Neural Networks},
  year={1993},
  pages={450--455 vol.~1}
}
Ordinary feature maps with a fully connected, fixed grid topology cannot properly reflect the structure of clusters in the input space. Incremental feature map algorithms, where nodes and connections are added to or deleted from the map according to the input distribution, can overcome this problem. Such algorithms have been limited to maps that can be drawn in 2-D only in the case of two-dimensional input space. In the proposed approach, nodes are added incrementally to a regular two-dimensional…
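The abstract's core idea can be illustrated with a minimal sketch, not the authors' reference implementation: train a small map on high-dimensional data, then repeatedly grow new nodes into open grid positions next to the boundary unit with the highest accumulated quantization error. All function and parameter names here (`igg_sketch`, `grow_steps`, the single fixed learning rate) are illustrative assumptions; the paper's actual algorithm also manages connections between nodes.

```python
# Simplified sketch of incremental grid growing: units live on a 2-D
# integer grid, and new units are spawned next to the worst-fitting one.
import numpy as np

rng = np.random.default_rng(0)

def igg_sketch(data, epochs=20, grow_steps=5, lr=0.3):
    dim = data.shape[1]
    # map: dict from 2-D grid coordinate -> weight vector (start with 2x2)
    grid = {(0, 0): rng.random(dim), (0, 1): rng.random(dim),
            (1, 0): rng.random(dim), (1, 1): rng.random(dim)}
    for _ in range(grow_steps):
        error = {pos: 0.0 for pos in grid}
        for _ in range(epochs):
            for x in data:
                # best-matching unit by Euclidean distance
                bmu = min(grid, key=lambda p: np.linalg.norm(x - grid[p]))
                error[bmu] += np.linalg.norm(x - grid[bmu])
                # adapt the BMU and its 4-connected grid neighbours
                for p, w in grid.items():
                    if abs(p[0] - bmu[0]) + abs(p[1] - bmu[1]) <= 1:
                        grid[p] = w + lr * (x - w)
        # grow: claim one open grid position adjacent to the worst unit
        worst = max(error, key=error.get)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            new = (worst[0] + dr, worst[1] + dc)
            if new not in grid:
                grid[new] = grid[worst].copy()  # initialise from parent
                break
    return grid

m = igg_sketch(rng.random((50, 3)))
print(len(m))  # grows beyond the initial 4 units
```

Note that the map stays embedded in a regular 2-D lattice throughout, which is what lets the grown structure be drawn directly even though the inputs are high-dimensional.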

Citations

Adaptive Hierarchical Incremental Grid Growing: An architecture for high-dimensional data visualization

TLDR
A novel neural network model with a highly adaptive, hierarchically structured architecture, the adaptive hierarchical incremental grid growing, which captures the unknown data topology in terms of hierarchical relationships and cluster structures in a highly accurate way.

Ranked centroid projection: a data visualization approach for self-organizing maps

  • G. Yen, Zheng Wu
  • Computer Science
    Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
  • 2005
TLDR
This paper presents an intuitive and effective SOM projection method with comparatively low computational complexity for the purpose of cluster visualization that maps data vectors on the output space based on their responses to different prototype vectors.

MIGSOM: Multilevel Interior Growing Self-Organizing Maps for High Dimensional Data Clustering

TLDR
A new dynamic SOM called MIGSOM: Multilevel Interior Growing SOMs for high-dimensional data clustering, which has the capability of growing the map size from the boundaries as well as the interior of the network in order to represent more faithfully the structure present in a data collection.

A Growing Hierarchical Approach to Batch Linear Manifold Topographic Map Formation

TLDR
A growing hierarchical structure is proposed for LMTM to remove its limitations and to take advantage of the possible hierarchical nature of the datasets.

AMSOM: Adaptive Moving Self-organizing Map for Clustering and Visualization

TLDR
A variant of SOM is proposed called AMSOM (Adaptive Moving Self-Organizing Map) that creates a more flexible structure where neuron positions are dynamically altered during training and tackles the drawback of having a predefined grid by allowing neuron addition and/or removal during training.

Cluster Connections: A visualization technique to reveal cluster boundaries in self-organizing maps

TLDR
This paper suggests an extension to the standard map representation that leads to an easy recognition of cluster boundaries and allows intuitive analysis of the similarities inherent in the input data without the need for substantial prior knowledge.

Uncovering the Hierarchical Structure of Text Archives by Using an Unsupervised Neural Network with Adaptive Architecture

TLDR
The Growing Hierarchical Self-Organizing Map (GH-SOM), a neural network model based on the self-organizing map, which has the capability of growing both in terms of map size as well as in a three-dimensional tree-structure in order to represent the hierarchical structure present in a data collection.

Enhancing Visual Clustering Using Adaptive Moving Self-Organizing Maps (AMSOM)

TLDR
A variant of SOM is presented called AMSOM (Adaptive Moving Self-Organizing Map) that creates a more flexible structure where neuron positions are dynamically altered during training, and tackles the drawback of having a predefined grid by allowing neuron addition and/or removal during training.

A self-organising network that grows when required

