A Self-Organizing Network that Can Follow Non-stationary Distributions

@inproceedings{Fritzke1997ASN,
  title={A Self-Organizing Network that Can Follow Non-stationary Distributions},
  author={Bernd Fritzke},
  booktitle={ICANN},
  year={1997}
}
A new on-line criterion for identifying “useless” neurons of a self-organizing network is proposed. [...] Key method: slow changes of the distribution are handled by adaptation of existing units; rapid changes are handled by removal of “useless” neurons and subsequent insertion of new units in other places.
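
As a rough illustration of that removal criterion, the sketch below tracks, for each unit, its accumulated quantization error and a utility value (the extra error the network would suffer if the unit were deleted), and removes the least useful unit once the largest accumulated error exceeds a multiple k of the smallest utility. This is a minimal sketch under assumed constants (learning rate, decay factor, threshold k), not the paper's exact pseudocode.

import numpy as np

class Unit:
    def __init__(self, w):
        self.w = np.asarray(w, dtype=float)  # reference vector
        self.error = 0.0    # accumulated squared quantization error
        self.utility = 0.0  # extra error the network would suffer without this unit

def present_input(units, x, k=3.0):
    # One on-line step; assumes at least two units. All constants are illustrative.
    x = np.asarray(x, dtype=float)
    d = [float(np.sum((u.w - x) ** 2)) for u in units]
    s1, s2 = np.argsort(d)[:2]  # winner and runner-up

    units[s1].error += d[s1]
    # Utility: how much error the runner-up would add if it had to cover x.
    units[s1].utility += d[s2] - d[s1]

    # Slow changes of the distribution: adapt the winner toward the input.
    units[s1].w += 0.05 * (x - units[s1].w)

    # Rapid changes: delete the least useful unit when the largest
    # accumulated error dwarfs its utility.
    i_min = min(range(len(units)), key=lambda i: units[i].utility)
    if len(units) > 2 and max(u.error for u in units) > k * units[i_min].utility:
        del units[i_min]

    # Exponential decay keeps both statistics on-line.
    for u in units:
        u.error *= 0.995
        u.utility *= 0.995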
A constructive and hierarchical self-organizing model in a non-stationary environment
  • Chihli Hung, S. Wermter
  • Computer Science
  • Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
  • 2005
TLDR
The dynamic adaptive self-organizing hybrid (DASH) model is compared with the growing neural gas (GNG) model under several different initial thresholds to test their feasibility; the results show that the DASH model is more stable and practicable for document clustering in a non-stationary environment.
Dynamic self-organising map
TLDR
A variation of the self-organising map algorithm where the original time-dependent (learning rate and neighbourhood) learning function is replaced by a time-invariant one that allows for on-line and continuous learning on both static and dynamic data distributions.
A self-organized growing network for on-line unsupervised learning
  • S. Furao, O. Hasegawa
  • Computer Science
  • 2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541)
  • 2004
TLDR
The design of a two-layer neural network makes it possible for this system to represent the topological structure of unsupervised on-line data, report a reasonable number of clusters, and give typical prototype patterns of every cluster without any a priori conditions such as a suitable number of nodes or a good initial codebook.
Growing self-organizing networks—history, status quo, and perspectives
TLDR
The chapter discusses how the original growing neural gas (GNG) method has been enhanced, for example by keeping a “strength” parameter on every edge that is created or refreshed each time an input signal connects a winner to a second winner.
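
In the original GNG this per-edge parameter is an age: the edge between winner and runner-up is created or refreshed on each input, all other edges at the winner age, and stale edges are dropped. A minimal sketch, with assumed names and an assumed max_age constant:

def refresh_edges(edges, s1, s2, max_age=50):
    # edges: dict mapping a (node, node) pair to an age counter.
    key = (min(s1, s2), max(s1, s2))
    edges[key] = 0  # create or refresh the winner/runner-up edge
    # Age every other edge incident to the winner; drop stale ones.
    for (a, b) in list(edges):
        if (a, b) != key and s1 in (a, b):
            edges[(a, b)] += 1
            if edges[(a, b)] > max_age:
                del edges[(a, b)]
    return edges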
A self-organising network that grows when required
TLDR
A way in which the learning algorithm can add nodes whenever the network in its current state does not sufficiently match the input is suggested, so that the network grows very quickly when new data is presented, but stops growing once the network has matched the data.
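
A rough sketch of such a match-based growth test (the activity measure, threshold, and node-placement rule are assumptions for illustration, not the paper's exact algorithm):

import numpy as np

def maybe_grow(weights, x, activity_threshold=0.8):
    # weights: list of reference vectors; x: current input vector.
    dists = [np.linalg.norm(w - x) for w in weights]
    best = int(np.argmin(dists))
    activity = np.exp(-dists[best])  # 1.0 for a perfect match

    if activity < activity_threshold:
        # Network does not sufficiently match the input:
        # place a new node halfway between the input and the winner.
        weights.append((weights[best] + x) / 2.0)
    else:
        # Otherwise just adapt the winner; no growth needed.
        weights[best] += 0.1 * (x - weights[best])
    return weights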
The plastic self organising map
  • R. Lang, K. Warwick
  • Computer Science
  • Proceedings of the 2002 International Joint Conference on Neural Networks. IJCNN'02 (Cat. No.02CH37290)
  • 2002
TLDR
A novel extension to Kohonen's self-organising map, called the plastic self organising map (PSOM), is presented, which uses a graph structure to represent data and can add or remove neurons to learn dynamic nonstationary pattern sets.
A self-structurizing neural network for online incremental learning
TLDR
The design of a two-layer neural network makes it possible for this system to represent the topological structure of unsupervised on-line data, report a reasonable number of clusters, and give typical prototype patterns of every cluster without any a priori conditions such as a suitable number of nodes or a good initial codebook.
Auto-adaptive and Dynamical Clustering Neural Network
TLDR
A new algorithm designed with specific properties for the dynamical modeling of classes, called AUDyC (Auto-adaptive and Dynamical Clustering), is based on an unsupervised neural network with full auto-adaptive abilities, obtained using Gaussian prototypes.
Branching competitive learning network: A novel self-creating model
TLDR
The BCL model is shown to appropriately estimate the number of clusters in a data distribution and to adapt to nonstationary data inputs; moreover, a scheme leading to multiresolution data clustering is presented.
Self-Adjusting Feature Maps Network
TLDR
This study exploits SAM to handle some peculiar cases that cannot be well dealt with by classical unsupervised learning networks such as the self-organizing feature map (SOM) network.

References

A Growing Neural Gas Network Learns Topologies
An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule. In contrast to [...]
Growing cell structures--A self-organizing network for unsupervised and supervised learning
TLDR
A new self-organizing neural network model with two variants is presented, which performs unsupervised learning and can be used for data visualization, clustering, and vector quantization; results on the two-spirals benchmark and a vowel classification problem are better than any results previously published.
Growing a hypercubical output space in a self-organizing feature map
TLDR
A growth algorithm, called the GSOM or growing self-organizing map, is presented, which enhances a widespread self-organization process, Kohonen's self-organizing feature map (SOFM), by an adaptation of the output space grid during learning.
'Neural-gas' network for vector quantization and its application to time-series prediction
TLDR
It is shown that the dynamics of the reference (weight) vectors during the input-driven adaptation procedure are determined by the gradient of an energy function whose shape can be modulated through a neighborhood-determining parameter, and resemble the dynamics of Brownian particles moving in a potential determined by the data point density.
Competitive Hebbian Learning Rule Forms Perfectly Topology Preserving Maps
The problem of forming perfectly topology preserving maps of feature manifolds is studied. First, through introducing “masked Voronoi polyhedra” as a geometrical construct for determining [...]
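
A competitive Hebbian learning step of the kind analyzed here reduces to connecting, for every input, the nearest and the second-nearest center by an edge; over many inputs the edge set approximates the topology of the data manifold. A minimal sketch assuming Euclidean distances (the paper's “masked Voronoi polyhedra” analysis is not reproduced):

import numpy as np

def chl_edges(centers, inputs):
    # centers: (n, d) array with n >= 2; inputs: iterable of d-vectors.
    edges = set()
    for x in inputs:
        d = np.sum((centers - np.asarray(x, dtype=float)) ** 2, axis=1)
        s1, s2 = np.argsort(d)[:2]
        edges.add((min(s1, s2), max(s1, s2)))
    return edges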