Bernd Fritzke

An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule. In contrast to previous approaches like the "neural gas" method of Martinetz and Schulten (1991, 1994), this model has no parameters which change over time and is able to continue…
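The Hebb-like rule mentioned in this abstract is easy to illustrate. Below is a minimal sketch of competitive Hebbian learning: for each input, the two nearest units are connected by an edge, and the winner's other edges are aged and eventually removed. The fixed unit positions and the MAX_AGE threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
units = rng.random((20, 2))          # fixed reference vectors in R^2
edges = {}                           # (i, j) with i < j -> age
MAX_AGE = 50                         # assumed aging threshold

def adapt(x):
    d = np.linalg.norm(units - x, axis=1)
    s1, s2 = np.argsort(d)[:2]       # two nearest units
    key = (min(s1, s2), max(s1, s2))
    edges[key] = 0                   # create/refresh the Hebb-like edge
    for (i, j) in list(edges):       # age the winner's other edges
        if s1 in (i, j) and (i, j) != key:
            edges[(i, j)] += 1
            if edges[(i, j)] > MAX_AGE:
                del edges[(i, j)]

for _ in range(1000):                # present inputs drawn from the data
    adapt(rng.random(2))
print(f"{len(edges)} topology edges learned")
```

The resulting edge set approximates the topology of the input distribution, which is the core idea behind the model's dispensing with time-varying parameters.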
We present a new self-organizing neural network model that has two variants. The first variant performs unsupervised learning and can be used for data visualization, clustering, and vector quantization. The main advantage over existing approaches (e.g., the Kohonen feature map) is the ability of the model to automatically find a suitable network…
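A heavily simplified sketch of the growth step such a model might use is shown below: a new unit is inserted between the unit with the largest accumulated error and its most distant neighbor. The plain adjacency dictionary and the error-redistribution factor are assumptions for illustration, not the full cell-structure topology of the paper.

```python
import numpy as np

def insert_unit(units, error, neighbors):
    q = int(np.argmax(error))                       # highest-error unit
    f = max(neighbors[q],
            key=lambda n: np.linalg.norm(units[q] - units[n]))
    new = (units[q] + units[f]) / 2.0               # split the longest edge
    units = np.vstack([units, new])
    r = len(units) - 1
    neighbors[q].discard(f); neighbors[f].discard(q)
    neighbors[r] = {q, f}
    neighbors[q].add(r); neighbors[f].add(r)
    error = np.append(error, (error[q] + error[f]) / 2.0)
    error[[q, f]] /= 2.0                            # redistribute error
    return units, error, neighbors

units = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
error = np.array([5.0, 1.0, 2.0])
neighbors = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
units, error, neighbors = insert_unit(units, error, neighbors)
print(units)
```

Because insertion is driven by accumulated error, the network size is determined by the data rather than fixed in advance.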
We present a novel self-organizing network which is generated by a growth process. The application range of the model is the same as for Kohonen’s feature map: generation of topology-preserving and dimensionality-reducing mappings, e.g., for the purpose of data visualization. The network structure is a rectangular grid which, however, increases its size…
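A minimal sketch of how such a grid can grow while staying rectangular follows: an entire interpolated column is inserted between two existing columns. Where to insert (chosen from accumulated error statistics in the model itself) is hard-coded here as an assumption.

```python
import numpy as np

grid = np.random.default_rng(1).random((3, 4, 2))   # 3x4 grid of 2-D weights

def insert_column(grid, c):
    """Insert an interpolated column between columns c and c+1."""
    new_col = (grid[:, c] + grid[:, c + 1]) / 2.0
    return np.insert(grid, c + 1, new_col, axis=1)

grid = insert_column(grid, 1)       # assumed insertion position
print(grid.shape)                   # (3, 5, 2): the grid grew but stays rectangular
```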
This report describes several algorithms from the literature, all related to competitive learning. A uniform terminology is used for all methods. Moreover, identical examples are provided to allow a qualitative comparison of the methods. The on-line version of this document contains hyperlinks to Java implementations of several of the…
We present a new algorithm for the construction of radial basis function (RBF) networks. The method uses accumulated error information to determine where to insert new units. The diameter of the localized units is chosen based on the mutual distances of the units. To have the distance information always available, it is kept up-to-date by a Hebbian learning…
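The width rule described here can be sketched directly: each unit's width is derived from the mean distance to its topological neighbors. The explicit edge set below stands in for the Hebbian-maintained topology and is an assumption for illustration.

```python
import numpy as np

centers = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
edges = {0: [1, 2], 1: [0], 2: [0]}     # assumed Hebbian-learned topology

def widths(centers, edges):
    """Width of each unit = mean distance to its topological neighbors."""
    return np.array([
        np.mean([np.linalg.norm(centers[i] - centers[j]) for j in edges[i]])
        for i in range(len(centers))
    ])

def rbf_activations(x, centers, sigma):
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / sigma ** 2)     # Gaussian units with derived widths

sigma = widths(centers, edges)
print(rbf_activations(np.array([0.5, 0.5]), centers, sigma))
```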
The reasons to use growing self-organizing networks are investigated. First, an overview of several models of this kind is given, and they are related to other approaches. Then two examples are presented to illustrate the specific properties and advantages of incremental networks. In each case a non-incremental model is used for comparison purposes. The first…
A new incremental network model for supervised learning is proposed. The model builds up a structure of units each of which has an associated local linear mapping (LLM). Error information obtained during training is used to determine where to insert new units whose LLMs are interpolated from their neighbors. Simulation results for several classification…
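A minimal sketch of what an LLM unit computes at prediction time is given below: the winning unit supplies an output offset plus a linear correction around its reference vector. The dimensions and random parameters are illustrative assumptions; in the paper these are trained, and new units' mappings are interpolated from their neighbors.

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.random((5, 3))        # reference vectors of 5 units in R^3
B = rng.random((5, 2))        # per-unit output offsets in R^2
A = rng.random((5, 2, 3))     # per-unit linear mappings R^3 -> R^2

def predict(x):
    c = int(np.argmin(np.linalg.norm(W - x, axis=1)))   # winning unit
    return B[c] + A[c] @ (x - W[c])                     # local linear map

print(predict(rng.random(3)))
```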
A new vector quantization method (LBG-U) closely related to a particular class of neural network models (growing self-organizing networks) is presented. LBG-U consists mainly of repeated runs of the well-known LBG algorithm. Each time LBG converges, however, a novel measure of utility is assigned to each codebook vector. Thereafter, the vector with minimum…
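One plausible reading of the utility step, sketched below under stated assumptions: the utility of a codebook vector is the extra distortion its removal would cause (its points fall back to their second-nearest vector), and the minimum-utility vector is moved next to the maximum-distortion vector before LBG is rerun. The small jitter constant is an assumption.

```python
import numpy as np

def utility_step(data, codebook, rng):
    d = np.linalg.norm(data[:, None] - codebook[None], axis=2) ** 2
    order = np.argsort(d, axis=1)
    first, second = order[:, 0], order[:, 1]
    n, idx = len(codebook), np.arange(len(data))
    error = np.bincount(first, weights=d[idx, first], minlength=n)
    gain = d[idx, second] - d[idx, first]       # cost of losing the nearest vector
    utility = np.bincount(first, weights=gain, minlength=n)
    loser, winner = int(np.argmin(utility)), int(np.argmax(error))
    # relocate the least useful vector near the highest-distortion one
    codebook[loser] = codebook[winner] + 1e-3 * rng.standard_normal(codebook.shape[1])
    return codebook                              # then rerun LBG from here

rng = np.random.default_rng(3)
data = rng.random((200, 2))
codebook = utility_step(data, rng.random((8, 2)), rng)
```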
We present a new incremental radial basis function network suitable for classification and regression problems. Center positions are continuously updated through soft competitive learning. The width of the radial basis functions is derived from the distance to topological neighbors. During training, the observed error is accumulated locally and used to…
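The soft competitive center update mentioned here can be sketched in the rank-based style of the neural-gas family: every center moves toward the input with a step that decays with its distance rank. The step size EPS and decay range LAM are assumed hyperparameters, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
centers = rng.random((10, 2))
EPS, LAM = 0.05, 2.0                      # assumed step size / decay range

def soft_update(x):
    d = np.linalg.norm(centers - x, axis=1)
    rank = np.argsort(np.argsort(d))      # 0 for the winner, 1 for runner-up, ...
    h = np.exp(-rank / LAM)               # soft neighborhood factor
    centers[:] = centers + EPS * h[:, None] * (x - centers)

for _ in range(500):
    soft_update(rng.random(2))
```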