Comparing neural networks: a benchmark on growing neural gas, growing cell structures, and fuzzy ARTMAP

@article{Heinke1998ComparingNN,
  title={Comparing neural networks: a benchmark on growing neural gas, growing cell structures, and fuzzy ARTMAP},
  author={Dietmar Heinke and Fred Henrik Hamker},
  journal={IEEE Transactions on Neural Networks},
  year={1998},
  volume={9},
  number={6},
  pages={1279--1291}
}
  • D. Heinke, F. Hamker
  • Published 1 November 1998
  • Computer Science
  • IEEE transactions on neural networks
This article compares the performance of some recently developed incremental neural networks with the well-known multilayer perceptron (MLP) on real-world data. The incremental networks are fuzzy ARTMAP (FAM), growing neural gas (GNG), and growing cell structures (GCS). The real-world data consist of four different datasets posing different challenges to the networks in terms of complexity of decision boundaries, overlap between classes, and size of the datasets. The performance of the…
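A minimal sketch of the kind of benchmark loop the abstract describes: several classifiers trained and scored on several datasets. The incremental networks studied in the paper (FAM, GNG, GCS) are not available in scikit-learn, so a nearest-neighbour classifier stands in for them here, and the dataset names are illustrative placeholders rather than the four datasets used by Heinke and Hamker.

```python
# Illustrative benchmark loop: train each model on each dataset, report test accuracy.
from sklearn.datasets import load_breast_cancer, load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier

datasets = {"breast_cancer": load_breast_cancer(), "digits": load_digits()}
models = {
    "MLP": lambda: MLPClassifier(hidden_layer_sizes=(20,), max_iter=500),
    "prototype-based (stand-in)": lambda: KNeighborsClassifier(n_neighbors=3),
}

for data_name, data in datasets.items():
    X_tr, X_te, y_tr, y_te = train_test_split(
        data.data, data.target, test_size=0.25, random_state=0)
    for model_name, make_model in models.items():
        clf = make_model().fit(X_tr, y_tr)
        print(f"{data_name:15s} {model_name:28s} accuracy = {clf.score(X_te, y_te):.3f}")
```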
Advanced Developments and Applications of the Fuzzy ARTMAP Neural Network in Pattern Classification
TLDR
When compared to other state-of-the-art machine learning classifiers, FAM and its variants showed superior speed and ease of training, and in most cases they delivered comparable classification accuracy.
Cluster Analysis using Growing Neural Gas and Graph Partitioning
TLDR
This paper describes a simple algorithm for partitioning the GNG topology graph, generating connected components that represent different data clusters, and automatically finding the number of classes and the associated neurons.
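A minimal sketch of the clustering principle summarized above, under assumed data structures: `units` is a NumPy array of GNG prototype vectors and `edges` the set of topology edges produced by competitive Hebbian learning. Connected components of that graph become clusters, and each sample is assigned the cluster of its nearest unit. This only illustrates the idea, not the partitioning algorithm proposed in the cited paper.

```python
import numpy as np
import networkx as nx

def cluster_with_gng_graph(X, units, edges):
    # build the topology graph: nodes are unit indices, edges are (i, j) pairs
    g = nx.Graph()
    g.add_nodes_from(range(len(units)))
    g.add_edges_from(edges)
    components = list(nx.connected_components(g))   # each component = one cluster
    unit_to_cluster = {u: c for c, comp in enumerate(components) for u in comp}
    # label each sample by the cluster of its nearest prototype vector
    nearest = np.argmin(np.linalg.norm(X[:, None, :] - units[None, :, :], axis=2), axis=1)
    return np.array([unit_to_cluster[u] for u in nearest]), len(components)
```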
On Operating Strategies of the Fuzzy Artmap Neural Network: A Comparative Study
In this paper, the effectiveness of three different operating strategies applied to the Fuzzy ARTMAP (FAM) neural network in pattern classification tasks is analyzed and compared. Three types of FAM,
Externally Growing Cell Structures for Data Evaluation of Chemical Gas Sensors
TLDR
A new classification and regression method, externally growing cell structures (EGCS), is introduced for the data evaluation of chemical gas sensors; simulation results indicate that EGCS performs better than the original GCS in terms of classification rate and the required number of epochs.
A Fast Simplified Fuzzy ARTMAP Network
TLDR
An algorithmic variant of the simplified fuzzy ARTMAP (SFAM) network, whose structure resembles that of feed-forward networks, is presented; it is shown to be much faster than Kasuba's algorithm, and the speed advantage grows substantially as the number of training samples increases.
Multilayer Batch Learning Growing Neural Gas for Learning Multiscale Topologies
TLDR
This paper proposes multilayer BL-GNG, which is a parameter-less unsupervised learning algorithm based on hierarchical topological structure learning that can automatically determine the number of nodes and layers according to the data distribution.
A hybrid self-organizing Neural Gas based network
  • James T. Graham, J. Starzyk
  • Computer Science
    2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • 2008
TLDR
This paper examines the neural gas networks proposed by Martinetz and Schulten and by Fritzke in an effort to create a hybrid version that is more biologically plausible and somewhat more flexible, owing to its hybrid nature and lack of reliance on adjustment parameters.
A consensus-based semi-supervised growing neural gas
TLDR
A new semi-supervised growing neural gas model, named Consensus-Based Semi-Supervised GNG (CSSGNG), is introduced, in which both labeled and unlabeled data are used to train the network; on average it delivers better classification results than the SSGNG and OSSGNG models.

References

SHOWING 1-10 OF 31 REFERENCES
PROBEN 1 - a set of benchmarks and benchmarking rules for neural network training algorithms
TLDR
The purpose of the problem and rule collection is to give researchers easy access to data for the evaluation of their algorithms and networks and to make direct comparison of the published results feasible.
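A tiny sketch of the kind of fixed, reproducible partitioning that benchmarking rules such as PROBEN1 prescribe, so that every algorithm is trained and tested on identical splits. The 50/25/25 training/validation/test proportions and the function name below are assumptions for illustration, not the exact PROBEN1 file format.

```python
def fixed_partition(X, y, train=0.5, val=0.25):
    # no shuffling: the point of a fixed benchmark split is that it is reproducible
    n = len(X)
    n_tr, n_val = int(n * train), int(n * val)
    return ((X[:n_tr], y[:n_tr]),                               # training set
            (X[n_tr:n_tr + n_val], y[n_tr:n_tr + n_val]),       # validation set
            (X[n_tr + n_val:], y[n_tr + n_val:]))               # test set
```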
Fuzzy ARTMAP: A neural network architecture for incremental supervised learning of analog multidimensional maps
TLDR
The fuzzy ARTMAP system is compared with Salzberg's NGE system and with Simpson's FMMC system, and its performance is evaluated in relation to benchmark backpropagation and genetic algorithm systems.
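A compact sketch of the core fuzzy ART operations underlying fuzzy ARTMAP: complement coding, the choice function, the vigilance (match) test, and fast learning of category prototypes. This is a simplified single-label classifier for illustration (match tracking is omitted), not the full architecture of the cited paper; inputs are assumed to lie in [0, 1].

```python
import numpy as np

def complement_code(a):
    return np.concatenate([a, 1.0 - a])            # I = (a, 1 - a)

class SimpleFuzzyARTMAP:
    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w, self.labels = [], []               # category prototypes and their labels

    def train_one(self, a, label):
        I = complement_code(a)
        # rank categories by the choice function T_j = |I ^ w_j| / (alpha + |w_j|)
        order = sorted(range(len(self.w)),
                       key=lambda j: -np.minimum(I, self.w[j]).sum()
                                     / (self.alpha + self.w[j].sum()))
        for j in order:
            match = np.minimum(I, self.w[j]).sum() / I.sum()
            if match >= self.rho and self.labels[j] == label:
                # fast learning: w_j <- beta * (I ^ w_j) + (1 - beta) * w_j
                self.w[j] = (self.beta * np.minimum(I, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return
        self.w.append(I.copy())                    # no resonant category: create a new one
        self.labels.append(label)

    def predict_one(self, a):
        I = complement_code(a)
        j = max(range(len(self.w)),
                key=lambda j: np.minimum(I, self.w[j]).sum()
                              / (self.alpha + self.w[j].sum()))
        return self.labels[j]
```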
From Statistics to Neural Networks: Theory and Pattern Recognition Applications
This volume provides a unified approach to the study of predictive learning, i.e., generalization from examples. It contains an up-to-date review and in-depth treatment of major issues and methods
A Growing Neural Gas Network Learns Topologies
An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule. In contrast to
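A minimal sketch of one GNG adaptation step as described above, assuming a dict-based representation: `units` maps a unit id to its reference vector (a NumPy array), `edges` maps frozenset pairs of ids to an edge age, and `error` accumulates the local squared error later used for node insertion (the periodic node-insertion step is omitted here for brevity). Parameter names follow Fritzke's paper; their values are illustrative.

```python
import numpy as np

def gng_step(x, units, edges, error, eps_b=0.05, eps_n=0.006, max_age=50):
    # 1. find the nearest unit s1 and second-nearest unit s2 for input x
    s1, s2 = sorted(units, key=lambda i: np.linalg.norm(x - units[i]))[:2]
    error[s1] += np.linalg.norm(x - units[s1]) ** 2
    # 2. move the winner and its topological neighbours towards the input
    units[s1] = units[s1] + eps_b * (x - units[s1])
    for e in list(edges):
        if s1 in e:
            (n,) = e - {s1}
            units[n] = units[n] + eps_n * (x - units[n])
            edges[e] += 1                          # age the edges emanating from s1
    # 3. Hebb-like rule: connect the two winners with a fresh (age 0) edge
    edges[frozenset((s1, s2))] = 0
    # 4. drop edges that grew too old, then any unit left without edges
    for e in [e for e, age in edges.items() if age > max_age]:
        del edges[e]
    for i in [i for i in units if not any(i in e for e in edges)]:
        del units[i]
        error.pop(i, None)
```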
Dynamic Cell Structure Learns Perfectly Topology Preserving Map
TLDR
Simulations on a selection of CMU-Benchmarks indicate that the DCS idea applied to the growing cell structure algorithm leads to an efficient and elegant algorithm that can beat conventional models on similar tasks.
A direct adaptive method for faster backpropagation learning: the RPROP algorithm
TLDR
A learning algorithm for multilayer feedforward networks, RPROP (resilient propagation), is proposed that performs a local adaptation of the weight-updates according to the behavior of the error function to overcome the inherent disadvantages of pure gradient-descent.
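A short sketch of the per-weight update rule behind RPROP, in its simpler "RPROP minus" variant without weight backtracking: each weight keeps its own step size, which grows while the gradient keeps its sign and shrinks when the sign flips. The parameter values below are the commonly used defaults and are assumptions here.

```python
import numpy as np

def rprop_update(w, grad, prev_grad, step,
                 eta_plus=1.2, eta_minus=0.5, step_min=1e-6, step_max=50.0):
    sign_change = grad * prev_grad
    # grow the step where the gradient sign is unchanged, shrink where it flipped
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(sign_change < 0, 0.0, grad)    # skip the update after a sign flip
    w = w - np.sign(grad) * step
    return w, grad, step                           # `grad` becomes the next prev_grad
```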
Backpropagation: past and future
  • P. Werbos
  • Mathematics
    IEEE 1988 International Conference on Neural Networks
  • 1988
TLDR
The author proposes development of a general theory of intelligence in which backpropagation and comparisons to the brain play a central role, and points to a series of intermediate steps and applications leading up to the construction of such generalized systems.
Statistical evaluation of neural networks experiments: Minimum requirements and current practice
TLDR
Minimum requirements for the statistical evaluation of neural network experiments are developed, and the appropriate statistical techniques are introduced.
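A small sketch of the kind of evaluation such guidelines call for: repeat each experiment over several independent runs, report means with confidence intervals, and test whether an observed accuracy difference between two networks is significant. The paired t-test below is one common choice, used for illustration rather than as the cited paper's specific recommendation.

```python
import numpy as np
from scipy import stats

def compare_networks(acc_a, acc_b, confidence=0.95):
    # acc_a, acc_b: per-run test accuracies of two networks over matched runs
    acc_a, acc_b = np.asarray(acc_a), np.asarray(acc_b)
    for name, acc in (("A", acc_a), ("B", acc_b)):
        ci = stats.t.interval(confidence, len(acc) - 1,
                              loc=acc.mean(), scale=stats.sem(acc))
        print(f"network {name}: mean accuracy {acc.mean():.3f}, {confidence:.0%} CI {ci}")
    t, p = stats.ttest_rel(acc_a, acc_b)           # paired test over matched runs
    print(f"paired t-test: t = {t:.2f}, p = {p:.4f}")
```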