The ART of adaptive pattern recognition by a self-organizing neural network

@article{Carpenter1988TheAO,
  title={The ART of adaptive pattern recognition by a self-organizing neural network},
  author={Gail A. Carpenter and Stephen Grossberg},
  journal={Computer},
  year={1988},
  volume={21},
  pages={77--88}
}
The adaptive resonance theory (ART) suggests a solution to the stability-plasticity dilemma facing designers of learning systems: how to design a learning system that remains plastic, or adaptive, in response to significant events, yet remains stable in response to irrelevant events. The ART architectures discussed here are neural networks that self-organize stable recognition codes in real time in response to arbitrary sequences of input patterns. Within such an ART architecture…
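As a concrete illustration of the resonance-and-reset behaviour the abstract alludes to, the sketch below runs a minimal ART 1-style clustering loop for binary inputs: a vigilance test either lets an existing category learn (plasticity where the match is good) or forces the commitment of a new category (stability of what was already learned). The function name, parameter defaults, and the reduction to a single weight vector per category are illustrative assumptions, not the paper's full dynamical equations.

```python
import numpy as np

def art1_cluster(inputs, rho=0.7, beta=1e-6):
    """Minimal ART 1-style fast-learning clustering of binary vectors.

    rho  : vigilance; higher values demand a closer match before an existing
           category may learn, otherwise a new category is committed.
    beta : small choice parameter breaking ties toward specific categories.
    Assumes each input has at least one active bit.
    """
    categories = []          # one binary weight vector per learned category
    assignments = []
    for I in inputs:
        I = np.asarray(I, dtype=bool)
        # Rank candidate categories by the ART 1 choice function |I ∧ w| / (beta + |w|).
        scores = [np.sum(I & w) / (beta + np.sum(w)) for w in categories]
        chosen = None
        for j in np.argsort(scores)[::-1]:
            w = categories[j]
            match = np.sum(I & w) / np.sum(I)   # fraction of the input matched
            if match >= rho:                    # vigilance (resonance) test
                categories[j] = I & w           # fast learning: intersect weights
                chosen = j
                break                           # resonance: stop searching
        if chosen is None:                      # every category was reset:
            categories.append(I.copy())         # commit a new category
            chosen = len(categories) - 1
        assignments.append(chosen)
    return assignments, categories

# Example: the first two similar patterns share a category; the third gets its own.
X = [[1, 1, 1, 0, 0], [1, 1, 0, 0, 0], [0, 0, 0, 1, 1]]
labels, cats = art1_cluster(X, rho=0.6)   # labels == [0, 0, 1]
```

Raising the vigilance rho toward 1 yields many narrow categories, while lowering it lets existing categories generalize more broadly; that single knob is the stability-plasticity trade-off the abstract describes.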
Citations

ART 3: Self-Organization of Distributed Pattern Recognition Codes in Neural Network Hierarchies
TLDR
This article outlines some properties of three generations of ART networks, ART 1, ART 2, and ART 3, the last of which incorporates a third memory, on an intermediate time scale, whose dynamics may be interpreted as chemical transmitter processes.
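The intermediate-time-scale memory mentioned here is interpreted as a depletable chemical transmitter. The snippet below integrates a generic habituative transmitter gate in that spirit; the particular equation dz/dt = eps*(1 - z) - lam*S(t)*z, its parameters, and the function name are illustrative assumptions rather than the equations of the ART 3 model itself.

```python
import numpy as np

def transmitter_gate(signal, eps=0.01, lam=0.5, dt=0.1, z0=1.0):
    """Euler-integrate a generic habituative transmitter gate (illustrative).

    dz/dt = eps * (1 - z) - lam * S(t) * z

    The gate recovers slowly toward 1 (rate eps) and is depleted by the
    presynaptic signal S(t) (rate lam), so it evolves on a time scale
    intermediate between fast activations and slowly learned weights.
    """
    z = z0
    trace = []
    for S in signal:
        z += dt * (eps * (1.0 - z) - lam * S * z)
        trace.append(z)
    return np.array(trace)

# A sustained input depletes the gate; removing the input lets it recover slowly.
S = np.concatenate([np.ones(50), np.zeros(200)])
z = transmitter_gate(S)
```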
Self-Organizing Cortical Networks for Distributed Hypothesis Testing and Recognition Learning
TLDR
Adaptive Resonance Theory, or ART, is the only computationally realized biological theory that analyses how fast, yet stable, real-time learning of recognition codes can be accomplished in response to an arbitrary stream of input patterns.
ART 2: self-organization of stable category recognition codes for analog input patterns.
TLDR
ART 2, a class of adaptive resonance architectures which rapidly self-organize pattern recognition categories in response to arbitrary sequences of either analog or binary input patterns, is introduced.
ARTMAP: a self-organizing neural network architecture for fast supervised learning and pattern recognition
TLDR
A neural network architecture is described that autonomously learns to classify arbitrarily many, arbitrarily ordered vectors into recognition categories based on predictive success, using an internal controller that conjointly maximizes predictive generalization and minimizes predictive error.
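The "internal controller" referred to above operates by match tracking: when the chosen recognition category leads to a wrong prediction, vigilance is raised just above the current match level so that the category fails the vigilance test and the search moves on to a more specific category (or commits a new one). The helper below sketches only that rule; its name, the small epsilon increment, and the surrounding training loop it would plug into are assumptions for illustration.

```python
def match_tracked_vigilance(input_norm, matched_norm, baseline_rho, eps=1e-4):
    """Return the vigilance to use after a predictive error (match tracking).

    input_norm   : size |I| of the current input.
    matched_norm : size |I ∧ w_J| of its overlap with the chosen category J.
    baseline_rho : resting vigilance the network returns to on the next input.

    On a wrong prediction, vigilance is raised just above the current match
    ratio, so category J now fails the vigilance test and the search continues
    with more specific categories, or commits a new one.
    """
    current_match = matched_norm / input_norm
    return max(baseline_rho, current_match + eps)

# Example: the chosen category matched 7 of 10 input features but predicted the
# wrong label, so vigilance jumps from 0.5 to just above 0.7 for this input.
rho = match_tracked_vigilance(input_norm=10, matched_norm=7, baseline_rho=0.5)
```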
ART 3: Hierarchical search using chemical transmitters in self-organizing pattern recognition architectures
TLDR
A model is introduced that implements parallel search of compressed or distributed pattern recognition codes in a neural network hierarchy; the search is a form of hypothesis testing capable of discovering appropriate representations of a nonstationary input environment.
ARTMAP: Supervised real-time learning and classification of nonstationary data by a self-organizing neural network
TLDR
A new neural network architecture, called ARTMAP, is introduced that autonomously learns to classify arbitrarily many, arbitrarily ordered vectors into recognition categories based on predictive success; it is a type of self-organizing expert system that calibrates the selectivity of its hypotheses based upon predictive success.
A Neural Network Architecture for Fast On-Line Supervised Learning and Pattern Recognition
TLDR
A new neural network architecture called ARTMAP is described that autonomously learns to classify arbitrarily ordered vectors into recognition categories based on predictive success, using an internal controller that conjointly maximizes predictive generalization and minimizes predictive error.
Generalized Net Model of the Cognitive and Neural Algorithm for Adaptive Resonance Theory 1
TLDR
A generalized net (GN) model is introduced that represents the ART1 neural network learning algorithm and explains when an input vector will be clustered or rejected among all the nodes of the network.
Improvement of ART-2 Neural Network's Adaptation System Patterns
TLDR
It is found that amplitude effects and insensitivity to gradually changing data are high during simulations of data classified with the ART-2 neural network, and a new neural network model based on adaptive resonance theory is proposed.
Neural network based competitive learning for control
  • B. Zhang, E. Grant
  • Computer Science
    Proceedings Fourth International Conference on Tools with Artificial Intelligence TAI '92
  • 1992
TLDR
A simulation study is presented of neural-net-based partitioning algorithms for learning control incorporated into the BOXES machine learning control system, and performance comparisons are made with the original BOXES partitioning method.

References

SHOWING 1-10 OF 17 REFERENCES
A massively parallel architecture for a self-organizing neural pattern recognition machine
TLDR
A neural network architecture for the learning of recognition categories is derived which circumvents the noise, saturation, capacity, orthogonality, and linear predictability constraints that limit the codes which can be stably learned by alternative recognition models.
ART 2: self-organization of stable category recognition codes for analog input patterns.
TLDR
ART 2, a class of adaptive resonance architectures which rapidly self-organize pattern recognition categories in response to arbitrary sequences of either analog or binary input patterns, is introduced.
A Learning Algorithm for Boltzmann Machines
TLDR
A general parallel search method is described, based on statistical mechanics, and it is shown how it leads to a general learning rule for modifying the connection strengths so as to incorporate knowledge about a task domain in an efficient way.
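The learning rule summarized here adjusts each connection strength in proportion to the difference between pairwise co-activation statistics gathered with the data clamped and the same statistics gathered while the network runs freely. The snippet below computes that update from two batches of sampled binary states; the batch shapes, learning rate, and variable names are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def boltzmann_weight_update(clamped_states, free_states, lr=0.1):
    """Boltzmann machine learning rule (sketch).

    clamped_states : (N, units) binary states sampled with the data clamped.
    free_states    : (M, units) binary states sampled from the free-running model.

    delta_w[i, j] is proportional to <s_i s_j>_clamped - <s_i s_j>_free, so
    correlations present in the data but absent from the model are strengthened,
    and spurious model correlations are weakened.
    """
    clamped = np.asarray(clamped_states, dtype=float)
    free = np.asarray(free_states, dtype=float)
    corr_clamped = clamped.T @ clamped / len(clamped)
    corr_free = free.T @ free / len(free)
    dW = lr * (corr_clamped - corr_free)
    np.fill_diagonal(dW, 0.0)   # Boltzmann machines have no self-connections.
    return dW

# Example with 3 units: strengthen whatever co-activations the data show more
# often than the model does.
data_samples = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 0]])
model_samples = np.array([[1, 0, 1], [0, 0, 1], [1, 0, 0]])
dW = boltzmann_weight_update(data_samples, model_samples)
```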
Neural Networks and Natural Intelligence
From the Publisher: Stephen Grossberg and his colleagues at Boston University's Center for Adaptive Systems are producing some of the most exciting research in the neural network approach to making…
Theory for the development of neuron selectivity: orientation specificity and binocular interaction in visual cortex
The development of stimulus selectivity in the primary sensory cortex of higher vertebrates is considered in a general mathematical framework. A synaptic evolution scheme of a new kind is proposed in…
Self-Organization and Associative Memory
TLDR
The purpose and nature of biological memory, as well as some of its aspects, are explained.
Computational anatomy and functional architecture of striate cortex: A spatial mapping approach to perceptual coding
  • E. Schwartz
  • Computer Science, Medicine
    Vision Research
  • 1980
TLDR
The approach of the present paper is to suggest that the basic data structure of perceptual coding consists of two-dimensional laminar mapping, and that successive stages of remapping, along with columnar architecture, may provide important computational functions.
Optical and laser remote sensing
  • E. Hinkley
  • Materials Science
    IEEE Journal of Quantum Electronics
  • 1983
… mode locking. Spectral and spatial hole-burning effects are also explained. In the following chapter, a number of practical lasers including solid-state, gas, dye, chemical, and semiconductor…
Learning internal representations by error propagation
This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion
Position, rotation, and scale invariant optical correlation.
TLDR
A new optical transformation is described that combines geometrical coordinate transformations with the conventional optical Fourier transform and is invariant to both scale and rotational changes in the input object or function.
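In digital form, the combination described here corresponds to a log-polar (Fourier-Mellin style) pipeline: a Fourier magnitude discards translation, a log-polar remapping converts rotation and scaling into shifts, and a second Fourier magnitude discards those shifts. The NumPy sketch below is an assumed digital analogue for illustration rather than the paper's optical implementation; the function name and sampling resolution are invented.

```python
import numpy as np

def log_polar_signature(image, n_r=64, n_theta=64):
    """Position-, rotation-, and scale-tolerant signature of a 2-D array."""
    # 1. Fourier magnitude: insensitive to translation of the input.
    mag = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    cy, cx = mag.shape[0] // 2, mag.shape[1] // 2
    # 2. Log-polar resampling (nearest neighbour): rotation and scaling of the
    #    input become circular shifts along the theta and log-r axes.
    radii = np.exp(np.linspace(0.0, np.log(min(cy, cx)), n_r))
    angles = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(radii, angles, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, mag.shape[0] - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, mag.shape[1] - 1)
    log_polar = mag[ys, xs]
    # 3. A second Fourier magnitude discards those shifts, leaving a signature
    #    that tolerates translation, rotation, and scale changes.
    return np.abs(np.fft.fft2(log_polar))

# Example: a pattern and a rotated copy of it yield similar signatures.
img = np.zeros((128, 128))
img[40:80, 50:70] = 1.0
sig = log_polar_signature(img)
```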