Bidirectional associative memories

@article{Kosko1988BidirectionalAM,
  title={Bidirectional associative memories},
  author={Bart Kosko},
  journal={IEEE Trans. Syst. Man Cybern.},
  year={1988},
  volume={18},
  pages={49-60}
}
  • B. Kosko
  • Published 3 January 1988
  • Computer Science
  • IEEE Trans. Syst. Man Cybern.
Stability and encoding properties of two-layer nonlinear feedback neural networks are examined. Bidirectionality is introduced in neural nets to produce two-way associative search for stored associations. The bidirectional associative memory (BAM) is the minimal two-layer nonlinear feedback network. The author proves that every n-by-p matrix M is a bidirectionally stable heteroassociative content-addressable memory for both binary/bipolar and continuous neurons. When the BAM neurons are… 
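
The construction the abstract describes can be sketched in a few lines of NumPy (a minimal illustration of outer-product encoding and thresholded bidirectional recall for bipolar patterns; the variable names are mine, and ties at zero are resolved to +1 rather than holding the previous state, which simplifies the usual BAM update convention):

```python
import numpy as np

def bam_encode(X, Y):
    """Correlation (outer-product) encoding: M = sum_k x_k^T y_k.
    X: (num_pairs, n) bipolar (+1/-1) patterns for the first layer.
    Y: (num_pairs, p) bipolar (+1/-1) patterns for the second layer."""
    return X.T @ Y  # the n-by-p connection matrix M

def bam_recall(M, x, max_iters=100):
    """Bidirectional recall: threshold back and forth through M and M^T
    until the (x, y) pair stops changing, i.e. the pair has stabilized."""
    sign = lambda v: np.where(v >= 0, 1, -1)  # ties broken to +1 (simplification)
    y = sign(x @ M)
    for _ in range(max_iters):
        x_new = sign(y @ M.T)
        y_new = sign(x_new @ M)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y

# Store two pattern pairs, then recall the first pair from a corrupted key.
X = np.array([[1, -1, 1, -1, 1, -1],
              [1, 1, 1, -1, -1, -1]])
Y = np.array([[1, 1, -1, -1],
              [1, -1, 1, -1]])
M = bam_encode(X, Y)
noisy = np.array([1, -1, 1, -1, 1, 1])   # first pattern with one flipped bit
x_rec, y_rec = bam_recall(M, noisy)      # recovers X[0] and its associate Y[0]
```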

Citations

Adaptive bidirectional associative memories.
  • B. Kosko
  • Computer Science
    Applied optics
  • 1987
TLDR
The BAM correlation encoding scheme is extended to a general Hebbian learning law and every BAM adaptively resonates in the sense that all nodes and edges quickly equilibrate in a system energy local minimum.
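As a rough illustration of the "general Hebbian learning law" mentioned in this summary, the sketch below performs a discrete-time signal-Hebbian update (the original adaptive BAM is formulated in continuous time; the signal function, step size, and names here are illustrative assumptions, not the paper's exact model):

```python
import numpy as np

def hebbian_step(M, x, y, signal=np.tanh, lr=0.05):
    """One Euler step of a signal-Hebbian law of the form
    dM/dt = -M + S(x) S(y)^T, with S a bounded signal function."""
    sx, sy = signal(x), signal(y)
    return M + lr * (-M + np.outer(sx, sy))

# Let the connection matrix drift toward the Hebbian correlation of one pair.
n, p = 6, 4
M = np.zeros((n, p))
x = np.random.choice([-1.0, 1.0], size=n)
y = np.random.choice([-1.0, 1.0], size=p)
for _ in range(200):
    M = hebbian_step(M, x, y)
# M now approximates outer(S(x), S(y)); with tanh and +/-1 inputs this is
# roughly tanh(1)**2 times the outer product of x and y.
```
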
A bidirectional heteroassociative memory for binary and grey-level patterns
TLDR
This paper introduces a new bidirectional heteroassociative memory model that uses a simple self-convergent iterative learning rule and a new nonlinear output function that can learn online without being subject to overlearning.
A feedforward bidirectional associative memory
TLDR
It is shown that the Hamming attractive radius of each prototype reaches the maximum possible value and the overall network design procedure is fully scalable in the sense that any number p ≤ 2^min{m,n} of bidirectional associations can be implemented.
A Bidirectional Hetero-Associative Memory for True-Color Patterns
TLDR
A new bidirectional hetero-associative memory model for true-color patterns is presented; it uses the associative model with dynamical synapses recently introduced by Vazquez and Sossa to guarantee perfect and robust recall of the fundamental set of associations.
A Bidirectional Associative Memory Based on Optimal Linear Associative Memory
TLDR
The introduction of a nonlinear characteristic considerably enhances the BAM's ability to suppress noise in the output pattern and largely reduces spurious memories, thereby greatly improving the recall performance of the BAM.
Multi-layer associative neural networks (MANN): storage capacity vs. noise-free recall
  • Hoon Kang
  • Computer Science
    IEEE International Conference on Neural Networks
  • 1993
TLDR
The author addresses important issues in artificial neural nets, namely exact recall and storage capacity in multilayer associative memories, and completely relaxes any code-dependent conditions on the learning pairs.
Encoding Static and Temporal Patterns with a Bidirectional Heteroassociative Memory
TLDR
It will be shown that the Bidirectional Associative Memory can be generalized to multiple associative memories, and that it can be used to store associations from multiple sources as well.
Designs and devices for optical bidirectional associative memories.
TLDR
The bidirectional associative memory (BAM) is a powerful neural network paradigm that is well suited to optical implementation, and variations on the BAM indicate some of the interesting directions in which this simple structure may evolve, leading in a natural progression toward the power of a model such as the Carpenter-Grossberg ART.
Learning Associative Memories by Error Backpropagation
TLDR
It is shown that the constructed networks' robustness to acceptable noise in the input is enhanced as the memory dimension increases and weakened as the number of stored patterns grows.
A neural network based multi-associative memory model
  • A. A. Bhatti
  • Computer Science
    1990 IJCNN International Joint Conference on Neural Networks
  • 1990
TLDR
An improved two-layer neural network model is presented for a multiassociative content-addressable memory that yields improved error-correction and storage capabilities and a faster convergence rate, avoids the storage of complementary and false memories, and possesses analogies to biological neural networks.

References

SHOWING 1-10 OF 28 REFERENCES
Adaptive bidirectional associative memories.
  • B. Kosko
  • Computer Science
    Applied optics
  • 1987
TLDR
The BAM correlation encoding scheme is extended to a general Hebbian learning law and every BAM adaptively resonates in the sense that all nodes and edges quickly equilibrate in a system energy local minimum.
The capacity of the Hopfield associative memory
TLDR
Techniques from coding theory are applied to rigorously study the capacity of the Hopfield associative memory, in particular the capacity under quantization of the outer-product connection matrix.
A model of human associative processor (HASP)
  • Y. Hirai
  • Computer Science
    IEEE Transactions on Systems, Man, and Cybernetics
  • 1983
TLDR
A model of an associative processor named HASP is proposed and described to represent associative brain functions in terms of neural network structure; it supports various operations for retrieving information from a memory organized in complicated data structures.
Correlation Matrix Memories
  • T. Kohonen
  • Computer Science
    IEEE Transactions on Computers
  • 1972
TLDR
A new model for associative memory, based on a correlation matrix, is suggested, in which any part of the memorized information can be used as a key and the memories are selective with respect to accumulated data.
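The mechanism summarized here, storing associations as a sum of outer products and reading them back with a single linear pass using any part of the memorized information as the key, can be sketched as follows (a minimal sketch assuming bipolar keys and data; the scaling and function names are mine, not Kohonen's notation):

```python
import numpy as np

def cmm_encode(keys, data):
    """Correlation matrix memory: M = sum_k y_k x_k^T (a one-shot linear associator)."""
    return data.T @ keys / keys.shape[1]   # scale so recalled components stay O(1)

def cmm_recall(M, key):
    """Linear recall: no feedback, no thresholding, just one matrix-vector product."""
    return M @ key

keys = np.random.choice([-1.0, 1.0], size=(3, 32))   # three keys of length 32
data = np.random.choice([-1.0, 1.0], size=(3, 8))    # their associated data vectors
M = cmm_encode(keys, data)

partial = keys[0].copy()
partial[16:] = 0.0                         # probe with only half of the first key
print(np.sign(cmm_recall(M, partial)))     # close to data[0] when keys are near-orthogonal
```
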
Optical information processing based on an associative-memory model of neural nets with thresholding and feedback.
TLDR
The remarkable collective computational properties of the Hopfield model for neural networks are reviewed, including recognition from partial input, robustness, and error-correction capability.
Associatron-A Model of Associative Memory
  • K. Nakano
  • Computer Science
    IEEE Trans. Syst. Man Cybern.
  • 1972
TLDR
The Associatron is considered to be a simplified model of the neural network and can be constructed as a cellular structure, where each cell is connected to only its neighbor cells and all cells run in parallel.
Contour Enhancement, Short Term Memory, and Constancies in Reverberating Neural Networks
TLDR
It is suggested that competition solves a sensitivity problem that confronts all cellular systems: the noise-saturation dilemma.
Distinctive features, categorical perception, and probability learning: some applications of a neural model
TLDR
The model predicts overshooting, recency data, and probabilities in systems with more than two events with reasonably good accuracy, and is applied to "categorical perception" and probability learning.
Neural networks and physical systems with emergent collective computational abilities.
  • J. Hopfield
  • Computer Science
    Proceedings of the National Academy of Sciences of the United States of America
  • 1982
TLDR
A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
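For contrast with the two-layer bidirectional case, the single-layer autoassociative model this reference describes can be sketched roughly as follows (a minimal sketch assuming bipolar states, outer-product weights with zeroed diagonal, and asynchronous sign updates; names and sizes are illustrative):

```python
import numpy as np

def hopfield_weights(patterns):
    """Outer-product (Hebbian) weights with self-connections removed."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, probe, sweeps=10):
    """Asynchronous updates: each neuron takes the sign of its local field,
    so the state settles into a stored (or spurious) energy minimum."""
    x = probe.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(x)):
            h = W[i] @ x
            if h != 0:
                x[i] = 1.0 if h > 0 else -1.0
    return x

patterns = np.random.choice([-1.0, 1.0], size=(2, 64))  # two stored memories
W = hopfield_weights(patterns)
probe = patterns[0].copy()
probe[:16] *= -1                          # corrupt a quarter of the bits
recovered = hopfield_recall(W, probe)
print(np.array_equal(recovered, patterns[0]))  # usually True for a lightly loaded net
```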