Corpus ID: 14133721

BINARY FACTORIZATION IN HOPFIELD-LIKE NEURAL NETWORKS: SINGLE-STEP APPROXIMATION AND COMPUTER SIMULATIONS

@inproceedings{Frolov2004BINARYFI,
  title={BINARY FACTORIZATION IN HOPFIELD-LIKE NEURAL NETWORKS: SINGLE-STEP APPROXIMATION AND COMPUTER SIMULATIONS},
  author={Alexander A. Frolov and Miroslav Sirota and P. I. Muraviev and Pavel Polyakov},
  year={2004}
}
The unsupervised learning of feature extraction in high-dimensional patterns is a central problem for the neural network approach. Feature extraction is a procedure which maps original patterns into a feature (or factor) space of reduced dimension. In this paper we demonstrate that Hebbian learning in a Hopfield-like neural network is a natural procedure for the unsupervised learning of feature extraction. Due to this learning, factors become the attractors of the network dynamics, hence they can be…
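The abstract's central claim, that Hebbian learning turns factors into attractors of the network dynamics, can be illustrated with a small simulation. The sketch below is an illustration only, not the paper's algorithm: the covariance-style Hebbian rule, the winners-take-all update, and all sizes and parameters (N, L, p, the number of flipped bits) are assumptions chosen for the demo.

```python
# Minimal sketch: binary factors stored by Hebbian learning behave as
# attractors of a Hopfield-like network. All parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 200   # neurons (assumed)
L = 10    # number of stored binary factors (assumed)
p = 0.1   # sparseness: expected fraction of active neurons (assumed)

# Sparse binary factors, one per row, entries in {0, 1}.
X = (rng.random((L, N)) < p).astype(float)

# Covariance-style Hebbian rule for sparsely encoded patterns:
# correlate activities around the mean activity p; no self-connections.
W = (X - p).T @ (X - p)
np.fill_diagonal(W, 0.0)

def recall(x, k=int(p * N), steps=50):
    """Synchronous recall: keep the k most excited neurons active
    (a winners-take-all threshold, one common choice)."""
    for _ in range(steps):
        h = W @ x                      # postsynaptic potentials
        new = np.zeros(N)
        new[np.argsort(h)[-k:]] = 1.0  # activate the k largest
        if np.array_equal(new, x):     # fixed point: an attractor
            break
        x = new
    return x

# Corrupt factor 0 by flipping 10 random bits, then let the dynamics run.
x0 = X[0].copy()
flip = rng.choice(N, size=10, replace=False)
x0[flip] = 1.0 - x0[flip]

recovered = recall(x0)
overlap = recovered @ X[0] / X[0].sum()
print(f"overlap with stored factor after recall: {overlap:.2f}")
```

With settings like these the dynamics typically return to the stored factor within a few steps, which is the attractor property the abstract refers to.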
Boolean Factor Analysis by Attractor Neural Network
This paper describes a neural network implementation of the Boolean factor analysis method, with Hebbian learning and a Hopfield-like neural network, and shows the efficiency of the method on artificial data containing a known list of factors.
Neural network nonlinear factor analysis of high dimensional binary signals
A new recall procedure for Hopfield-like associative memory which allows searching all attractors corresponding to factors, revealing groups of highly correlated words (factors) that frequently occur jointly in documents and represent the topics of those documents.
New BFA method based on attractor neural network and likelihood maximization
A new approach to Boolean factor analysis is suggested, which is an extension of the previously proposed Boolean factor analysis method: a Hopfield-like attractor neural network with increasing activity…
Boolean Factor Analysis by Attractor Neural Network: Summary of Ph.D. Thesis
Methods for the discovery of hidden structure in high-dimensional binary data are one of the most important challenges facing the machine learning community at present. There are many…
Neural network attempt to nonlinear binary factor analysis of textual data
A possible application of a new procedure suitable for binary factorization of signals of large dimension and complexity is discussed. The new procedure is based on the search for attractors in…
Comparison of two neural networks approaches to Boolean matrix factorization
This paper compares two new neural network methods aimed at solving the problem of optimal Boolean matrix factorization, or Boolean factor analysis, and shows that both methods give good results when the processed data have a simple structure.
Recurrent-Neural-Network-Based Boolean Factor Analysis and Its Application to Word Clustering
The results of Boolean factor analysis and fuzzy clustering are shown to be not contradictory, but complementary, and the method is applied to two types of textual data on neural networks in two different languages.
Bars Problem Solving - New Neural Network Method and Comparison
It is shown that frequently used cluster analysis methods can bring interesting results, at least for a first insight into the data structure in the bars problem, which is widely used as a benchmark for feature extraction tasks.
Modeling human thinking about similarities by neuromatrices in the perspective of fuzzy logic
This work proposes a new method for modeling human reasoning about objects' similarities based on fuzzy logic and set theory principles, and aims at representing the initial linguistic ordinal-scale (LOS) matrix as a max–min product of another LOS matrix and its transpose.
Clustering the Sources of EEG Activity during Motor Imagery by Attractor Neural Network with Increasing Activity (ANNIA)
Cluster analysis cannot substitute for careful data examination and interpretation; however, it is a useful pre-processing step which can clearly aid in revealing data regularities that are impossible to track by sequentially browsing through the data.

References

SHOWING 1-10 OF 18 REFERENCES
Informational Capacity and Recall Quality in Sparsely Encoded Hopfield-like Neural Network: Analytical Approaches and Computer Simulation
A sparsely encoded Hopfield-like attractor neural network is investigated analytically and by computer simulation, and it is shown that informational capacity increases monotonically as sparseness increases, while recall quality changes non-monotonically: initially it decreases and then increases.
Statistical mechanics of neural networks near saturation
The Hopfield model of a neural network is studied near its saturation, i.e., when the number p of stored patterns increases with the size of the network N as p = αN. The mean-field theory for this…
On setting unit thresholds in an incompletely connected associative net (Network: Computation in Neural Systems)
This work describes five different strategies for setting the thresholds of units in partially connected nets and shows the superiority of a mechanism of the winners-take-all type in a typical case.
Learning and pattern recognition in spin glass models
Spin glass models have a complex phase space which may be used to store information. By an asynchronous relaxational dynamics, noisy patterns are recognized very fast. In particular, the Hopfield model…
Statistical neurodynamics of associative memory
A new statistical neurodynamical method is proposed for analyzing the non-equilibrium dynamical behavior of an autocorrelation associative memory model, and it explains the strange behaviors caused by the strange shapes of the basins of attraction.
Neural networks and physical systems with emergent collective computational abilities.
  • J. Hopfield
  • Computer Science, Medicine
  • Proceedings of the National Academy of Sciences of the United States of America
  • 1982
A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
A theory for cerebral neocortex
  • D. Marr
  • Psychology, Medicine
  • Proceedings of the Royal Society of London. Series B. Biological Sciences
  • 1970
It is shown how a climbing fibre input to the correct cell can cause that cell to perform a mountain-climbing operation in an underlying probability space, that will lead it to respond to a class of events for which it is appropriate to code.
On setting unit thresholds in an incompletely connected associative net
Parameter sensitivity analysis of this mechanism shows that performance improves as connectivity becomes more complete, and the mechanism is shown to be equivalent to minimization of output error but without requiring the numerical solution of a set of equations, which would be biologically implausible.
Simple memory: a theory for archicortex.
  • D. Marr
  • Computer Science, Medicine
  • Philosophical transactions of the Royal Society of London. Series B, Biological sciences
  • 1971
It is shown that rather general numerical constraints roughly determine the dimensions of memorizing models for the mammalian brain, and from these is derived a general model for archicortex.
A high-precision study of the Hopfield model in the phase of broken replica symmetry
Using a multi-spin coding algorithm, the Hopfield model is studied for network sizes up to N = 33,968. Thermodynamically stable states are found in a region where the replica-symmetric solution predicts…