
Storkey Learning Rules for Hopfield Networks

@article{Hu2013StorkeyLR,
  title={Storkey Learning Rules for Hopfield Networks},
  author={Xiao Hu},
  journal={viXra},
  year={2013}
}
We summarize the Storkey Learning Rules for the Hopfield Model and evaluate their performance relative to other learning rules. Hopfield Models are normally used for auto-association, and the Storkey Learning Rules have been found to strike a good balance between local learning and capacity. In this paper we outline the different learning rules and summarize the capacity results. Hopfield networks are related to Boltzmann Machines: they are the same as fully visible Boltzmann Machines in the zero-temperature limit.
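The abstract does not reproduce the update equations, so the sketch below is illustrative rather than taken from the paper: it implements the standard Storkey (1997) incremental rule, W_ij ← W_ij + (1/n)(ξ_i ξ_j − ξ_i h_ji − h_ij ξ_j) with local field h_ij = Σ_{k ≠ i, j} W_ik ξ_k, next to the Hebb baseline, assuming bipolar (±1) patterns, a zero weight diagonal and asynchronous recall. The NumPy structure and the function names (storkey_train, hebb_train, recall) are illustrative choices, not the author's code.

```python
import numpy as np


def storkey_train(patterns):
    """Build a Hopfield weight matrix incrementally with the Storkey rule.

    patterns : array of shape (p, n) with entries in {+1, -1}.
    Returns an n x n symmetric weight matrix (diagonal forced to zero).
    """
    _, n = patterns.shape
    W = np.zeros((n, n))
    for xi in patterns:
        # Local fields h_ij = sum_{k != i, j} W_ik * xi_k, stored as matrix H[i, j].
        s = W @ xi
        H = s[:, None] - (np.diag(W) * xi)[:, None] - W * xi[None, :]
        # Storkey increment: Hebbian term minus two local-field corrections.
        W = W + (np.outer(xi, xi) - xi[:, None] * H.T - H * xi[None, :]) / n
        np.fill_diagonal(W, 0.0)  # keep zero self-connections (a modelling choice)
    return W


def hebb_train(patterns):
    """Baseline Hebb rule: W = (1/n) * sum_mu xi_mu xi_mu^T, zero diagonal."""
    _, n = patterns.shape
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W


def recall(W, probe, steps=2000, seed=0):
    """Asynchronous recall: set one randomly chosen unit to the sign of its field."""
    rng = np.random.default_rng(seed)
    state = probe.copy()
    for _ in range(steps):
        i = rng.integers(len(state))
        state[i] = 1 if W[i] @ state >= 0 else -1
    return state


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    stored = rng.choice([-1, 1], size=(5, 100))   # 5 random patterns, 100 units
    W = storkey_train(stored)
    probe = stored[0].copy()
    probe[:10] *= -1                              # corrupt 10 of the 100 bits
    print(np.array_equal(recall(W, probe), stored[0]))
```

Under these assumptions the Storkey rule keeps the local, incremental character of Hebbian learning while its reported absolute capacity grows as roughly n/√(2 ln n), compared with the n/(2 ln n) quoted for the Hebb rule in the references below.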
Citations

Leveraging Different Learning Rules in Hopfield Nets for Multiclass Classification
Retaining customers and gauging the loyalty of existing customers is an important aspect of today's business industry. In this paper, the behavior of different machine learning rules on …
Geometric Regularized Hopfield Neural Network for Medical Image Enhancement
This paper addresses the convergence problem of the Hopfield neural network using two approaches: by sequencing the activation of a continuous modified HNN based on the geometric correlation of features within various image hyperplanes via pixel gradient vectors, and by regulating geometric pixel gradient vectors.

References

Showing 1-10 of 22 references
High capacity associative memories and connection constraints
An experimental investigation into how high capacity associative neural networks perform when the connection weights are not free to take any value is reported, using a symmetry constraint, a sign constraint and a dilution constraint.
Palimpsest memories: a new high-capacity forgetful learning rule for Hopfield networks
It is shown that the algorithm acts as an iterated function sequence on the space of matrices, and this is used to illustrate the performance of the learning rule.
High Capacity Recurrent Associative Memories
Various algorithms for constructing weight matrices for Hopfield-type associative memories are reviewed, including ones with much higher capacity than the basic model, and their ability to correct corrupted versions of the training patterns is investigated.
Increasing the Capacity of a Hopfield Network without Sacrificing Functionality
Hopfield networks are commonly trained by one of two algorithms. The simplest of these is the Hebb rule, which has a low absolute capacity of n/(2 ln n), where n is the total number of neurons.
On the Capacity of Hopfield Neural Networks as EDAs for Solving Combinatorial Optimisation Problems
This paper shows how a Hopfield network can model a number of point attractors based on non-optimal samples from an objective function, and the resulting network is shown to be able to model and generate a number of locally optimal solutions up to a certain capacity.
High capacity, small world associative memory models
This work investigates sparse networks of threshold units, trained with the perceptron learning rule, and shows that in highly dilute networks small world architectures will produce efficiently wired associative memories, which still exhibit good pattern completion abilities.
Temporal Hidden Hopfield Models
Many popular probabilistic models for temporal sequences assume simple hidden dynamics or low dimensionality of discrete variables. For higher-dimensional discrete hidden variables, recourse is often …
The basins of attraction of a new Hopfield learning rule
The nature of the basins of attraction of a Hopfield network is as important as its capacity. Here a new learning rule is re-introduced. This learning rule has a higher capacity than that of the Hebb rule.
Towards Cortex Isomorphic Attractor Neural Networks
A generic neural network model of the mammalian cortex with a BCPNN (Bayesian Confidence Propagating Neural Network) and a thorough review of attractor neural networks and their properties is provided.
Feedback associative memory based on a new hybrid model of generalized regression and self-feedback neural networks
The proposed hybrid model has no spurious attractors, can store both binary and real-valued patterns without any preprocessing, and its performance is better than that of recurrent associative memories and competitive with other classes of networks.