# Storkey Learning Rules for Hopfield Networks

@article{Hu2013StorkeyLR,
  title   = {Storkey Learning Rules for Hopfield Networks},
  author  = {Xiao Hu},
  journal = {viXra},
  year    = {2013}
}

We summarise the Storkey Learning Rules for the Hopfield Model and evaluate their performance relative to other learning rules. Hopfield models are normally used for auto-association, and the Storkey learning rules strike a good balance between locality of learning and capacity. In this paper we outline different learning rules and summarise capacity results. Hopfield networks are related to Boltzmann Machines: they are equivalent to fully visible Boltzmann Machines in the zero-temperature limit…
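The Storkey rule adds local-field correction terms to the plain Hebbian update, which is what buys the extra capacity while keeping learning local. As a hedged sketch in pure Python (the function names and the synchronous-recall helper are our own illustration, not code from the paper):

```python
def storkey_train(patterns, n):
    """Build a Hopfield weight matrix with the Storkey (1997) learning rule.

    patterns: list of +/-1 pattern lists, each of length n.
    Patterns are presented one at a time; each presentation adds a
    Hebbian term minus two local-field correction terms.
    """
    W = [[0.0] * n for _ in range(n)]
    for xi in patterns:
        # Local field h[i][j] = sum over k != i, j of W[i][k] * xi[k]
        h = [[sum(W[i][k] * xi[k] for k in range(n) if k != i and k != j)
              for j in range(n)] for i in range(n)]
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += (xi[i] * xi[j]
                                - xi[i] * h[j][i]
                                - h[i][j] * xi[j]) / n
    return W


def recall(W, state, steps=10):
    """Synchronous recall: repeatedly apply sign(W s); a fixed point
    reproduces itself, so extra iterations are harmless."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [1 if sum(W[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s
```

For the first stored pattern the correction terms vanish (the local fields are zero), so the update reduces to the Hebb rule; the corrections only act once earlier patterns are already embedded in the weights.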


#### 2 Citations

Leveraging Different Learning Rules in Hopfield Nets for Multiclass Classification

- 2018

Retaining customers and securing the loyalty of existing customers is an important aspect of today’s business industry. In this paper the behaviour of different machine learning rules on…

Geometric Regularized Hopfield Neural Network for Medical Image Enhancement

- Computer Science, Medicine
- Int. J. Biomed. Imaging
- 2021

This paper addresses the convergence problem of the Hopfield neural network using two approaches: by sequencing the activation of a continuous modified HNN based on the geometric correlation of features within various image hyperplanes via pixel gradient vectors, and by regulating geometric pixel gradient vectors.

#### References

Showing 1–10 of 22 references.

High capacity associative memories and connection constraints

- Computer Science
- Connect. Sci.
- 2004

An experimental investigation into how high capacity associative neural networks perform when the connection weights are not free to take any value is reported, using a symmetry constraint, a sign constraint and a dilution constraint.

Palimpsest memories: a new high-capacity forgetful learning rule for Hopfield networks

- Computer Science
- 1998

It is shown that the algorithm acts as an iterated function sequence on the space of matrices, and this is used to illustrate the performance of the learning rule.

High Capacity Recurrent Associative Memories

- Computer Science
- Neurocomputing
- 2004

Various algorithms for constructing weight matrices for Hopfield-type associative memories are reviewed, including ones with much higher capacity than the basic model, and their ability to correct corrupted versions of the training patterns is investigated.

Increasing the Capacity of a Hopfield Network without Sacrificing Functionality

- Computer Science
- ICANN
- 1997

Hopfield networks are commonly trained by one of two algorithms. The simplest of these is the Hebb rule, which has a low absolute capacity of n/(2 ln n), where n is the total number of neurons. This…
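The n/(2 ln n) absolute-capacity figure quoted above grows much more slowly than n itself, which is easy to check numerically (this small script is our own illustration, not from the cited paper):

```python
import math

def hebb_absolute_capacity(n):
    """Absolute storage capacity of the Hebb rule: n / (2 ln n)."""
    return n / (2 * math.log(n))

for n in (100, 1000, 10000):
    print(n, round(hebb_absolute_capacity(n)))
# prints:
# 100 11
# 1000 72
# 10000 543
```

So a 1000-neuron Hebb-trained network reliably stores only about 72 patterns, which is the shortfall that higher-capacity rules such as Storkey's aim to reduce.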

On the Capacity of Hopfield Neural Networks as EDAs for Solving Combinatorial Optimisation Problems

- Computer Science
- IJCCI
- 2012

This paper shows how a Hopfield network can model a number of point attractors based on non-optimal samples from an objective function, and the resulting network is shown to be able to model and generate a number of locally optimal solutions up to a certain capacity.

High capacity, small world associative memory models

- Computer Science
- Connect. Sci.
- 2006

This work investigates sparse networks of threshold units, trained with the perceptron learning rule, and shows that in highly dilute networks small world architectures will produce efficiently wired associative memories, which still exhibit good pattern completion abilities.

Temporal Hidden Hopfield Models

- Mathematics
- 2002

Many popular probabilistic models for temporal sequences assume simple hidden dynamics or low dimensionality of discrete variables. For higher dimensional discrete hidden variables, recourse is often…

The basins of attraction of a new Hopfield learning rule

- Mathematics, Computer Science
- Neural Networks
- 1999

The nature of the basins of attraction of a Hopfield network is as important as the capacity. Here a new learning rule is re-introduced. This learning rule has a higher capacity than that of the Hebb…

Towards Cortex Isomorphic Attractor Neural Networks

- Computer Science
- 2004

A generic neural network model of the mammalian cortex with a BCPNN (Bayesian Confidence Propagation Neural Network), together with a thorough review of attractor neural networks and their properties, is provided.

Feedback associative memory based on a new hybrid model of generalized regression and self-feedback neural networks

- Computer Science, Medicine
- Neural Networks
- 2010

The proposed hybrid model has no spurious attractors and can store both binary and real-valued patterns without any preprocessing; its performance is better than that of recurrent associative memories and competitive with other classes of networks.