• Corpus ID: 17562591

Storkey Learning Rules for Hopfield Networks

  • Xiao Hu
  • Published 1 September 2013
  • Computer Science
  • viXra
We summarize the Storkey Learning Rules for the Hopfield Model and evaluate their performance relative to other learning rules. Hopfield Models are normally used for auto-association, and Storkey Learning Rules have been found to strike a good balance between local learning and capacity. In this paper we outline different learning rules and summarize capacity results. Hopfield networks are related to Boltzmann Machines: they are the same as fully visible Boltzmann Machines in the zero temperature limit…
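The contrast the abstract draws between Hebbian learning and the Storkey rule can be illustrated with a minimal sketch. This is an illustrative implementation under common conventions (bipolar ±1 patterns, zero self-connections, asynchronous sign updates), not code from the paper; function names and the recall procedure are assumptions.

```python
import numpy as np

def hebb_weights(patterns):
    """Hebbian outer-product rule: purely local, one pass over patterns."""
    n = patterns.shape[1]
    W = sum(np.outer(xi, xi) for xi in patterns) / n
    np.fill_diagonal(W, 0.0)
    return W

def storkey_weights(patterns):
    """Storkey rule: each pattern's update subtracts local-field terms,
    reducing crosstalk between stored patterns while staying local."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for xi in patterns:
        H = W @ xi                          # H_i = sum_k W_ik * xi_k
        # h[i, j] = H_i - W_ij * xi_j  (diagonal of W is kept at zero)
        h = H[:, None] - W * xi[None, :]
        # W_ij += (xi_i xi_j - xi_i h_ji - h_ij xi_j) / n
        W = W + (np.outer(xi, xi) - xi[:, None] * h.T - h * xi[None, :]) / n
        np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, sweeps=10):
    """Asynchronous sign updates; converges to a fixed point for symmetric W."""
    s = state.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s
```

At low load (a few patterns in a few dozen units) both rules recall corrupted patterns; the Storkey rule's advantage shows up as the number of stored patterns approaches the Hebbian capacity limit.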


Leveraging Different Learning Rules in Hopfield Nets for Multiclass Classification

This model enhances an approach to assessing customer loyalty using Hebbian and Storkey learning in a Hopfield Neural Network (HNN), and is tested on a breast cancer dataset.

Geometric Regularized Hopfield Neural Network for Medical Image Enhancement

This paper addresses the convergence problem of the Hopfield neural network using two approaches: sequencing the activation of a continuous modified HNN based on the geometric correlation of features within various image hyperplanes via pixel gradient vectors, and regulating geometric pixel gradient vectors.

High capacity associative memories and connection constraints

An experimental investigation into how high capacity associative neural networks perform when the connection weights are not free to take any value is reported, using a symmetry constraint, a sign constraint and a dilution constraint.

Palimpsest memories: a new high-capacity forgetful learning rule for Hopfield networks

It is shown that the algorithm acts as an iterated function sequence on the space of matrices, and this is used to illustrate the performance of the learning rule.

High Capacity Recurrent Associative Memories

Increasing the Capacity of a Hopfield Network without Sacrificing Functionality

Hopfield networks are commonly trained by one of two algorithms. The simplest of these is the Hebb rule, which has a low absolute capacity of n/(2 ln n), where n is the total number of neurons.
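As a quick sanity check on that n/(2 ln n) figure (the natural logarithm is assumed, as is standard for this result), the absolute capacity stays modest even for large networks:

```python
import math

def hebb_absolute_capacity(n):
    """Absolute capacity of the Hebb rule: n / (2 ln n) patterns
    recalled without error, for a network of n neurons."""
    return n / (2 * math.log(n))

for n in (100, 1000, 10000):
    print(n, round(hebb_absolute_capacity(n), 1))
# 100 neurons -> about 10.9 patterns, 1000 -> 72.4, 10000 -> 542.9
```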

On the Capacity of Hopfield Neural Networks as EDAs for Solving Combinatorial Optimisation Problems

This paper shows how a Hopfield network can model a number of point attractors based on non-optimal samples from an objective function, and the resulting network is shown to be able to model and generate a number of locally optimal solutions up to a certain capacity.

High capacity, small world associative memory models

This work investigates sparse networks of threshold units, trained with the perceptron learning rule, and shows that in highly dilute networks small world architectures will produce efficiently wired associative memories, which still exhibit good pattern completion abilities.
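The perceptron learning rule referred to here can be sketched for the fully connected case (the sparse, small-world wiring of that work is omitted; names and the stopping criterion are illustrative). Each neuron's incoming weights are trained as an independent perceptron until every stored pattern is a fixed point:

```python
import numpy as np

def perceptron_train(patterns, lr=0.1, epochs=100):
    """Train each row of W with the perceptron rule so that every
    stored pattern becomes a fixed point of the network (sketch)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for _ in range(epochs):
        stable = True
        for xi in patterns:
            fields = W @ xi
            wrong = xi * fields <= 0       # neurons with the wrong sign
            if wrong.any():
                stable = False
                # w_i += lr * xi_i * xi for each misaligned neuron i
                W[wrong] += lr * np.outer(xi[wrong], xi)
                np.fill_diagonal(W, 0.0)
        if stable:
            break
    return W
```

Unlike the Hebb and Storkey rules this is iterative rather than one-shot, but it can stabilize far more patterns (up to roughly 2n for random patterns).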

Temporal Hidden Hopfield Models

This work considers a class of models in which the discrete hidden space is defined by parallel dynamics of densely connected high-dimensional stochastic Hopfield networks, and derives mean field methods for learning discrete and continuous temporal sequences.

Towards Cortex Isomorphic Attractor Neural Networks

A generic neural network model of the mammalian cortex with a BCPNN (Bayesian Confidence Propagating Neural Network), together with a thorough review of attractor neural networks and their properties, is provided.

Auto-associative memory based on a new hybrid model of SFNN and GRNN: Performance comparison with NDRAM, ART2 and MLP

The performance of the hybrid model is better than that of a recurrent associative memory and a feed-forward multilayer perceptron, and is comparable with that of hard-competitive models.