A Learning Algorithm for Boltzmann Machines

@article{Ackley1985ALA,
  title={A Learning Algorithm for Boltzmann Machines},
  author={David H. Ackley and Geoffrey E. Hinton and Terrence J. Sejnowski},
  journal={Cognitive Science},
  year={1985},
  volume={9},
  pages={147--169}
}

Boltzmann Machines

A Boltzmann machine is a network of symmetrically connected, neuron-like units that make stochastic decisions about whether to be on or off; its learning algorithm lets the network discover interesting features that represent complex regularities in the training data.
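
As a concrete illustration of both ingredients, here is a minimal sketch of the stochastic unit update and the contrastive weight update, assuming a symmetric weight matrix with zero diagonal; the network size, temperature, and learning rate are illustrative choices, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def gibbs_sweep(s, W, b, T=1.0):
        """Visit units in random order; turn each on with probability
        sigmoid(energy gap / T) -- the stochastic on/off decision."""
        for i in rng.permutation(len(s)):
            gap = W[i] @ s + b[i]          # energy gap for setting s_i = 1
            s[i] = float(rng.random() < 1.0 / (1.0 + np.exp(-gap / T)))
        return s

    def weight_update(W, pij_clamped, pij_free, lr=0.05):
        """Learning rule: move w_ij by the difference between co-activation
        statistics measured with the data clamped and running free."""
        return W + lr * (pij_clamped - pij_free)

    # Symmetric weights with zero diagonal, as the architecture requires:
    n = 8
    W = rng.normal(0, 0.1, (n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
    s = gibbs_sweep(rng.integers(0, 2, n).astype(float), W, np.zeros(n))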

Connectionist Architectures for Artificial Intelligence

The authors concentrate here on connectionism's potential as a practical technology for building intelligent systems, and on some of the unsolved problems facing this approach.

A dedicated massively parallel architecture for the Boltzmann machine

Stochastic arrays and learning networks

The thesis concludes that all the networks described may be generalised to simple variations of one standard probabilistic element utilising stochastic coding, whose properties resemble those of biological neurons.

Stochastic Learning Networks and their Electronic Implementation

Presents a family of learning algorithms that operate on a recurrent, symmetrically connected, neuromorphic network which, like the Boltzmann machine, settles in the presence of noise, together with a version of the supervised learning algorithm for a network with analog activation functions.

Efficient learning in Boltzmann Machines using linear response theory

A new approximate learning algorithm for Boltzmann Machines is presented, which is based on mean field theory and the linear response theorem, and shows good performance for networks up to 100 neurons.
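
A rough sketch of that scheme, assuming ±1 units and an arbitrary damping and iteration budget: iterate the mean-field equations to a fixed point, then obtain correlations from the linear response theorem instead of by sampling.

    import numpy as np

    def mean_field_stats(W, theta, n_iter=500, damping=0.5):
        """Solve m = tanh(W m + theta) by damped fixed-point iteration,
        then estimate connected correlations via linear response:
        C = A^{-1} with A_ij = delta_ij / (1 - m_i^2) - w_ij."""
        m = np.zeros(len(theta))
        for _ in range(n_iter):
            m = (1 - damping) * m + damping * np.tanh(W @ m + theta)
        A = np.diag(1.0 / (1.0 - m ** 2 + 1e-12)) - W
        return m, np.linalg.inv(A)   # C approximates <s_i s_j> - m_i m_j

Feeding these deterministic statistics into the contrastive weight update replaces the slow Monte Carlo averaging in both phases of Boltzmann learning.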

Cortical connections and parallel processing: Structure and function

  • D. Ballard
  • Biology, Computer Science
    Behavioral and Brain Sciences
  • 1986
The hypothesis that an important part of the cortex can be modeled as a connectionist computer that is especially suited for parallel problem solving is explored.

Implementing Boltzmann Machines

Dictionary Learning by Dynamical Neural Networks

It is shown that by combining ideas of top-down feedback and contrastive learning, a dynamical network for solving the l1-minimizing dictionary learning problem can be constructed, and the true gradients for learning are provably computable by individual neurons.
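
The inference half of that problem has a well-known dynamical-network treatment; the sketch below uses a locally-competitive-style dynamics for the ℓ1 sparse-coding subproblem, a standard construction rather than this paper's exact one, with the dictionary D, step size, and threshold chosen purely for illustration.

    import numpy as np

    def sparse_code(D, y, lam=0.1, dt=0.1, n_steps=500):
        """Neural dynamics whose fixed points solve
        min_a 0.5*||y - D a||^2 + lam*||a||_1  (D: unit-norm columns).
        Each unit integrates feedforward drive minus lateral inhibition."""
        G = D.T @ D - np.eye(D.shape[1])     # lateral inhibition, no self-term
        b = D.T @ y                          # feedforward drive
        u = np.zeros(D.shape[1])             # membrane potentials
        soft = lambda v: np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
        for _ in range(n_steps):
            u += dt * (b - u - G @ soft(u))  # leaky integration
        return soft(u)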
...

References

Massively Parallel Architectures for AI: NETL, Thistle, and Boltzmann Machines

This paper attempts to isolate a number of basic computational tasks that an intelligent system must perform, describes several families of massively parallel computing architectures, and introduces a new architecture, called the Boltzmann machine, whose abilities appear to include a number of tasks that are inefficient or impossible on the other architectures.

Neural networks and physical systems with emergent collective computational abilities.

  • J. Hopfield
  • Computer Science
    Proceedings of the National Academy of Sciences of the United States of America
  • 1982
A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
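
The content-addressable behaviour is easy to reproduce in a few lines: store ±1 patterns with a Hebbian outer-product rule and let asynchronous threshold updates complete a corrupted probe. The sizes and sweep counts below are arbitrary illustrative choices.

    import numpy as np

    def store(patterns):
        """Hebbian outer-product weights for +/-1 patterns, zero self-connections."""
        W = patterns.T @ patterns / patterns.shape[1]
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, probe, n_sweeps=5, seed=0):
        """Asynchronous threshold updates descend the energy to a stored attractor."""
        s, rng = probe.copy(), np.random.default_rng(seed)
        for _ in range(n_sweeps):
            for i in rng.permutation(len(s)):
                s[i] = 1.0 if W[i] @ s >= 0 else -1.0
        return s

    # Recover a full memory from a partially scrambled probe:
    rng = np.random.default_rng(1)
    pattern = rng.choice([-1.0, 1.0], size=(1, 64))
    W = store(pattern)
    probe = pattern[0].copy()
    probe[:20] = rng.choice([-1.0, 1.0], size=20)
    print(np.mean(recall(W, probe) == pattern[0]))   # typically 1.0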

Schema Selection and Stochastic Inference in Modular Environments

The concept of computational temperature is introduced and the system appears to display a dramatic tendency to interpret input, even if the evidence for any particular interpretation is very weak.

Connectionist Models and Their Properties

Much of the progress in the fields constituting cognitive science has been based upon the use of explicit information processing models, almost exclusively patterned after conventional serial computers.

Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images

  • S. Geman, D. Geman
  • Physics
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 1984
The analogy between images and statistical mechanics systems is made, and the analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, creating a highly parallel "relaxation" algorithm for MAP estimation.
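
The flavour of that relaxation can be captured by a tiny annealed Gibbs sampler for a binary image with an Ising smoothness prior and Gaussian noise; the coupling, noise level, and cooling schedule here are illustrative stand-ins for the paper's more general Markov random field setup.

    import numpy as np

    def map_restore(noisy, beta=1.0, sigma=0.5, T0=4.0, n_sweeps=25, seed=0):
        """Annealed Gibbs sampling toward the MAP estimate of a +/-1 image
        under an Ising smoothness prior and Gaussian observation noise."""
        rng = np.random.default_rng(seed)
        x = np.where(noisy >= 0, 1.0, -1.0)
        H, W = x.shape
        for sweep in range(n_sweeps):
            T = T0 / np.log(2.0 + sweep)       # slow annealing-style cooling
            for i in range(H):
                for j in range(W):
                    nb = sum(x[a, b]
                             for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                             if 0 <= a < H and 0 <= b < W)
                    gap = 2.0 * (beta * nb + noisy[i, j] / sigma ** 2)  # log-odds for +1
                    x[i, j] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-gap / T)) else -1.0
        return x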

Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms

The background, basic sources of data, concepts, and methodology to be employed in the study of perceptrons are reviewed, and some of the notation to be used in later sections is presented.

Intellectual issues in the history of artificial intelligence

This paper sketches the history of artificial intelligence in terms of intellectual issues: the usually dichotomous oppositions that disciplines seem to generate for themselves.

Applications of the Monte Carlo Method in Statistical Physics

Contents include a simple introduction to Monte Carlo simulation and some specialized topics, recent developments in the simulation of classical fluids, and Monte Carlo studies of critical and multicritical phenomena.

Frank Rosenblatt: Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms

Frank Rosenblatt’s intention with his book, according to his own introduction, is not just to describe a machine, the perceptron, but rather to put forward a theory. He formulates a series of

Information Theory and Statistics
