## 3,582 Citations

### Clustered Boltzmann Machines: Massively Parallel Architectures for Constrained Optimization Problems

- Computer Science, Parallel Comput.
- 1993

### Boltzmann Machines

- Computer Science, Encyclopedia of Machine Learning and Data Mining
- 2010

A Boltzmann machine is a network of symmetrically connected, neuron-like units that make stochastic decisions about whether to be on or off; its learning algorithm allows the network to discover interesting features that represent complex regularities in the training data.
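The stochastic on/off decision described above can be sketched as follows. This is an illustrative toy, not code from the cited entry; the `unit_update` helper, the energy-gap formulation, and the logistic acceptance rule are the standard textbook form:

```python
import math
import random

def unit_update(state, weights, i, temperature=1.0, rng=random):
    """Stochastically decide whether unit i is on (1) or off (0).

    The energy gap is how much the global energy would drop if unit i
    turned on; the unit switches on with logistic probability of that
    gap, scaled by the temperature.
    """
    gap = sum(weights[i][j] * state[j] for j in range(len(state)) if j != i)
    p_on = 1.0 / (1.0 + math.exp(-gap / temperature))
    state[i] = 1 if rng.random() < p_on else 0
    return state

# A strongly excitatory connection makes the unit turn on almost surely
# at low temperature.
state = unit_update([1, 0], [[0.0, 5.0], [5.0, 0.0]], i=1, temperature=0.1)
```

At high temperature the decision approaches a fair coin flip, which is what lets the network escape poor local minima during annealing.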

### Connectionist Architectures for Artificial Intelligence

- Computer Science, Computer
- 1987

The authors concentrate here on connectionism's potential as a practical technology for building intelligent systems, and also some of the unsolved problems facing this approach.

### A dedicated massively parallel architecture for the Boltzmann machine

- Computer Science, Parallel Comput.
- 1992

### Stochastic arrays and learning networks.

- Computer Science
- 1988

The thesis concludes that all the networks described may potentially be generalised to simple variations of one standard probabilistic element utilising stochastic coding, whose properties resemble those of biological neurons.

### Stochastic Learning Networks and their Electronic Implementation

- Computer Science, NIPS
- 1987

A family of learning algorithms is presented that operates on a recurrent, symmetrically connected, neuromorphic network that, like the Boltzmann machine, settles in the presence of noise, along with a version of the supervised learning algorithm for a network with analog activation functions.

### Efficient learning in Boltzmann Machines using linear response theory

- Computer Science
- 2017

A new approximate learning algorithm for Boltzmann Machines is presented, which is based on mean field theory and the linear response theorem, and shows good performance for networks up to 100 neurons.
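As a rough sketch (not the cited algorithm itself), the naive mean-field equations that such methods build on can be solved by fixed-point iteration; the function name and the undamped iteration scheme below are illustrative choices:

```python
import math

def mean_field_magnetizations(weights, biases, n_iter=200):
    """Iterate the naive mean-field equations

        m_i = tanh(sum_j w_ij * m_j + b_i)

    to a fixed point. Linear-response methods then correct the
    correlations estimated from these magnetizations."""
    n = len(biases)
    m = [math.tanh(b) for b in biases]  # decoupled-network starting point
    for _ in range(n_iter):
        m = [math.tanh(sum(weights[i][j] * m[j] for j in range(n)) + biases[i])
             for i in range(n)]
    return m
```

With all couplings set to zero the fixed point reduces to m_i = tanh(b_i), which makes for a quick sanity check.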

### Cortical connections and parallel processing: Structure and function

- Biology, Computer Science, Behavioral and Brain Sciences
- 1986

The hypothesis that an important part of the cortex can be modeled as a connectionist computer that is especially suited for parallel problem solving is explored.

### Dictionary Learning by Dynamical Neural Networks

- Computer Science, ArXiv
- 2018

It is shown that by combining ideas of top-down feedback and contrastive learning, a dynamical network for solving the l1-minimizing dictionary learning problem can be constructed, and the true gradients for learning are provably computable by individual neurons.

## References

SHOWING 1-10 OF 32 REFERENCES

### Massively Parallel Architectures for AI: NETL, Thistle, and Boltzmann Machines

- Computer Science, AAAI
- 1983

This paper attempts to isolate a number of basic computational tasks that an intelligent system must perform, describes several families of massively parallel computing architectures, and introduces a new architecture, called the Boltzmann machine, whose abilities appear to include a number of tasks that are inefficient or impossible on the other architectures.

### Neural networks and physical systems with emergent collective computational abilities.

- Computer Science, Proceedings of the National Academy of Sciences of the United States of America
- 1982

A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
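The content-addressable recall described here can be sketched with the standard Hebbian outer-product rule and asynchronous threshold updates. This is a generic Hopfield-style sketch under those assumptions, not code from the paper:

```python
def hopfield_recall(patterns, probe, n_sweeps=5):
    """Store +/-1 patterns with the Hebbian outer-product rule, then
    recover a complete memory from a corrupted probe by repeated
    asynchronous sign updates."""
    n = len(probe)
    # Hebbian weight matrix with zero self-connections.
    w = [[0.0 if i == j else sum(p[i] * p[j] for p in patterns) / n
          for j in range(n)] for i in range(n)]
    s = list(probe)
    for _ in range(n_sweeps):
        for i in range(n):
            h = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

# A probe with one corrupted bit settles back to the stored memory.
memory = [1, 1, 1, 1, -1, -1, -1, -1]
restored = hopfield_recall([memory], [-1, 1, 1, 1, -1, -1, -1, -1])
```

The "subpart of sufficient size" condition in the abstract corresponds here to the probe overlapping the stored pattern strongly enough that each local field points toward the memory.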

### Schema Selection and Stochastic Inference in Modular Environments

- Computer Science, AAAI
- 1983

The concept of computational temperature is introduced and the system appears to display a dramatic tendency to interpret input, even if the evidence for any particular interpretation is very weak.

### Connectionist Models and Their Properties

- Computer Science, Cogn. Sci.
- 1982

Much of the progress in the fields constituting cognitive science has been based upon the use of explicit information processing models, almost exclusively patterned after conventional serial…

### Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images

- Physics, IEEE Transactions on Pattern Analysis and Machine Intelligence
- 1984

The analogy between images and statistical mechanics systems is made; the analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, creating a highly parallel "relaxation" algorithm for MAP estimation.
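A toy one-dimensional version of such stochastic relaxation can be sketched as below. The Ising-style smoothness prior, the `coupling` and `beta` parameters, and the fixed low temperature (a full annealing schedule is omitted for brevity) are all illustrative assumptions, not the paper's two-dimensional imaging model:

```python
import math
import random

def relax_denoise(noisy, coupling=1.5, beta=1.0, temperature=0.05,
                  n_sweeps=20, rng=None):
    """Gibbs-style stochastic relaxation for a binary (+/-1) signal:
    each site is resampled from its conditional distribution under an
    Ising smoothness prior plus a data-attachment term."""
    rng = rng or random.Random(0)
    x = list(noisy)
    n = len(x)
    for _ in range(n_sweeps):
        for i in range(n):
            neighbors = (x[i - 1] if i > 0 else 0) + (x[i + 1] if i < n - 1 else 0)
            h = coupling * neighbors + beta * noisy[i]  # local field
            p_plus = 1.0 / (1.0 + math.exp(-2.0 * h / temperature))
            x[i] = 1 if rng.random() < p_plus else -1
    return x

# A single flipped bit in an otherwise-smooth signal is restored.
restored = relax_denoise([1, 1, 1, -1, 1, 1, 1, 1])
```

At this low temperature the sampler behaves almost deterministically; raising `temperature` recovers the exploratory behavior that annealing relies on.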

### PRINCIPLES OF NEURODYNAMICS. PERCEPTRONS AND THE THEORY OF BRAIN MECHANISMS

- Biology
- 1963

The background, basic sources of data, concepts, and methodology to be employed in the study of perceptrons are reviewed, and some of the notation to be used in later sections is presented.

### Intellectual issues in the history of artificial intelligence

- Art
- 1983

This paper sketches the history of artificial intelligence in terms of intellectual issues. These are the usually dichotomous oppositions that disciplines seem to generate for themselves in…

### Applications of the Monte Carlo Method in Statistical Physics

- Physics
- 1984

Contents include: 1. A Simple Introduction to Monte Carlo Simulation and Some Specialized Topics; 2. Recent Developments in the Simulation of Classical Fluids; 3. Monte Carlo Studies of Critical and Multicritical…

### Frank Rosenblatt: Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms

- Art
- 1986

Frank Rosenblatt’s intention with his book, according to his own introduction, is not just to describe a machine, the perceptron, but rather to put forward a theory. He formulates a series of…

### Information Theory and Statistics

- Psychology
- 1959
