Learning Thermodynamics with Boltzmann Machines

by G. Torlai and R. Melko
A Boltzmann machine is a stochastic neural network that has been extensively used in the layers of deep architectures for modern machine learning applications. In this paper, we develop a Boltzmann machine that is capable of modeling thermodynamic observables for physical systems in thermal equilibrium. Through unsupervised learning, we train the Boltzmann machine on data sets constructed with spin configurations importance sampled from the partition function of an Ising Hamiltonian at…
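As an illustration of the setup described in this abstract, here is a minimal sketch of training a binary restricted Boltzmann machine with one step of contrastive divergence (CD-1). This is a generic hypothetical implementation, not the authors' code; the lattice size, hidden-layer width, learning rate, and random stand-in data are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

n_vis, n_hid = 16, 8            # e.g. a 4x4 Ising lattice flattened; hidden width is a free choice
W = 0.01 * rng.standard_normal((n_vis, n_hid))
a = np.zeros(n_vis)             # visible biases
b = np.zeros(n_hid)             # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, lr=0.05):
    """One contrastive-divergence (CD-1) step on a batch of {0,1} spin configurations."""
    global W, a, b
    ph0 = sigmoid(v0 @ W + b)                   # p(h=1 | data)
    h0 = (rng.random(ph0.shape) < ph0) * 1.0    # sample hidden units
    pv1 = sigmoid(h0 @ W.T + a)                 # one-step reconstruction of the visibles
    v1 = (rng.random(pv1.shape) < pv1) * 1.0
    ph1 = sigmoid(v1 @ W + b)
    # gradient estimate: positive phase minus negative phase, batch-averaged
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)

# stand-in for importance-sampled Ising configurations (spins -1/+1 mapped to 0/1);
# in the paper's setting these would come from Monte Carlo sampling of the Hamiltonian
batch = (rng.random((32, n_vis)) < 0.5) * 1.0
for _ in range(100):
    cd1_update(batch)
```

After training, thermodynamic observables would be estimated from configurations sampled by alternating Gibbs updates between the visible and hidden layers.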
Super-resolving the Ising model with convolutional neural networks
Machine learning is becoming widely used in condensed matter physics. Inspired by the concept of image super-resolution, we propose a method to increase the size of lattice spin configurations using…
Generative training of quantum Boltzmann machines with hidden units
This article provides a method for fully quantum generative training of quantum Boltzmann machines with both visible and hidden units while using quantum relative entropy as an objective, and presents two novel methods for solving this problem.
Machine Learning Phases of Strongly Correlated Fermions
Machine learning offers an unprecedented perspective for the problem of classifying phases in condensed matter physics. We employ neural-network machine learning techniques to distinguish…
The Accuracy of Restricted Boltzmann Machine Models of Ising Systems
This article examines in detail the influence of hyperparameters such as the learning rate, the number of hidden nodes and the form of the threshold function on Ising spin system calculations.
A cautionary tale for machine learning generated configurations in presence of a conserved quantity
This work investigates the performance of machine learning algorithms trained exclusively on configurations obtained from importance-sampling Monte Carlo simulations of the two-dimensional Ising model with conserved magnetization, and finds that restricted Boltzmann machines generate configurations with magnetizations and energies forbidden in the original physical system.
Deep Learning the Ising Model Near Criticality
After training the generative networks, it is observed that the accuracy essentially depends only on the number of neurons in the first hidden layer of the network, and not on other model details such as network depth or model type, providing evidence that shallow networks are more efficient than deep networks at representing physical probability distributions associated with Ising systems near criticality.
Generating the conformational properties of a polymer by the restricted Boltzmann machine.
It is shown that with adequate training data and network size, this method can capture the underlying polymer physics simply from learning the statistics in the training data, without explicit information on the physical model itself.
Identifying Product Order with Restricted Boltzmann Machines
Unsupervised machine learning via a restricted Boltzmann machine is a useful tool in distinguishing an ordered phase from a disordered phase. Here we study its application on the two-dimensional…
Machine learning determination of dynamical parameters: The Ising model case
A closed form expression for extracting the values of couplings, for every $n$-point interaction between the visible nodes of an RBM, in a binary system such as the Ising model is presented.
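For context, the origin of such $n$-point couplings can be seen from the standard identity for the marginal of a binary RBM over its hidden units (the notation here is generic and not necessarily that paper's):

```latex
p(\mathbf v) \;\propto\; e^{\sum_i a_i v_i} \prod_j \Bigl(1 + e^{\,b_j + \sum_i W_{ij} v_i}\Bigr),
\qquad
H_{\mathrm{eff}}(\mathbf v) \;=\; -\sum_i a_i v_i \;-\; \sum_j \log\!\Bigl(1 + e^{\,b_j + \sum_i W_{ij} v_i}\Bigr).
```

Expanding each $\log(1+e^{x})$ term in the visible units generates effective 2-point, 3-point, and higher interactions among the visible spins, which is what a closed-form coupling extraction must invert.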
Learnability scaling of quantum states: Restricted Boltzmann machines
This work empirically studies the scaling of restricted Boltzmann machines (RBMs) applied to reconstruct ground-state wavefunctions of the one-dimensional transverse-field Ising model from projective measurement data, and finds that the number of weights can be significantly reduced while still retaining an accurate reconstruction.


Neural Computation
Lecture Notes for the MSc/DTC module. The brain is a complex computing machine which has evolved to give the fittest output to a given input. Neural computation has as its goal to describe the function of…
A Simple Weight Decay Can Improve Generalization
It is proven that a weight decay has two effects in a linear network, and it is shown how to extend these results to networks with hidden layers and non-linear units.
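The shrinkage effect of weight decay can be sketched in a few lines. This toy least-squares problem, the penalty strength `lam`, and the training schedule are all illustrative choices, not that paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 10))
y = X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(50)

def fit(lam, lr=0.01, steps=2000):
    """Gradient descent on mean squared error plus an L2 weight-decay penalty lam*||w||^2."""
    w = np.zeros(10)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y) + 2 * lam * w  # the decay term shrinks w every step
        w -= lr * grad
    return w

w_plain = fit(lam=0.0)
w_decay = fit(lam=0.1)
# the ridge-style penalty pulls the learned weight vector toward zero,
# so ||w_decay|| ends up smaller than ||w_plain||
```

Constraining the weight norm this way is one simple mechanism by which decay can improve generalization on noisy data.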
Generative versus discriminative training of RBMs for classification of fMRI images
It is shown that much better discrimination can be achieved by fitting a generative model to each separate condition and then seeing which model is most likely to have generated the data.
In Advances in Neural Information Processing Systems
IEEE Signal Processing Magazine Vol. 17
Nature Communications
  • 2010
Peer review, at its best, should aim to provide authors and editors with rigorous and constructive feedback resulting in an improved study; while there is clearly room for improvement, the current system is not broken.
Nature Physics
  • 2016
Journal of Machine Learning Research
  • 2014