Mapping distinct phase transitions to a neural network

Dimitrios Bachtis, Gert Aarts and Biagio Lucini
Physical Review E 102(5-1)
We demonstrate, by means of a convolutional neural network, that the features learned in the two-dimensional Ising model are sufficiently universal to predict the structure of symmetry-breaking phase transitions in the systems considered, irrespective of the universality class, order, and the presence of discrete or continuous degrees of freedom. No prior knowledge about the existence of a phase transition is required in the target system, and its entire parameter space can be scanned with multiple…
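The training data for such a network are spin configurations sampled from the two-dimensional Ising model. A minimal Metropolis sampler can produce labelled configurations above and below the critical point; the sketch below is illustrative only (function name and parameters are not from the paper), assuming NumPy:

```python
import numpy as np

def ising_metropolis(L=16, beta=1.0, sweeps=200, rng=None):
    """Sample a 2D Ising configuration with the Metropolis algorithm.

    L      : linear lattice size (periodic boundaries)
    beta   : inverse temperature (critical point near beta_c ~ 0.4407)
    sweeps : number of full-lattice update sweeps
    """
    rng = np.random.default_rng(rng)
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            # Sum of the four nearest neighbours (periodic boundaries).
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * spins[i, j] * nb  # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] = -spins[i, j]
    return spins
```

Configurations drawn at beta well above beta_c (ordered) and well below it (disordered) then serve as the two training classes for the classifier.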


Adding machine learning within Hamiltonians: Renormalization group transformations, symmetry breaking and restoration

A physical interpretation of machine learning functions is presented, opening up the possibility to control properties of statistical systems via the inclusion of these functions in Hamiltonians, including the predictive function of a neural network as a conjugate variable coupled to an external field within the Hamiltonian of a system.

On the generalizability of artificial neural networks in spin models

By using the task of identifying phase transitions in spin models, this work establishes a systematic generalizability such that simple ANNs trained with the two-dimensional ferromagnetic Ising model can be applied to the ferromagnetic q-state Potts model in different dimensions.

Machine Learning in Nuclear Physics

Advances in machine learning methods provide tools that have broad applicability in scientific research. These techniques are being applied across the diversity of nuclear physics research topics.

Neural-Network Quantum States: A Systematic Review

A systematic review of Neural-Network Quantum States, a powerful variational representation of many-body quantum systems, surveying the recent seminal results of this high-impact interdisciplinary research line.

Quantum field theories, Markov random fields and machine learning

This work discusses how discretized Euclidean field theories, such as the ϕ⁴ lattice field theory on a square lattice, are mathematically equivalent to Markov random fields, a notable class of probabilistic graphical models with applications in a variety of research areas, including machine learning.

Quantitative analysis of phase transitions in two-dimensional XY models using persistent homology

This work develops a new way of computing the persistent homology of lattice spin model configurations and, by considering the fluctuations in the output of logistic regression and k-nearest-neighbours models trained on persistence images, a methodology to extract estimates of the critical temperature and the critical exponent of the correlation length.

Interpreting machine learning functions as physical observables

Gert Aarts, Dimitrios Bachtis and Biagio Lucini, Department of Physics, Swansea University, Swansea SA2 8PP, United Kingdom; European Centre for Theoretical Studies in Nuclear Physics and Related…

Machine learning with quantum field theories

Building on the mathematical equivalence between discretized Euclidean field theories and Markov random fields, this work discusses how machine learning algorithms, including neural networks, can be derived from quantum field theories.

Inverse Renormalization Group in Quantum Field Theory.

We propose inverse renormalization group transformations within the context of quantum field theory that produce the appropriate critical fixed point structure, give rise to inverse flows in…

Parameter diagnostics of phases and phase transition learning by neural networks

This work analyzes neural network-based machine learning schemes for phases and phase transitions in theoretical condensed matter research, focusing on neural networks with a single hidden layer, and demonstrates how the learning-by-confusion scheme can be used, in combination with a simple threshold-value classification method, to diagnose the learning parameters of neural networks.

Learning phase transitions by confusion

This work proposes a neural-network approach to finding phase transitions, based on the performance of a neural network after it is trained with data that are deliberately labelled incorrectly, and paves the way to the development of a generic tool for identifying unexplored phase transitions.

Unsupervised identification of the phase transition on the 2D-Ising model

We investigate deep learning autoencoders for the unsupervised recognition of phase transitions in physical systems formulated on a lattice. We use spin configurations produced for the 2-dimensional…

Regressive and generative neural networks for scalar field theory

This work analyzes a broad range of chemical potentials, finds that the network is robust and able to recognize patterns far away from the point where it was trained, and elaborates on potential uses of such a generative approach for sampling outside the training region.

Extending machine learning classification capabilities with histogram reweighting

The approach treats the output from a convolutional neural network as an observable in a statistical system, enabling its extrapolation over continuous ranges in parameter space, and demonstrates the use of Monte Carlo histogram reweighting using the phase transition in the two-dimensional Ising model.
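The extrapolation in that approach rests on standard single-histogram reweighting. For configurations σᵢ sampled at inverse temperature β, the expectation of an observable O (here, the network output) at a nearby β′ is given by the textbook formula (not quoted in the excerpt above):

\[
\langle O \rangle_{\beta'} =
  \frac{\sum_{i} O(\sigma_i)\, e^{-(\beta'-\beta)\, E(\sigma_i)}}
       {\sum_{i} e^{-(\beta'-\beta)\, E(\sigma_i)}},
\]

where E(σᵢ) is the energy of configuration σᵢ. Treating the network output as O makes its value available over a continuous range of β′ from simulations at a single β.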

Learning phase transitions from dynamics

The use of recurrent neural networks for classifying phases of matter based on the dynamics of experimentally accessible observables is proposed by training recurrent networks on the magnetization traces of two distinct models of one-dimensional disordered and interacting spin chains.

Unveiling phase transitions with machine learning

It is shown how unsupervised learning can detect three phases (ferromagnetic, paramagnetic, and a cluster of the antiphase with the floating phase), as well as two distinct regions within the paramagnetic phase. It is also shown that transfer learning becomes possible: a machine trained only with nearest-neighbour interactions can learn to identify a new type of phase occurring when next-nearest-neighbour interactions are introduced.

On the generalizability of artificial neural networks in spin models

It is suggested that nontrivial information of multiple-state systems can be encoded in a representation of far fewer states, and that the number of ANNs required in the study of spin models can potentially be reduced.

Machine Learning Phases of Strongly Correlated Fermions

This work shows that a three-dimensional convolutional network trained on auxiliary-field configurations produced by quantum Monte Carlo simulations of the Hubbard model can correctly predict the magnetic phase diagram of the model at the average density of one (half filling).

Machine Learning as a universal tool for quantitative investigations of phase transitions