Machine learning for percolation utilizing auxiliary Ising variables.

Junyi Zhang, Boqiang Zhang, Junyi Xu, Wanzhou Zhang, and Youjin Deng, "Machine learning for percolation utilizing auxiliary Ising variables," Physical Review E, vol. 105, issue 2-1.
Machine learning of phase transitions has received intensive research interest in recent years, but its application to percolation remains challenging. We propose an auxiliary Ising mapping method for machine-learning studies of standard percolation, as well as of a variety of statistical-mechanical systems in a correlated-percolation representation. We demonstrate that unsupervised machine learning is able to accurately locate the percolation threshold, independent of the spatial…
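The abstract does not spell out the mapping, but the idea of auxiliary Ising variables for percolation can be illustrated with a toy sketch: label the clusters of a percolation configuration and assign each cluster one independent random ±1 spin (a Fortuin–Kasteleyn-style construction). All function names and the spin-0 convention for empty sites below are illustrative assumptions, not the paper's actual implementation.

```python
import random

def site_percolation(L, p, rng):
    """Occupied-site configuration of an L x L square lattice."""
    return [[rng.random() < p for _ in range(L)] for _ in range(L)]

def find(parent, i):
    """Union-find root lookup with path halving."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def union(parent, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[rb] = ra

def auxiliary_ising(config, rng):
    """Map a percolation configuration to auxiliary Ising variables:
    label clusters with union-find, then give every cluster one
    random spin +/-1; empty sites get spin 0 (assumed convention)."""
    L = len(config)
    parent = list(range(L * L))
    for y in range(L):
        for x in range(L):
            if not config[y][x]:
                continue
            if x + 1 < L and config[y][x + 1]:
                union(parent, y * L + x, y * L + x + 1)
            if y + 1 < L and config[y + 1][x]:
                union(parent, y * L + x, (y + 1) * L + x)
    cluster_spin = {}
    spins = [[0] * L for _ in range(L)]
    for y in range(L):
        for x in range(L):
            if config[y][x]:
                root = find(parent, y * L + x)
                if root not in cluster_spin:
                    cluster_spin[root] = rng.choice([-1, 1])
                spins[y][x] = cluster_spin[root]
    return spins

rng = random.Random(0)
cfg = site_percolation(16, 0.6, rng)
spins = auxiliary_ising(cfg, rng)
```

The resulting spin configurations look like Ising snapshots, which is what allows standard unsupervised tools developed for spin models to be reused on percolation data.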


Machine learning the 2D percolation model
We use deep-learning strategies to study the 2D percolation model on a square lattice. We employ standard image recognition tools with a multi-layered convolutional neural network. We test how well
Neural network topological snake models for locating general phase diagrams
Machine learning for locating phase diagrams has received intensive research interest in recent years. However, its application to automatically locating phase diagrams is limited to single closed


Unsupervised machine learning account of magnetic transitions in the Hubbard model.
We employ several unsupervised machine learning techniques, including autoencoders, random trees embedding, and t-distributed stochastic neighbor embedding (t-SNE), to reduce the dimensionality of
Machine Learning Percolation Model
The findings indicate that the effectiveness of machine learning in applications to phase transitions and critical phenomena still needs to be evaluated.
Machine learning of phase transitions in the percolation and XY models.
It is found that, using just one hidden layer in a fully connected neural network, the percolation transition can be learned, and a data collapse using the average output layer gives a correct estimate of the critical exponent ν.
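The data collapse mentioned here follows the standard finite-size-scaling ansatz O(p, L) = f((p − p_c) L^{1/ν}). A minimal sketch of the rescaling step, using synthetic output curves generated from an assumed scaling function (so the collapse is exact by construction) and the known 2D site-percolation values p_c ≈ 0.5927, ν = 4/3:

```python
import math

P_C, NU = 0.5927, 4.0 / 3.0  # 2D site-percolation threshold and exponent

def avg_output(p, L):
    """Synthetic stand-in for a network's average output layer,
    obeying the scaling form O(p, L) = f((p - p_c) * L**(1/nu))
    with f chosen as a logistic function for illustration."""
    x = (p - P_C) * L ** (1.0 / NU)
    return 1.0 / (1.0 + math.exp(-x))

def collapse(curves):
    """Rescale the control parameter; with the correct p_c and nu,
    every system size falls on the same master curve f."""
    return {L: [((p - P_C) * L ** (1.0 / NU), o) for p, o in pts]
            for L, pts in curves.items()}

sizes = [16, 32, 64]
ps = [0.40 + 0.01 * i for i in range(40)]
curves = {L: [(p, avg_output(p, L)) for p in ps] for L in sizes}
scaled = collapse(curves)
```

Plotting the scaled curves for all L on one set of axes would show them falling on a single master curve; the crossing of the unscaled curves at p = p_c is the usual estimator of the threshold.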
High-precision percolation thresholds and Potts-model critical manifolds from graph polynomials
The critical curves of the q-state Potts model can be determined exactly for regular two-dimensional lattices G that are of the three-terminal type. Jacobsen and Scullard have defined a graph
Berezinskii-Kosterlitz-Thouless-like percolation transitions in the two-dimensional XY model.
Analysis of the correlation function g_p(r), defined as the probability that two sites separated by a distance r belong to the same percolation cluster, yields algebraic decay for K ≥ K_c(J), and the associated critical exponent depends on J and K.
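The pair-connectedness function defined here can be estimated numerically. A sketch for ordinary 2D site percolation (a stand-in for the XY-model clusters discussed in that paper), with BFS cluster labeling and horizontal pairs only; all names and conventions are illustrative:

```python
import random
from collections import deque

def label_clusters(config):
    """BFS labeling of occupied-site clusters on a square lattice;
    empty sites keep label -1."""
    L = len(config)
    label = [[-1] * L for _ in range(L)]
    next_id = 0
    for y in range(L):
        for x in range(L):
            if config[y][x] and label[y][x] == -1:
                q = deque([(y, x)])
                label[y][x] = next_id
                while q:
                    cy, cx = q.popleft()
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < L and 0 <= nx < L
                                and config[ny][nx] and label[ny][nx] == -1):
                            label[ny][nx] = next_id
                            q.append((ny, nx))
                next_id += 1
    return label

def g(label, r):
    """Estimate g_p(r): the probability that two sites a horizontal
    distance r apart belong to the same cluster."""
    L = len(label)
    same = total = 0
    for y in range(L):
        for x in range(L - r):
            total += 1
            if label[y][x] != -1 and label[y][x] == label[y][x + r]:
                same += 1
    return same / total

rng = random.Random(2)
L, p = 32, 0.4  # subcritical, so g_p(r) decays quickly
cfg = [[rng.random() < p for _ in range(L)] for _ in range(L)]
lab = label_clusters(cfg)
```

In the subcritical phase this estimator decays rapidly with r; the algebraic decay described in the snippet appears only at and beyond the transition.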
Learning phase transitions by confusion
This work proposes a neural-network approach to finding phase transitions, based on the performance of a neural network after it is trained with data that are deliberately labelled incorrectly, and paves the way to the development of a generic tool for identifying unexplored phase transitions.
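The confusion scheme can be illustrated without a neural network: generate samples with a toy scalar "order parameter" that switches value at the true transition, relabel them with a trial critical point p*, and record the best achievable classification accuracy. The accuracy traces the characteristic W shape, with the middle peak at the true transition. Everything below (the data model, noise level, threshold classifier) is an illustrative assumption, not the paper's setup:

```python
import random

def make_samples(n, p_t, noise, rng):
    """Toy data: control parameter p and a noisy scalar 'order
    parameter' m that jumps from 0 to 1 at the true transition p_t."""
    data = []
    for _ in range(n):
        p = rng.random()
        m = (1.0 if p > p_t else 0.0) + rng.gauss(0.0, noise)
        data.append((p, m))
    return data

def best_threshold_accuracy(data, p_star):
    """Accuracy of the best classifier 'predict p < p_star iff m < t',
    standing in for a trained network in the confusion scheme."""
    ms = sorted(set(m for _, m in data))
    cuts = [ms[0] - 1.0] + [m + 1e-9 for m in ms]
    best = 0.0
    for t in cuts:
        correct = sum((m < t) == (p < p_star) for p, m in data)
        best = max(best, correct / len(data))
    return best

rng = random.Random(1)
data = make_samples(600, 0.6, 0.02, rng)
accs = {ps: best_threshold_accuracy(data, ps)
        for ps in (0.0, 0.3, 0.6, 0.9, 1.0)}
```

Accuracy is near 1 at the endpoints (all labels identical) and at the true transition p* = 0.6, and dips in between, because no threshold on m can distinguish samples on the same side of the real transition.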
A New Strategy in Applying the Learning Machine to Study Phase Transitions.
In this Letter, we present a new strategy for applying the learning machine to study phase transitions. We train the learning machine with samples obtained only at a non-critical parameter point,
Identifying topological order through unsupervised machine learning
An unsupervised machine learning algorithm that identifies topological order is demonstrated and is shown to be capable of classifying samples of the two-dimensional XY model by winding number and capture the Berezinskii–Kosterlitz–Thouless transition.
Parameter diagnostics of phases and phase transition learning by neural networks
An analysis of neural network-based machine learning schemes for phases and phase transitions in theoretical condensed matter research, focusing on neural networks with a single hidden layer, and demonstrates how the learning-by-confusing scheme can be used, in combination with a simple threshold-value classification method, to diagnose the learning parameters of neural networks.