Tensor networks for unsupervised machine learning
@article{Liu2021TensorNF, title={Tensor networks for unsupervised machine learning}, author={Jing Liu and Sujie Li and Jiang Zhang and Pan Zhang}, journal={Physical Review E}, year={2021}, volume={107}, number={1}, pages={L012103}}
Modeling the joint distribution of high-dimensional data is a central task in unsupervised machine learning. In recent years, much interest has been attracted to developing learning models based on tensor networks, which have the advantages of a principled understanding of expressive power through entanglement properties, and of serving as a bridge connecting classical and quantum computation. Despite the great potential, however, existing tensor network models for unsupervised machine…
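The construction at the heart of such models is the Born machine: a tensor network supplies an amplitude ψ(x) for each configuration, and the model probability is p(x) = |ψ(x)|²/Z. The sketch below is a minimal NumPy illustration assuming a matrix product state; the helpers `random_mps`, `amplitude`, and `partition_function` are illustrative names, not the paper's code. It shows how both the amplitude and the normalization Z can be contracted efficiently.

```python
# Minimal MPS Born machine sketch: p(x) = |psi(x)|^2 / Z.
# Tensor names and shapes are illustrative, not the paper's implementation.
import numpy as np

def random_mps(n_sites, d=2, bond=4, seed=0):
    """List of site tensors A_k with shape (d, D_left, D_right); boundary bonds are 1."""
    rng = np.random.default_rng(seed)
    dims = [1] + [bond] * (n_sites - 1) + [1]
    return [rng.normal(size=(d, dims[k], dims[k + 1])) for k in range(n_sites)]

def amplitude(mps, x):
    """psi(x): contract the chain of matrices selected by the configuration x."""
    m = mps[0][x[0]]                        # shape (1, D_1)
    for k in range(1, len(mps)):
        m = m @ mps[k][x[k]]                # accumulate (1, D_k)
    return m[0, 0]

def partition_function(mps):
    """Z = sum_x |psi(x)|^2 via transfer matrices (polynomial in the bond dimension)."""
    E = np.ones((1, 1))                     # left environment
    for A in mps:
        # E_new[c, d] = sum_{x, a, b} E[a, b] A[x, a, c] A[x, b, d]
        E = np.einsum('ab,xac,xbd->cd', E, A, A)
    return E[0, 0]

mps = random_mps(n_sites=6)
x = np.array([0, 1, 1, 0, 1, 0])
Z = partition_function(mps)
print("p(x) =", amplitude(mps, x) ** 2 / Z)
```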
9 Citations
Generative modeling with projected entangled-pair states
- Computer ScienceArXiv
- 2022
Techniques from many-body physics have always played a major role in the development of generative machine learning; this role can be traced back to the parallels between the problems one has to deal with in the two fields.
Generalization and Overfitting in Matrix Product State Machine Learning Architectures
- Computer ScienceArXiv
- 2022
It is speculated that the generalization properties of MPS depend on the properties of the data: with one-dimensional data (for which the MPS ansatz is most suitable) MPS is prone to overfitting, while with more complex data that cannot be parameterized by an MPS exactly, overfitting may be much less significant.
Deep tensor networks with matrix product operators
- Computer Science
- 2022
Deep tensor networks are introduced: exponentially wide neural networks based on a tensor network representation of the weight matrices. Random crop training improves the robustness of uniform tensor network models to changes in image size and aspect ratio.
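As a rough illustration of the parameterization this work builds on, the sketch below stores a weight matrix as a matrix product operator (MPO) and rebuilds the dense matrix for comparison. The helper names and tensor layout are assumptions made for illustration; an actual implementation would apply the MPO to the input without ever reconstructing the dense matrix.

```python
# Sketch: a d^N x d^N weight matrix stored as a chain of small four-index tensors.
import numpy as np

def random_mpo(n_sites, d=2, bond=3, seed=0):
    """Site tensors with index order (left bond, out, in, right bond)."""
    rng = np.random.default_rng(seed)
    dims = [1] + [bond] * (n_sites - 1) + [1]
    return [rng.normal(size=(dims[k], d, d, dims[k + 1])) for k in range(n_sites)]

def mpo_to_matrix(mpo, d=2):
    """Contract the bond indices to recover the dense d^N x d^N weight matrix."""
    full = mpo[0]
    for M in mpo[1:]:
        full = np.tensordot(full, M, axes=(-1, 0))   # join neighbouring bonds
    full = full.squeeze(axis=0).squeeze(axis=-1)     # drop trivial boundary bonds
    n = len(mpo)
    # axes currently alternate (out_1, in_1, ..., out_N, in_N); group outs then ins
    perm = list(range(0, 2 * n, 2)) + list(range(1, 2 * n, 2))
    return full.transpose(perm).reshape(d ** n, d ** n)

mpo = random_mpo(n_sites=4)
W = mpo_to_matrix(mpo)      # 16 x 16 matrix from 96 MPO parameters instead of 256
x = np.random.default_rng(1).normal(size=16)
print((W @ x).shape)
```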
Graphical calculus for Tensor Network Contractions
- Computer Science, Physics
- 2022
This dissertation investigates how effective existing procedures are at enhancing tensor network contractions and proposes new strategies based on these observations, which are evaluated using a variety of circuits, including the Sycamore circuits used by Google to demonstrate quantum supremacy in 2019.
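Contraction order is the main quantity such procedures optimize. The toy example below is not taken from the dissertation; it only uses NumPy's `einsum_path` to show how strongly the floating-point cost of the same three-tensor contraction depends on the order chosen.

```python
# Toy illustration: the cost of contracting A-B-C depends heavily on the order.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(10, 1000))
B = rng.normal(size=(1000, 5))
C = rng.normal(size=(5, 1000))

# Contracting (A B) first costs ~1e5 FLOPs, (B C) first ~1.5e7.
# 'optimal' searches over contraction orders and reports the chosen pairwise path.
path, report = np.einsum_path('ij,jk,kl->il', A, B, C, optimize='optimal')
print(report)
```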
Permutation Search of Tensor Network Structures via Local Sampling
- Computer ScienceICML
- 2022
Theoretically, the counting and metric properties of the search spaces of TN-PS are proved, and a novel meta-heuristic algorithm is proposed in which the search is done by random sampling in a neighborhood established by the theory, recurrently updating the neighborhood until convergence.
Grokking phase transitions in learning local rules with gradient descent
- Computer ScienceArXiv
- 2022
A tensor-network map is introduced that connects the proposed grokking setup with standard (perceptron) statistical learning theory; it is shown that grokking is a consequence of the locality of the teacher model, and the critical exponent and grokking-time distributions are determined numerically.
A Practical Guide to the Numerical Implementation of Tensor Networks I: Contractions, Decompositions, and Gauge Freedom
- Computer ScienceFrontiers in Applied Mathematics and Statistics
- 2022
An introduction to the contraction of tensor networks, to optimal tensor decompositions, and to the manipulation of gauge degrees of freedom in tensor networks is presented.
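A minimal sketch of one of the basic manipulations covered by such a guide, splitting a tensor across a chosen bipartition with a truncated SVD, is given below; the function name `split_tensor` and the choice to absorb the singular values into the left factor are illustrative assumptions.

```python
# Sketch: factor a tensor into two pieces across a chosen bipartition of its indices.
import numpy as np

def split_tensor(T, left_axes, chi):
    """Factor T ~= L . R across left_axes / remaining axes, keeping chi singular values."""
    right_axes = [a for a in range(T.ndim) if a not in left_axes]
    dims_l = [T.shape[a] for a in left_axes]
    dims_r = [T.shape[a] for a in right_axes]
    M = T.transpose(left_axes + right_axes).reshape(np.prod(dims_l), np.prod(dims_r))
    U, s, Vh = np.linalg.svd(M, full_matrices=False)
    chi = min(chi, len(s))
    L = (U[:, :chi] * s[:chi]).reshape(dims_l + [chi])   # absorb singular values left
    R = Vh[:chi, :].reshape([chi] + dims_r)
    return L, R

rng = np.random.default_rng(0)
T = rng.normal(size=(3, 4, 5, 6))
L, R = split_tensor(T, left_axes=[0, 1], chi=8)
approx = np.tensordot(L, R, axes=1)                      # recombine: shape (3, 4, 5, 6)
print(np.linalg.norm(T - approx) / np.linalg.norm(T))    # truncation error
```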
Control flow in active inference systems
- Computer Science
- 2023
It is shown here that when systems are described as executing active inference driven by the free-energy principle (and hence can be considered Bayesian prediction-error minimizers), their control flow systems can always be represented as tensor networks (TNs).
Deep tensor networks with matrix product operators
- Computer ScienceQuantum Machine Intelligence
- 2022
Deep tensor networks are introduced, which are exponentially wide neural networks based on the tensor network representation of the weight matrices and it is shown that the latter generalises well to different input sizes.
References
Showing 1-10 of 48 references
Unsupervised Generative Modeling Using Matrix Product States
- Computer SciencePhysical Review X
- 2018
This work proposes a generative model using matrix product states, a tensor network originally proposed for describing (particularly one-dimensional) entangled quantum states; the model enjoys efficient learning analogous to the density matrix renormalization group method.
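One practical appeal of this model is that exact, Markov-chain-free sampling is possible. The sketch below draws a sample from a random MPS Born machine by computing each conditional from precomputed right environments; the tensor layout and variable names are illustrative, not the authors' implementation.

```python
# Direct (ancestral) sampling from an MPS Born machine: each bit is drawn from its
# exact conditional, so no Markov chain is needed.
import numpy as np

rng = np.random.default_rng(0)
n, d, D = 6, 2, 4
dims = [1] + [D] * (n - 1) + [1]
mps = [rng.normal(size=(d, dims[k], dims[k + 1])) for k in range(n)]

# Right environments: R[k] sums the squared amplitudes of sites k..n-1.
R = [None] * (n + 1)
R[n] = np.ones((1, 1))
for k in range(n - 1, -1, -1):
    A = mps[k]
    R[k] = np.einsum('xac,xbd,cd->ab', A, A, R[k + 1])

x = []
l = np.ones(1)                                   # left boundary vector
for k in range(n):
    A = mps[k]
    vecs = [l @ A[v] for v in range(d)]          # prefix contracted with each value of x_k
    w = np.array([vec @ R[k + 1] @ vec for vec in vecs])
    p = w / w.sum()                              # exact conditional p(x_k | x_<k)
    v = rng.choice(d, p=p)
    x.append(v)
    l = vecs[v] / np.linalg.norm(vecs[v])        # renormalize to avoid under/overflow
print("sample:", x)
```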
Tree Tensor Networks for Generative Modeling
- Computer SciencePhysical Review B
- 2019
It is shown that the TTN is superior to MPSs for generative modeling at capturing pixel correlations in natural images, as well as at giving better log-likelihood scores on standard datasets of handwritten digits.
Machine learning by unitary tensor network of hierarchical tree structure
- Computer Science, PhysicsNew Journal of Physics
- 2019
This work trains two-dimensional hierarchical TNs to solve image recognition problems, using a training algorithm derived from the multi-scale entanglement renormalization ansatz, and introduces mathematical connections among quantum many-body physics, quantum information theory, and machine learning.
Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines
- Computer ScienceEntropy
- 2018
This work estimates the classical mutual information of the standard MNIST dataset and the quantum Rényi entropy of the corresponding matrix product state (MPS) representations, and finds that RBMs with local sparse connections exhibit high learning efficiency, which supports the application of tensor network states to machine learning problems.
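For context, the Rényi-2 entropy across a bipartition can be computed directly from the Schmidt spectrum of the cut. The toy sketch below does this for a small dense state vector rather than an MNIST-trained MPS, purely to make the quantity concrete.

```python
# Renyi-2 entropy across a bipartition of a (small, dense) state vector.
import numpy as np

def renyi2_entropy(psi, n_left, d=2):
    """S_2 = -log sum_i p_i^2, with p_i the normalized squared singular values of the cut."""
    M = psi.reshape(d ** n_left, -1)
    s = np.linalg.svd(M, compute_uv=False)
    p = s ** 2 / np.sum(s ** 2)          # Schmidt spectrum of the bipartition
    return -np.log(np.sum(p ** 2))

rng = np.random.default_rng(0)
psi = rng.normal(size=2 ** 8)            # 8 binary "pixels"; normalization handled above
print(renyi2_entropy(psi, n_left=4))
```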
Solving Statistical Mechanics using Variational Autoregressive Networks
- Computer SciencePhysical review letters
- 2019
This work proposes a general framework for solving statistical mechanics of systems with finite size using autoregressive neural networks, which computes variational free energy, estimates physical quantities such as entropy, magnetizations and correlations, and generates uncorrelated samples all at once.
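The estimator itself is simple once the model provides exact samples and exact log-probabilities. The sketch below assumes a tiny masked-logistic autoregressive model and an open 1D Ising chain as stand-ins for the paper's architecture and target systems, and estimates the variational free energy F = ⟨E(x) + T log q(x)⟩_q.

```python
# Variational free-energy estimator with a toy autoregressive Bernoulli model.
import numpy as np

rng = np.random.default_rng(0)
n, beta = 8, 1.0
W = np.tril(rng.normal(size=(n, n)) * 0.1, k=-1)   # strictly lower-triangular weights
b = rng.normal(size=n) * 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sample_and_logq(n_samples):
    """Ancestral sampling: x_i ~ Bernoulli(sigmoid(b_i + W_i . x_<i)), with exact log q(x)."""
    x = np.zeros((n_samples, n))
    logq = np.zeros(n_samples)
    for i in range(n):
        p1 = sigmoid(b[i] + x @ W[i])               # only x_<i contributes (mask)
        xi = (rng.random(n_samples) < p1).astype(float)
        x[:, i] = xi
        logq += xi * np.log(p1) + (1 - xi) * np.log(1 - p1)
    return x, logq

def ising_energy(x):
    """Open 1D Ising chain with spins s = 2x - 1 and coupling J = 1."""
    s = 2 * x - 1
    return -np.sum(s[:, :-1] * s[:, 1:], axis=1)

x, logq = sample_and_logq(5000)
F_var = np.mean(ising_energy(x) + logq / beta)      # upper bound on the true free energy
print("F_var per spin:", F_var / n)
```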
Deep autoregressive models for the efficient variational simulation of many-body quantum systems
- Computer SciencePhysical review letters
- 2020
This work proposes a specialized neural-network architecture that supports efficient and exact sampling, completely circumventing the need for Markov-chain sampling, and demonstrates the ability to obtain accurate results on larger system sizes than those currently accessible to neural-network quantum states.
The Neural Autoregressive Distribution Estimator
- Computer ScienceAISTATS
- 2011
A new approach for modeling the distribution of high-dimensional vectors of discrete variables inspired by the restricted Boltzmann machine, which outperforms other multivariate binary distribution estimators on several datasets and performs similarly to a large (but intractable) RBM.
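A minimal NumPy version of the NADE factorization is sketched below: the hidden activation is updated incrementally as each bit is absorbed, so evaluating the full log-likelihood costs O(n_visible × n_hidden). The random parameter values are placeholders.

```python
# Sketch of the NADE factorization p(x) = prod_i p(x_i | x_<i) with shared hidden units.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nade_log_prob(x, W, V, b, c):
    """x: binary vector; W: (H, D), V: (D, H), b: (D,), c: (H,)."""
    a = c.copy()                     # hidden pre-activation for the empty prefix
    logp = 0.0
    for i in range(len(x)):
        h = sigmoid(a)               # hidden units conditioned on x_<i
        p1 = sigmoid(b[i] + V[i] @ h)
        logp += x[i] * np.log(p1) + (1 - x[i]) * np.log(1 - p1)
        a += W[:, i] * x[i]          # absorb x_i for the next conditional
    return logp

rng = np.random.default_rng(0)
D, H = 10, 16
W, V = rng.normal(size=(H, D)) * 0.1, rng.normal(size=(D, H)) * 0.1
b, c = np.zeros(D), np.zeros(H)
x = rng.integers(0, 2, size=D)
print(nade_log_prob(x, W, V, b, c))
```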
On the quantitative analysis of deep belief networks
- Computer ScienceICML '08
- 2008
It is shown that annealed importance sampling (AIS) can be used to efficiently estimate the partition function of an RBM, and a novel AIS scheme for comparing RBMs with different architectures is presented.
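To make the AIS idea concrete, the sketch below estimates the log partition function of a small open Ising chain, where the exact answer is known, rather than of an RBM; the annealing schedule and single-spin Metropolis kernel are illustrative choices.

```python
# Toy AIS: anneal from beta = 0 (Z_0 = 2^n known) to the target beta, accumulating weights.
import numpy as np

rng = np.random.default_rng(0)
n, beta_target, n_chains, n_steps = 10, 0.5, 200, 100
betas = np.linspace(0.0, beta_target, n_steps + 1)

def energy(s):
    return -np.sum(s[:, :-1] * s[:, 1:], axis=1)       # open chain, J = 1

s = rng.choice([-1, 1], size=(n_chains, n))             # exact samples at beta = 0
logw = np.zeros(n_chains)
for k in range(n_steps):
    logw += -(betas[k + 1] - betas[k]) * energy(s)      # importance-weight update
    # one Metropolis sweep at the new inverse temperature to stay near equilibrium
    for i in range(n):
        flipped = s.copy()
        flipped[:, i] *= -1
        dE = energy(flipped) - energy(s)
        accept = rng.random(n_chains) < np.exp(-betas[k + 1] * dE)
        s[accept] = flipped[accept]

logZ_est = n * np.log(2) + np.log(np.mean(np.exp(logw)))
logZ_exact = n * np.log(2) + (n - 1) * np.log(np.cosh(beta_target))  # exact open-chain Z
print(logZ_est, logZ_exact)
```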
Solving statistical mechanics on sparse graphs with feedback-set variational autoregressive networks.
- Computer SciencePhysical review. E
- 2021
The method extracts a small feedback vertex set (FVS) from the sparse graph, converts the sparse system into a much smaller system with dense many-body interactions and an effective energy for every configuration of the FVS, and then learns a variational distribution parameterized by neural networks to approximate the original Boltzmann distribution.
Solving the quantum many-body problem with artificial neural networks
- Computer Science, PhysicsScience
- 2017
A variational representation of quantum states based on artificial neural networks with a variable number of hidden neurons is introduced, together with a reinforcement-learning scheme that is capable of both finding the ground state and describing the unitary time evolution of complex interacting quantum systems.
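The ansatz in question is the restricted-Boltzmann-machine wavefunction. The sketch below evaluates its (unnormalized) amplitude for a spin configuration after tracing out the hidden units; the parameters here are real and random for simplicity, whereas the original work allows complex weights.

```python
# Sketch of the RBM wavefunction ansatz used by neural-network quantum states.
import numpy as np

def rbm_amplitude(s, a, b, W):
    """psi(s) = exp(a . s) * prod_j 2 cosh(b_j + sum_i W_ij s_i), hidden units traced out."""
    theta = b + s @ W
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 12
a = rng.normal(size=n_visible) * 0.1
b = rng.normal(size=n_hidden) * 0.1
W = rng.normal(size=(n_visible, n_hidden)) * 0.1

s = rng.choice([-1, 1], size=n_visible)          # a spin configuration
print(rbm_amplitude(s, a, b, W))
```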