Tree Tensor Networks for Generative Modeling

@article{Cheng2019TreeTN,
  title={Tree Tensor Networks for Generative Modeling},
  author={Song Cheng and Lei Wang and Tao Xiang and Pan Zhang},
  journal={ArXiv},
  year={2019},
  volume={abs/1901.02217}
}
Matrix product states (MPS), a tensor network originally designed for one-dimensional quantum systems, have recently been proposed for generative modeling of natural data (such as images) in terms of a Born machine. [...] Key Method: We design the tree tensor network to exploit the two-dimensional prior of natural images and develop sweeping learning and sampling algorithms that can be implemented efficiently on Graphics Processing Units (GPUs).
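To make the Born-machine construction concrete, here is a minimal NumPy sketch (a hedged illustration with random tensors, using a plain 1D MPS as a stand-in for the paper's tree network): a chain of tensors defines an unnormalized amplitude psi(x) for each binary configuration x, and the model probability is p(x) = psi(x)^2 / Z, with Z computed exactly by contracting the doubled network.

```python
import numpy as np

# Minimal Born-machine sketch with a random MPS (a 1D stand-in for the
# paper's tree network; shapes and tensors here are illustrative).
# Site i holds a rank-3 tensor A[i] of shape (D_left, 2, D_right); the
# amplitude of a binary string x is the product of the matrices selected
# by its bits, and p(x) = psi(x)^2 / Z with Z = sum_x psi(x)^2.

rng = np.random.default_rng(0)
n, D = 8, 4                                  # 8 binary sites, bond dimension 4
A = ([rng.normal(size=(1, 2, D))]
     + [rng.normal(size=(D, 2, D)) for _ in range(n - 2)]
     + [rng.normal(size=(D, 2, 1))])

def amplitude(x):
    """Contract the chain for one configuration x (a length-n bit sequence)."""
    v = A[0][:, x[0], :]                     # shape (1, D)
    for i in range(1, n):
        v = v @ A[i][:, x[i], :]             # multiply in site i's matrix
    return v.item()

def norm():
    """Z = sum_x psi(x)^2, contracted exactly via doubled transfer matrices."""
    E = np.einsum('asb,csd->acbd', A[0], A[0]).reshape(1, -1)
    for i in range(1, n):
        T = np.einsum('asb,csd->acbd', A[i], A[i])
        E = E @ T.reshape(T.shape[0] * T.shape[1], -1)
    return E.item()

x = rng.integers(0, 2, size=n)
print("p(x) =", amplitude(x) ** 2 / norm())
```

A tree network replaces the chain with a hierarchy of tensors better matched to the two-dimensional structure of images, but the squared-amplitude construction and the exact normalization carry over.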
Tensor Networks for Language Modeling
A uniform matrix product state (u-MPS) model for probabilistic modeling of sequence data that can condition or marginalize sampling on characters at arbitrary locations within a sequence, with no need for approximate sampling methods.
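The exact-sampling property claimed here can be illustrated with a plain (non-uniform) MPS Born machine: each site is drawn from its conditional distribution given the sites already fixed, with the remaining sites marginalized by exact contraction. A hedged sketch with random placeholder tensors follows (a true u-MPS ties all site tensors together; this sketch does not):

```python
import numpy as np

# Exact sequential sampling from an MPS Born machine: site i is drawn
# from p(x_i | x_<i), obtained by contracting the chosen bits on the left
# and summing (marginalizing) every site to the right. No rejection or
# MCMC step is needed. A[i] has shape (D_left, 2, D_right).

rng = np.random.default_rng(1)
n, D = 8, 4
A = ([rng.normal(size=(1, 2, D))]
     + [rng.normal(size=(D, 2, D)) for _ in range(n - 2)]
     + [rng.normal(size=(D, 2, 1))])

# Right environments R[i]: doubled-layer contraction of sites i..n-1
# with the physical index summed out (i.e., marginalized).
R = [None] * (n + 1)
R[n] = np.ones((1, 1))
for i in range(n - 1, -1, -1):
    R[i] = np.einsum('asb,csd,bd->ac', A[i], A[i], R[i + 1])

L = np.ones((1, 1))            # left environment, conditioned on sampled bits
sample = []
for i in range(n):
    # Unnormalized p(x_i = s | sampled prefix), for s in {0, 1}.
    w = np.einsum('ac,asb,csd,bd->s', L, A[i], A[i], R[i + 1])
    w = np.maximum(w, 0)       # guard against tiny negative round-off
    s = rng.choice(2, p=w / w.sum())
    sample.append(int(s))
    L = np.einsum('ac,ab,cd->bd', L, A[i][:, s, :], A[i][:, s, :])
print("sample:", sample)
```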
Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning
This work provides a rigorous analysis of the expressive power of various tensor-network factorizations of discrete multivariate probability distributions, and introduces locally purified states (LPS), a new factorization inspired by techniques for the simulation of quantum systems, with provably better expressive power than all other representations considered.
Learning Phase Transition in Ising Model with Tensor-Network Born Machines
Learning underlying patterns in unlabeled data with generative models is a challenging task. Inspired by the probabilistic nature of quantum physics, a new generative model known as the Born machine has recently been proposed. [...]
Residual Matrix Product State for Machine Learning
This work proposes the residual matrix product state (ResMPS), which can naturally incorporate non-linear activations and dropout layers, and outperforms state-of-the-art TN models in efficiency, stability, and expressive power.
Generative Tensor Network Classification Model for Supervised Machine Learning
A generative TN classification (GTNC) approach for supervised learning that is more efficient than existing TN models, which are in general discriminative, and that serves as an adaptive and universal model of excellent performance.
Quantum-Classical Machine learning by Hybrid Tensor Networks
This work proposes quantum-classical hybrid tensor networks (HTN), which combine tensor networks with classical neural networks in a uniform deep learning framework to overcome the limitations of regular tensor networks in machine learning.
A Multi-Scale Tensor Network Architecture for Classification and Regression
An algorithm for supervised learning using tensor networks that preprocesses the data by coarse-graining through a sequence of wavelet transformations; the optimized MPS model can then be adaptively fine-grained backwards through the layers with essentially no loss in performance.
Differentiable Programming Tensor Networks
This work presents essential techniques for differentiating through tensor network contractions, including stable AD for tensor decompositions and efficient backpropagation through fixed-point iterations, removing the laborious human effort of deriving and implementing analytical gradients for tensor network programs.
Tensor networks and efficient descriptions of classical data
It is found that for text, the mutual information scales as a power law in L with an exponent close to a volume law, indicating that text cannot be efficiently described by 1D tensor networks.
Limitations of gradient-based Born Machines over tensor networks on learning quantum nonlocality
Nonlocality is an important constituent of quantum physics which lies at the heart of many striking features of quantum states, such as entanglement. An important category of highly entangled quantum [...]

References

Showing 1-10 of 42 references
Unsupervised Generative Modeling Using Matrix Product States
This work proposes a generative model using matrix product states, a tensor network originally proposed for describing (particularly one-dimensional) entangled quantum states, which enjoys efficient learning analogous to the density matrix renormalization group method.
Equivalence of restricted Boltzmann machines and tensor network states
The restricted Boltzmann machine (RBM) is one of the fundamental building blocks of deep learning. RBMs find wide application in dimensionality reduction, feature extraction, and recommender systems [...]
Perfect Sampling with Unitary Tensor Networks
Tensor network states are powerful variational Ansätze for many-body ground states of quantum lattice models. The use of Monte Carlo sampling techniques in tensor network approaches significantly [...]
Supervised Learning with Quantum-Inspired Tensor Networks
It is demonstrated how algorithms for optimizing such networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize models for classifying images.
Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines
It is found that RBMs with local sparse connections exhibit high learning efficiency, which supports the application of tensor network states in machine learning; the classical mutual information of the standard MNIST dataset and the quantum Rényi entropy of the corresponding matrix product state (MPS) representations are also estimated.
An exact mapping between the Variational Renormalization Group and Deep Learning
This work constructs an exact mapping between the variational renormalization group, first introduced by Kadanoff, and deep learning architectures based on restricted Boltzmann machines (RBMs), suggesting that deep learning algorithms may employ a generalized RG-like scheme to learn relevant features from data.
Representational Power of Restricted Boltzmann Machines and Deep Belief Networks
This work proves that adding hidden units yields strictly improved modeling power, while a second theorem shows that RBMs are universal approximators of discrete distributions, and suggests a new and less greedy criterion for training RBMs within DBNs.
Learning deep generative models
The aim of the thesis is to demonstrate that deep generative models containing many layers of latent variables and millions of parameters can be learned efficiently, and that the learned high-level feature representations can be successfully applied in a wide spectrum of application domains, including visual object recognition, information retrieval, and classification and regression tasks.
Differentiable Learning of Quantum Circuit Born Machine
This work devises an efficient gradient-based learning algorithm for the quantum circuit Born machine by minimizing the kernelized maximum mean discrepancy loss, and simulates generative modeling of the Bars-and-Stripes dataset and Gaussian mixture distributions using deep quantum circuits.
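For reference, the squared kernel maximum mean discrepancy between model samples X and data samples Y is MMD² = E[k(x, x')] − 2 E[k(x, y)] + E[k(y, y')]. A minimal sketch with a Gaussian kernel, using random bitmaps as placeholders for circuit outputs and the Bars-and-Stripes data:

```python
import numpy as np

# Squared kernel MMD loss for a Born machine, with a Gaussian kernel
# k(x, y) = exp(-||x - y||^2 / (2 sigma^2)). The two sample sets below
# are random placeholders standing in for circuit samples and data.

rng = np.random.default_rng(3)

def gaussian_kernel(X, Y, sigma=1.0):
    # All-pairs kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    return (gaussian_kernel(X, X, sigma).mean()
            - 2 * gaussian_kernel(X, Y, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean())

model_samples = rng.integers(0, 2, size=(64, 9)).astype(float)  # e.g. 3x3 bitmaps
data_samples = rng.integers(0, 2, size=(64, 9)).astype(float)
print("MMD^2 =", mmd2(model_samples, data_samples))
```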
Variational Monte Carlo with the multiscale entanglement renormalization ansatz
Monte Carlo sampling techniques have been proposed as a strategy to reduce the computational cost of contractions in tensor network approaches to solving many-body systems. Here, we put forward a [...]