# From Probabilistic Graphical Models to Generalized Tensor Networks for Supervised Learning

@article{Glasser2018FromPG, title={From Probabilistic Graphical Models to Generalized Tensor Networks for Supervised Learning}, author={Ivan Glasser and Nicola Pancotti and Juan Ignacio Cirac}, journal={IEEE Access}, year={2020}, volume={8}, pages={68169-68182} }

Tensor networks have found wide use in a variety of applications in physics and computer science, recently leading to both theoretical insights and practical algorithms in machine learning. In this work we explore the connection between tensor networks and probabilistic graphical models, and show that it motivates the definition of generalized tensor networks where information from a tensor can be copied and reused in other parts of the network. We discuss the relationship between…
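To make the supervised-learning setting concrete, the sketch below shows the standard matrix product state (MPS) classifier that this line of work builds on: each input value is mapped to a small local feature vector, and the product of these vectors is contracted against a train of tensors, one of which carries an extra output leg for class scores. This is an illustrative NumPy sketch, not the paper's implementation; the feature map, tensor shapes, and function names are assumptions chosen for clarity.

```python
import numpy as np

def local_feature(x):
    # Map a value in [0, 1] to a 2-dimensional local feature vector
    # (a common choice in MPS-based supervised learning).
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def mps_scores(mps, x, label_site):
    """Contract an MPS with a product of local feature vectors.

    mps        : list of N tensors of shape (D_left, d, D_right),
                 except mps[label_site] of shape (D_left, d, n_classes, D_right).
    x          : 1-D array of N input values in [0, 1].
    label_site : index of the tensor carrying the open class leg.
    Returns a vector of n_classes unnormalized class scores.
    """
    env = np.ones((1,))          # left boundary: trivial bond of dimension 1
    passed_label = False
    for i, A in enumerate(mps):
        phi = local_feature(x[i])
        if i == label_site:
            # Contract the physical leg, keep the class leg open.
            T = np.einsum('lscr,s->lcr', A, phi)
            env = np.einsum('l,lcr->cr', env, T)
            passed_label = True
        elif not passed_label:
            env = env @ np.einsum('lsr,s->lr', A, phi)
        else:
            env = np.einsum('cl,lr->cr', env, np.einsum('lsr,s->lr', A, phi))
    return env[:, 0]             # close the trivial right boundary bond

# Tiny example: N=4 inputs, bond dimension 3, 2 classes, class leg on site 1.
rng = np.random.default_rng(0)
N, D, d, C = 4, 3, 2, 2
mps = [rng.normal(size=(1 if i == 0 else D, d, C, 1 if i == N - 1 else D))
       if i == 1 else
       rng.normal(size=(1 if i == 0 else D, d, 1 if i == N - 1 else D))
       for i in range(N)]
scores = mps_scores(mps, rng.random(N), label_site=1)
print(scores.shape)  # (2,)
```

The generalized tensor networks introduced in the paper go beyond this linear contraction by allowing a tensor's information to be copied and fed into several parts of the network, which an ordinary MPS contraction like the one above cannot express.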

## 49 Citations

### Adaptive Tensor Learning with Tensor Networks

- Computer Science, ArXiv
- 2020

A generic and efficient adaptive algorithm for tensor learning, based on a simple greedy approach that optimizes a differentiable loss function, starting from a rank-one tensor and successively identifying the most promising tensor-network edges for small rank increments.

### Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning

- Computer Science, NeurIPS
- 2019

This work provides a rigorous analysis of the expressive power of various tensor-network factorizations of discrete multivariate probability distributions, and introduces locally purified states (LPS), a new factorization inspired by techniques for the simulation of quantum systems with provably better expressive power than all other representations considered.

### Presence and Absence of Barren Plateaus in Tensor-Network Based Machine Learning

- Computer Science, Physical Review Letters
- 2022

This work rigorously proves that barren plateaus prevail in the training process of machine learning algorithms with global loss functions, revealing a crucial aspect of tensor-network-based machine learning in a rigorous fashion.

### Tensor Networks for Probabilistic Sequence Modeling

- Computer Science, AISTATS
- 2021

A novel generative algorithm is introduced giving trained u-MPS the ability to efficiently sample from a wide variety of conditional distributions, each defined by a regular expression; this permits the generation of richly structured text in a manner that has no direct analogue in current generative models.

### Tensor networks and efficient descriptions of classical data

- Computer Science, ArXiv
- 2021

It is found that for text, the mutual information scales as a power law in L with an exponent close to a volume law, indicating that text cannot be efficiently described by 1D tensor networks.

### Deep convolutional tensor network

- Computer Science, ArXiv
- 2020

A novel tensor-network-based model for image classification, the deep convolutional tensor network (DCTN), which features parameter sharing, locality, and depth, and is based on the entangled plaquette states (EPS) tensor network.

### Lower and Upper Bounds on the Pseudo-Dimension of Tensor Network Models

- Computer Science, NeurIPS
- 2021

Upper and lower bounds on the VC-dimension and pseudo-dimension of a large class of TN models for classification, regression, and completion are derived, along with a generalization bound applicable to classification with low-rank matrices as well as linear classifiers based on any of the commonly used tensor decomposition models.

### Differentiable programming of isometric tensor networks

- Computer Science, Mach. Learn. Sci. Technol.
- 2022

By introducing several gradient-based optimization methods for isometric tensor networks and comparing them with the Evenbly–Vidal method, it is shown that automatic differentiation performs better in both stability and accuracy.

### Residual Matrix Product State for Machine Learning

- Computer Science, ArXiv
- 2020

This work proposes the residual matrix product state (ResMPS), which naturally incorporates non-linear activations and dropout layers, and outperforms state-of-the-art TN models in efficiency, stability, and expressive power.

### Generative machine learning with tensor networks: Benchmarks on near-term quantum computers

- Computer Science
- 2020

This work lays out a framework for designing and optimizing TN-based QAML models using classical techniques, and demonstrates greedy heuristics for compiling with a given topology and gate set that outperform known generic methods in terms of the number of entangling gates.

## References

Showing 1–10 of 66 references

### Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning

- Computer Science, NeurIPS
- 2019

This work provides a rigorous analysis of the expressive power of various tensor-network factorizations of discrete multivariate probability distributions, and introduces locally purified states (LPS), a new factorization inspired by techniques for the simulation of quantum systems with provably better expressive power than all other representations considered.

### On the Expressive Power of Deep Learning: A Tensor Analysis

- Computer Science, COLT 2016
- 2015

It is proved that, besides a negligible set, all functions that can be implemented by a deep network of polynomial size require exponential size in order to be realized (or even approximated) by a shallow network.

### Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 2 Applications and Future Perspectives

- Computer Science, Found. Trends Mach. Learn.
- 2017

This monograph builds on Tensor Networks for Dimensionality Reduction and Large-scale Optimization by discussing tensor network models for super-compressed higher-order representation of data/parameters and cost functions, together with an outline of their applications in machine learning and data analytics.

### Equivalence of restricted Boltzmann machines and tensor network states

- Computer Science
- 2018

This work builds a bridge between RBMs and the tensor network states (TNS) widely used in quantum many-body physics research, and devises efficient algorithms to translate an RBM into commonly used TNS.

### Supervised Learning with Tensor Networks

- Computer Science, NIPS
- 2016

It is demonstrated how algorithms for optimizing tensor networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize non-linear kernel learning models.

### Towards quantum machine learning with tensor networks

- Computer Science, Quantum Science and Technology
- 2019

A unified framework is proposed in which classical and quantum computing can benefit from the same theoretical and algorithmic developments, and the same model can be trained classically then transferred to the quantum setting for additional optimization.

### Analysis and Design of Convolutional Networks via Hierarchical Tensor Decompositions

- Computer Science, ArXiv
- 2017

This paper overviews a series of works written by the authors, that through an equivalence to hierarchical tensor decompositions, analyze the expressive efficiency and inductive bias of various convolutional network architectural features (depth, width, strides and more).

### Categorical Tensor Network States

- Computer Science, ArXiv
- 2010

This work presents a new and general method to factor an n-body quantum state into a tensor network of clearly defined building blocks and uses the solution to expose a previously unknown and large class of quantum states which can be sampled efficiently and exactly.

### Quantum Entanglement in Deep Learning Architectures

- Computer Science, Physical Review Letters
- 2019

The results show that contemporary deep learning architectures, in the form of deep convolutional and recurrent networks, can efficiently represent highly entangled quantum systems and can support volume-law entanglement scaling, polynomially more efficiently than presently employed RBMs.

### Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines

- Computer Science, Entropy
- 2018

This work estimates the classical mutual information of the standard MNIST dataset and the quantum Rényi entropy of corresponding matrix product state (MPS) representations, and finds that RBMs with local sparse connections exhibit high learning efficiency, which supports the application of tensor-network states in machine learning problems.