# Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework

```bibtex
@article{Miller2021ProbabilisticGM,
  title   = {Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework},
  author  = {Jacob Miller and Geoffrey Roeder and Tai-Danae Bradley},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2106.15666}
}
```

We investigate a correspondence between two formalisms for discrete probabilistic modeling: probabilistic graphical models (PGMs) and tensor networks (TNs), a powerful modeling framework for simulating complex quantum systems. The graphical calculus of PGMs and TNs exhibits many similarities, with discrete undirected graphical models (UGMs) being a special case of TNs. However, more general probabilistic TN models such as Born machines (BMs) employ complex-valued hidden states to produce novel…
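The contrast between the two formalisms can be made concrete in a few lines: a discrete UGM contracts nonnegative factors directly into a probability, while a Born machine computes a complex amplitude with a tensor network (here a matrix product state) and squares its modulus. The following is a minimal illustrative sketch, not code from the paper; all names and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, D = 4, 2, 3  # sites, local (physical) dimension, bond dimension

# One complex MPS core per site, shaped (left bond, physical index, right bond).
cores = [rng.normal(size=(D, d, D)) + 1j * rng.normal(size=(D, d, D))
         for _ in range(n)]

def amplitude(x):
    """Contract the MPS along the outcome string x to a single complex scalar."""
    v = np.ones(D, dtype=complex) / np.sqrt(D)  # boundary vector
    for core, xi in zip(cores, x):
        v = v @ core[:, xi, :]
    return v.sum()

# Born rule: probability is the squared modulus of the amplitude, normalized
# over all d**n outcome strings (feasible only for this toy size).
outcomes = [tuple(int(b) for b in np.base_repr(i, d).zfill(n))
            for i in range(d ** n)]
p = np.array([abs(amplitude(x)) ** 2 for x in outcomes])
p /= p.sum()
```

The squared modulus guarantees nonnegativity even though the hidden (bond) degrees of freedom are complex-valued, which is exactly what lets BMs go beyond the nonnegative-factor structure of UGMs.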

## 3 Citations

### Decohering tensor network quantum machine learning models

- Computer Science, Physics; Quantum Machine Intelligence
- 2023

Numerical evidence is presented that the fully decohered unitary tree tensor network (TTN) with two ancillas performs at least as well as the non-decohered unitary TTN, suggesting that it is beneficial to add at least two ancillas to the unitary TTN regardless of the amount of decoherence that may consequently be introduced.

### Patch-based medical image segmentation using Quantum Tensor Networks

- Computer Science; ArXiv
- 2021

This work formulates image segmentation in a supervised setting with tensor networks, first lifting the pixels in image patches to exponentially high-dimensional feature spaces and then using a linear decision hyperplane to classify the input pixels into foreground and background classes.

### Patch-based Medical Image Segmentation using Matrix Product State Tensor Networks

- Computer Science; Machine Learning for Biomedical Imaging
- 2022

This work formulates image segmentation in a supervised setting with tensor networks, first lifting the pixels in image patches to exponentially high-dimensional feature spaces and then using a linear decision hyperplane to classify the input pixels into foreground and background classes.

## 41 References

### Quantum Tensor Networks, Stochastic Processes, and Weighted Automata

- Computer Science; AISTATS
- 2021

This work shows how stationary or uniform versions of popular quantum tensor network models have equivalent representations in the stochastic processes and weighted automata literature, in the limit of infinitely long sequences.
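The equivalence this reference draws on is easy to see in miniature: a uniform (site-independent) MPS scores a sequence exactly the way a weighted finite automaton does, by multiplying one transition matrix per observed symbol between initial and final weight vectors. This sketch uses assumed names and random weights purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 3  # hidden (bond) dimension = number of automaton states

# One transition matrix per symbol of the alphabet {a, b}.
A = {s: rng.random((D, D)) for s in "ab"}
alpha = rng.random(D)  # initial weight vector
omega = rng.random(D)  # final weight vector

def score(word):
    """Weighted-automaton score of a word: alpha @ A[w1] @ ... @ A[wT] @ omega."""
    v = alpha
    for s in word:
        v = v @ A[s]
    return float(v @ omega)
```

With nonnegative weights this is a (possibly unnormalized) hidden-Markov-style model; allowing complex entries and squaring the result turns the same object into a uniform-MPS Born machine.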

### Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning

- Computer Science; NeurIPS
- 2019

This work provides a rigorous analysis of the expressive power of various tensor-network factorizations of discrete multivariate probability distributions, and introduces locally purified states (LPS), a new factorization inspired by techniques for the simulation of quantum systems with provably better expressive power than all other representations considered.

### Tensor Networks for Probabilistic Sequence Modeling

- Computer Science; AISTATS
- 2021

A novel generative algorithm is introduced giving trained u-MPS the ability to efficiently sample from a wide variety of conditional distributions, each one defined by a regular expression, which permits the generation of richly structured text in a manner that has no direct analogue in current generative models.

### Enhancing Generative Models via Quantum Correlations

- Physics; Physical Review X
- 2022

This work shows that quantum correlations such as nonlocality and contextuality provide a provable advantage in expressive power for generative models, separating minimally quantum-extended models from classical counterparts such as Bayesian networks.

### Unsupervised Generative Modeling Using Matrix Product States

- Computer Science; Physical Review X
- 2018

This work proposes a generative model using matrix product states, which is a tensor network originally proposed for describing (particularly one-dimensional) entangled quantum states, and enjoys efficient learning analogous to the density matrix renormalization group method.

### Perfect Sampling with Unitary Tensor Networks

- Computer Science
- 2012

This work proposes perfect sampling schemes, with vanishing equilibration and autocorrelation times, for unitary tensor networks, namely, unitary versions of the matrix product state and tree tensor network (TTN), and the multiscale entanglement renormalization Ansatz (MERA).

### Modeling sequences with quantum states: a look under the hood

- Computer Science; Mach. Learn. Sci. Technol.
- 2020

An understanding of the extra information contained in the reduced densities allows the authors to examine the mechanics of this DMRG algorithm and to study the generalization error of the resulting model.

### Norm-Observable Operator Models

- Mathematics; Neural Computation
- 2010

A novel variant of OOMs, called norm-observable operator models (NOOMs), is proposed which avoids the negative probability problem (NPP) by design, and it is proved that NOOMs capture all Markov chain (MC) describable processes.

### Duality of Graphical Models and Tensor Networks

- Computer Science, Mathematics; Information and Inference: A Journal of the IMA
- 2018

This work establishes a duality between tensor networks and undirected graphical models with discrete variables, and shows that belief propagation corresponds to a known algorithm for tensor network contraction.
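The core of this duality fits in a toy example: the partition function of a small pairwise UGM is literally a full tensor-network contraction of its (nonnegative) factor tensors. The sketch below is a hypothetical illustration with assumed names, not code from the reference.

```python
import numpy as np

rng = np.random.default_rng(1)

# Chain UGM on three binary variables a-b-c with pairwise factors.
phi_ab = rng.random((2, 2))  # factor phi(a, b)
phi_bc = rng.random((2, 2))  # factor phi(b, c)

# Brute-force partition function: sum the product of factors over all
# joint assignments of (a, b, c).
Z_brute = sum(phi_ab[a, b] * phi_bc[b, c]
              for a in range(2) for b in range(2) for c in range(2))

# The same quantity as a tensor-network contraction: the shared index b is
# the internal "bond", and all indices are summed out.
Z_tn = np.einsum("ab,bc->", phi_ab, phi_bc)
```

Belief propagation on the chain corresponds to contracting this network edge by edge, which is why the correspondence extends from objects (distributions) to algorithms.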