# Quantum Tensor Networks, Stochastic Processes, and Weighted Automata

```bibtex
@inproceedings{Srinivasan2021QuantumTN,
  title     = {Quantum Tensor Networks, Stochastic Processes, and Weighted Automata},
  author    = {Siddarth Srinivasan and Sandesh Adhikary and Jacob Miller and Guillaume Rabusseau and Byron Boots},
  booktitle = {AISTATS},
  year      = {2021}
}
```

Modeling joint probability distributions over sequences has been studied from many perspectives. The physics community developed matrix product states, a tensor-train decomposition for probabilistic modeling, motivated by the need to tractably model many-body systems. But similar models have also been studied in the stochastic processes and weighted automata literature, with little attention to how these bodies of work relate to each other. We address this gap by showing how stationary or uniform…
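The abstract's central object, a uniform (site-independent) matrix product state / tensor-train factorization of a sequence distribution, can be made concrete with a small numpy sketch. With nonnegative cores it behaves like a hidden-Markov-style model: the probability of a sequence is a chain of matrix products between boundary vectors, and the normalizer comes from the transfer matrix. All names and dimensions below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Uniform tensor-train / MPS sequence model over d symbols:
#   p(x_1 .. x_n) ∝ v_l^T A[x_1] A[x_2] ... A[x_n] v_r
d, D, n = 3, 4, 5          # alphabet size, bond dimension, sequence length
A = rng.random((d, D, D))  # one nonnegative D x D matrix per symbol
v_l = rng.random(D)        # left boundary vector
v_r = rng.random(D)        # right boundary vector

def unnormalized_prob(seq):
    """Contract the tensor train along the sequence."""
    vec = v_l
    for x in seq:
        vec = vec @ A[x]
    return vec @ v_r

# Normalizer: summing over all d**n sequences collapses each site
# into the transfer matrix T = sum_x A[x].
T = A.sum(axis=0)
Z = v_l @ np.linalg.matrix_power(T, n) @ v_r

p = unnormalized_prob([0, 2, 1, 1, 0]) / Z
print(p)
```

The transfer-matrix trick is what keeps normalization tractable: Z costs O(n·D²) via repeated matrix-vector products, instead of summing over all dⁿ sequences.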

## 4 Citations

Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework

- Computer Science, ArXiv
- 2021

A hybrid PGM-TN formalism is introduced that integrates quantum-like correlations into PGM models in a principled manner, using the physically-motivated concept of decoherence, and allows a broad family of probabilistic TN models to be encoded as partially decohered BMs.

Learning Circular Hidden Quantum Markov Models: A Tensor Network Approach

- Computer Science, ArXiv
- 2021

It is shown that c-HQMMs are equivalent to a constrained tensor network model (more precisely, a circular Local Purified State with positive-semidefinite decomposition), which enables an efficient learning algorithm for c-HQMMs.

Lower and Upper Bounds on the Pseudo-Dimension of Tensor Network Models

- Computer Science
- 2021

Upper and lower bounds on the VC-dimension and pseudo-dimension of a large class of TN models for classification, regression, and completion are derived, along with a generalization bound applicable to classification with low-rank matrices as well as linear classifiers based on any of the commonly used tensor decomposition models.

Lower and Upper Bounds on the VC-Dimension of Tensor Network Models

- Computer Science, ArXiv
- 2021

Upper and lower bounds on the VC-dimension and pseudo-dimension of a large class of TN models for classification, regression, and completion are derived, along with a generalization bound applicable to classification with low-rank matrices as well as linear classifiers based on any of the commonly used tensor decomposition models.

## References

Showing 1–10 of 73 references

Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning

- Computer Science, NeurIPS
- 2019

This work provides a rigorous analysis of the expressive power of various tensor-network factorizations of discrete multivariate probability distributions, and introduces locally purified states (LPS), a new factorization inspired by techniques for the simulation of quantum systems with provably better expressive power than all other representations considered.
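The LPS construction described above can be sketched in a few lines of numpy. The key idea is that each symbol's transfer matrix is a sum of Kronecker squares of the (here real-valued, for simplicity) cores over a purification index, so every sequence probability is automatically nonnegative even though the cores themselves have mixed signs. Shapes, names, and the shared boundary vector are illustrative assumptions, not the paper's exact parameterization:

```python
import numpy as np

rng = np.random.default_rng(1)

d, mu, D, n = 2, 2, 3, 4                # symbols, purification dim, bond dim, length
A = rng.standard_normal((d, mu, D, D))  # real cores; entries may be negative
v = rng.standard_normal(D)              # shared boundary vector (assumption)

# Per-symbol transfer matrix on the doubled (D*D)-dimensional space:
#   M[x] = sum_mu A[x, mu] ⊗ A[x, mu]
# Each p(seq) is then a sum of squared scalars, hence >= 0 by construction.
M = np.array([sum(np.kron(A[x, m], A[x, m]) for m in range(mu)) for x in range(d)])
w = np.kron(v, v)

def lps_prob(seq):
    """Unnormalized LPS probability of a symbol sequence."""
    vec = w
    for x in seq:
        vec = vec @ M[x]
    return vec @ w

Z = w @ np.linalg.matrix_power(M.sum(axis=0), n) @ w
p = lps_prob([0, 1, 1, 0]) / Z
print(p)
```

This is exactly what distinguishes LPS from an unconstrained MPS/Born machine: nonnegativity holds for every sequence by construction, rather than having to be checked after the fact.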

Links between multiplicity automata, observable operator models and predictive state representations: a unified learning framework

- Computer Science, J. Mach. Learn. Res.
- 2015

This work presents SMA, OOMs, and PSRs under the common framework of sequential systems, which are an algebraic characterization of multiplicity automata, and establishes a unified approach to learning such models from data.

Matrix-product operators and states: NP-hardness and undecidability.

- Computer Science, Physical Review Letters
- 2014

It is shown that deciding whether a given matrix-product operator represents a physical state (in particular, one with no negative eigenvalues) is provably undecidable in the thermodynamic limit, and that the bounded version of the problem is NP-hard in the system size.

Unsupervised Generative Modeling Using Matrix Product States

- Computer Science, Physical Review X
- 2018

This work proposes a generative model using matrix product states, a tensor network originally proposed for describing entangled quantum states (particularly in one dimension), which enjoys efficient learning analogous to the density matrix renormalization group method.

Hidden Quantum Markov Models and non-adaptive read-out of many-body states

- Physics
- 2010

Stochastic finite-state generators are compressed descriptions of infinite time series. Alternatively, compressed descriptions are given by quantum finite-state generators [K. Wiesner and J. P.…

Quantum learning of classical stochastic processes: The completely positive realization problem

- Computer Science
- 2014

This work generalizes some key results of stochastic realization theory and shows that the problem has deep connections with operator systems theory, giving possible insight into the lifting problem in quotient operator systems.

Learning and Inference in Hilbert Space with Quantum Graphical Models

- Computer Science, NeurIPS
- 2018

Experimental results are presented showing that HSE-HQMMs are competitive with state-of-the-art models like LSTMs and PSRNNs on several datasets, while also providing a nonparametric method for maintaining a probability distribution over continuous-valued features.

Links between probabilistic automata and hidden Markov models: probability distributions, learning models and induction algorithms

- Computer Science, Pattern Recognit.
- 2005

Finitely correlated states on quantum spin chains

- Mathematics
- 1992

We study a construction that yields a class of translation invariant states on quantum spin chains, characterized by the property that the correlations across any bond can be modeled on a…

Towards Quantum Machine Learning with Tensor Networks

- Computer Science, Quantum Science and Technology
- 2019

This work proposes a unified framework in which classical and quantum computing can benefit from the same theoretical and algorithmic developments, and the same model can be trained classically then transferred to the quantum setting for additional optimization.