Modeling Sequences with Quantum States: A Look Under the Hood

@article{Bradley2020ModelingSW,
  title={Modeling Sequences with Quantum States: A Look Under the Hood},
  author={Tai-Danae Bradley and Edwin Miles Stoudenmire and John Terilla},
  journal={Mach. Learn. Sci. Technol.},
  year={2020},
  volume={1},
  pages={035008}
}
Classical probability distributions on sets of sequences can be modeled using quantum states. Here, we do so with a quantum state that is pure and entangled. Because it is entangled, the reduced densities that describe subsystems also carry information about the complementary subsystem. This is in contrast to the classical marginal distributions on a subsystem in which information about the complementary system has been integrated out and lost. A training algorithm based on the density matrix… 
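
As a toy illustration of the abstract's central point (not code from the paper; the joint distribution, variable names, and numbers below are made up), the following Python sketch builds the pure state psi(x) = sqrt(p(x)) for a distribution on length-2 bitstrings and compares the classical marginal on the first bit with the reduced density matrix, whose off-diagonal entries retain information about the complementary subsystem:

import numpy as np

# Made-up joint distribution p(x1, x2) on two bits.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

# Born rule: |psi(x1, x2)|^2 = p(x1, x2). This state is entangled,
# since psi has matrix rank 2.
psi = np.sqrt(p)

# Classical marginal on x1: information about x2 is integrated out.
marginal = p.sum(axis=1)                # [0.5, 0.5]

# Reduced density on subsystem 1: rho = Tr_2 |psi><psi|.
rho = np.einsum('xb,yb->xy', psi, psi)

print(marginal)  # [0.5 0.5]
print(rho)       # diagonal equals the marginal; the nonzero off-diagonal
                 # entries (0.4 here) carry information about subsystem 2

Both objects agree on the diagonal, but only the reduced density keeps the off-diagonal terms, which is the sense in which it carries information about the complementary subsystem.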

Citations

At the Interface of Algebra and Statistics
TLDR
This thesis takes inspiration from quantum physics to investigate mathematical structure that lies at the interface of algebra and statistics, and discusses a preliminary framework for modeling entailment and concept hierarchy in natural language by representing expressions in the language as densities.
Probabilistic Modeling with Matrix Product States
TLDR
An efficient training algorithm is proposed for a subset of classically simulable quantum circuit models; presented as a sequence of exactly solvable effective models, it is a modification of the density matrix renormalization group procedure adapted for learning a probability distribution.
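
To make the Born-machine setup concrete, here is a minimal Python sketch under assumed conventions (random tensors and hypothetical shapes, not the cited paper's code): an MPS assigns a sequence x the probability p(x) = |psi(x)|^2 / Z, where psi(x) is a product of matrices selected by the symbols of x and the normalization Z is computed exactly by contraction.

import numpy as np

rng = np.random.default_rng(0)
d, D, n = 2, 4, 5                    # local dim, bond dim, sequence length
shapes = [(1, d, D)] + [(D, d, D)] * (n - 2) + [(D, d, 1)]
mps = [rng.normal(size=s) for s in shapes]

def amplitude(x):
    """psi(x): multiply out the matrices selected by each symbol of x."""
    M = mps[0][:, x[0], :]
    for k in range(1, n):
        M = M @ mps[k][:, x[k], :]
    return M[0, 0]

def partition_function():
    """Z = sum_x psi(x)^2, contracted site by site (transfer matrices)."""
    T = np.ones((1, 1))
    for A in mps:
        T = np.einsum('lm,lxr,mxs->rs', T, A, A)
    return T[0, 0]

Z = partition_function()
x = [0, 1, 1, 0, 1]
print(amplitude(x) ** 2 / Z)         # the model's probability for x
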
Classical versus quantum models in machine learning: insights from a finance application
TLDR
A comparison of the widely used classical ML models known as restricted Boltzmann machines (RBMs) against a recently proposed quantum model, now known as quantum circuit Born machines (QCBMs), finds that the quantum models appear to outperform canonically trained RBMs on typical instances.
SchrödingeRNN: Generative Modeling of Raw Audio as a Continuously Observed Quantum State
TLDR
This work introduces SchrödingeRNN, a quantum-inspired generative model for raw audio that takes the form of a stochastic Schrödinger equation describing the continuous-time measurement of a quantum system, and is equivalent to the continuous matrix product state (cMPS) representation of wavefunctions in one-dimensional many-body systems.
Machine learning for quantum matter
ABSTRACT
Quantum matter, the research field studying phases of matter whose properties are intrinsically quantum mechanical, draws from areas as diverse as hard condensed matter physics, materials science…
Grammar-Aware Question-Answering on Quantum Computers
TLDR
This work performs the first implementation of an NLP task on noisy intermediate-scale quantum (NISQ) hardware; it encodes word meanings in quantum states and explicitly accounts for grammatical structure, which even in mainstream NLP is not commonplace, by faithfully hard-wiring it as entangling operations.
Enhancing Combinatorial Optimization with Quantum Generative Models
TLDR
This work introduces a new family of quantum-enhanced optimizers and demonstrates how quantum machine learning models known as quantum generative models can find lower minima than stand-alone state-of-the-art classical solvers.
A Multi-Scale Tensor Network Architecture for Classification and Regression
TLDR
An algorithm for supervised learning using tensor networks is presented, preprocessing the data by coarse-graining through a sequence of wavelet transformations; the optimized MPS model can then be adaptively fine-grained backwards through the layers with essentially no loss in performance.
The ITensor Software Library for Tensor Network Calculations
TLDR
The philosophy behind ITensor, a system for programming tensor network calculations with an interface modeled on tensor diagram notation, is discussed, along with examples of each part of the interface: Index objects, the ITensor product operator, tensor factorizations, tensor storage types, algorithms for matrix product state (MPS) and matrix product operator (MPO) tensor networks, and the NDTensors library.
Yao.jl: Extensible, Efficient Framework for Quantum Algorithm Design
TLDR
Yao, an extensible, efficient, open-source framework for quantum algorithm design, is introduced; it achieves state-of-the-art performance in simulating small- to intermediate-sized quantum circuits relevant to near-term applications.
…

References

Probabilistic Modeling with Matrix Product States
TLDR
An efficient training algorithm is proposed for a subset of classically simulable quantum circuit models; presented as a sequence of exactly solvable effective models, it is a modification of the density matrix renormalization group procedure adapted for learning a probability distribution.
Matrix product operators for sequence-to-sequence learning
TLDR
A machine learning model is constructed in which matrix product operators are trained to implement sequence-to-sequence prediction, i.e. given the sequence at one time step, the model predicts the sequence at the next time step.
Machine learning by unitary tensor network of hierarchical tree structure
TLDR
This work trains two-dimensional hierarchical TNs to solve image recognition problems, using a training algorithm derived from the multi-scale entanglement renormalization ansatz, and introduces mathematical connections among quantum many-body physics, quantum information theory, and machine learning.
Unsupervised Generative Modeling Using Matrix Product States
TLDR
This work proposes a generative model based on matrix product states, a tensor network originally proposed for describing (particularly one-dimensional) entangled quantum states; the model enjoys efficient learning analogous to the density matrix renormalization group method.
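
One appealing feature of such MPS generative models is direct sampling. Below is a minimal Python sketch of that technique in general (sampling each site from its exact conditional distribution using precomputed right environments); the tensors and dimensions are made up, and this illustrates the standard approach rather than the paper's implementation.

import numpy as np

rng = np.random.default_rng(0)
d, D, n = 2, 3, 4                    # local dim, bond dim, number of sites
shapes = [(1, d, D)] + [(D, d, D)] * (n - 2) + [(D, d, 1)]
mps = [rng.normal(size=s) for s in shapes]

# Right environments: E[k] sums |psi|^2 over sites k..n-1.
E = [None] * (n + 1)
E[n] = np.ones((1, 1))
for k in range(n - 1, -1, -1):
    A = mps[k]
    E[k] = np.einsum('lxr,rs,mxs->lm', A, E[k + 1], A)

def sample():
    x, L = [], np.ones((1, 1))       # L: left environment of chosen symbols
    for k in range(n):
        A = mps[k]
        # unnormalized conditional probabilities of each symbol at site k
        probs = np.einsum('lm,lxr,mxs,rs->x', L, A, A, E[k + 1])
        probs = np.clip(probs, 0, None)   # guard against float round-off
        probs /= probs.sum()
        xk = rng.choice(d, p=probs)
        x.append(xk)
        # absorb the chosen symbol into the left environment
        L = np.einsum('lm,lr,ms->rs', L, A[:, xk, :], A[:, xk, :])
    return x

print(sample())                      # one sequence drawn from p = |psi|^2/Z
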
Shortcut Matrix Product States and its applications
TLDR
This work introduces Shortcut Matrix Product States (SMPS), develops efficient training methods for them, establishes some of their mathematical properties, and shows how to find good locations to add shortcuts, which can significantly decrease the correlation length of the MPS while preserving computational efficiency.
Tree Tensor Networks for Generative Modeling
TLDR
It is shown that the TTN is superior to MPSs for generative modeling, both at capturing the correlations of pixels in natural images and at achieving better log-likelihood scores on standard datasets of handwritten digits.
Equivalence of restricted Boltzmann machines and tensor network states
TLDR
This work builds a bridge between restricted Boltzmann machines (RBMs) and the tensor network states (TNS) widely used in quantum many-body physics research, and devises efficient algorithms to translate an RBM into commonly used TNS.
Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning
TLDR
This work provides a rigorous analysis of the expressive power of various tensor-network factorizations of discrete multivariate probability distributions, and introduces locally purified states (LPS), a new factorization inspired by techniques for the simulation of quantum systems, with provably better expressive power than all other representations considered.
…