Corpus ID: 208617436

Expressiveness and Learning of Hidden Quantum Markov Models

@article{Adhikary2020ExpressivenessAL,
  title={Expressiveness and Learning of Hidden Quantum Markov Models},
  author={Sandesh Adhikary and Siddarth Srinivasan and Geoffrey J. Gordon and Byron Boots},
  journal={arXiv preprint arXiv:1912.02098},
  year={2020}
}
Extending classical probabilistic reasoning using the quantum mechanical view of probability has been of recent interest, particularly in the development of hidden quantum Markov models (HQMMs) to model stochastic processes. However, there has been little progress in characterizing the expressiveness of such models and learning them from data. We tackle these problems by showing that HQMMs are a special subclass of the general class of observable operator models (OOMs) that do not suffer from… 
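As context for the claim above, a toy sketch of how an OOM assigns sequence probabilities via products of observable operators (all names and parameters here are illustrative, not from the paper):

```python
import numpy as np

# Toy observable operator model (OOM): a sequence probability is a product
# of per-observation operator matrices,
#   P(o_1, ..., o_T) = sigma @ A[o_T] @ ... @ A[o_1] @ omega,
# where omega is the initial state vector and sigma an evaluation functional.

rng = np.random.default_rng(0)
n_states, n_obs = 3, 2

# Build the operators from a random HMM so all probabilities are valid:
# A[o] = T @ diag(O[:, o]), with T column-stochastic transitions and
# O[s, o] the probability of emitting o from state s.
T = rng.random((n_states, n_states))
T /= T.sum(axis=0, keepdims=True)          # columns sum to 1
O = rng.random((n_states, n_obs))
O /= O.sum(axis=1, keepdims=True)          # rows sum to 1

A = [T @ np.diag(O[:, o]) for o in range(n_obs)]
omega = np.full(n_states, 1.0 / n_states)  # uniform initial state
sigma = np.ones(n_states)                  # summing functional

def seq_prob(obs):
    v = omega.copy()
    for o in obs:
        v = A[o] @ v
    return sigma @ v

# Probabilities over all length-2 observation sequences sum to 1.
total = sum(seq_prob([a, b]) for a in range(n_obs) for b in range(n_obs))
```

Because the operators here come from a valid HMM, every sequence probability is nonnegative; general OOMs drop that guarantee.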

Citations

Learning Circular Hidden Quantum Markov Models: A Tensor Network Approach
It is shown that c-HQMMs are equivalent to a constrained tensor network model (more precisely, a circular Local Purified State with positive-semidefinite decomposition), which enables an efficient learning algorithm for c-HQMMs.
A quantum learning approach based on Hidden Markov Models for failure scenarios generation
This paper studies and compares the results of HQMMs and classical hidden Markov models (HMMs) on real datasets generated from small real systems in the PSA, and gives a strategy to identify the probable and non-probable failure scenarios of a system.
Quantum Tensor Networks, Stochastic Processes, and Weighted Automata
This work shows how stationary or uniform versions of popular quantum tensor network models have equivalent representations in the stochastic processes and weighted automata literature, in the limit of infinitely long sequences.
Memory compression and thermal efficiency of quantum implementations of non-deterministic hidden Markov models
This work provides a systematic prescription for constructing quantum implementations of non-deterministic HMMs that re-establish the quantum advantages against this broader class. It shows that whenever the classical implementation suffers from thermal dissipation due to its need to process information in a time-local manner, the quantum implementations both mitigate some of this dissipation and achieve an advantage in memory compression.

References

SHOWING 1-10 OF 44 REFERENCES
Learning Quantum Graphical Models using Constrained Gradient Descent on the Stiefel Manifold
This paper casts learning as a constrained optimization problem on the Stiefel manifold, solved with a well-known retraction-based algorithm, and demonstrates that this approach is not only faster and yields better solutions on several datasets, but also scales to larger models that were prohibitively slow to train via the earlier method.
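One common retraction-based scheme of the kind described can be sketched with a QR retraction (an assumption for illustration; the paper's exact retraction and loss are not reproduced here):

```python
import numpy as np

# Sketch of one gradient step on the Stiefel manifold
# St(n, p) = {W : W^T W = I_p}, the constraint set that arises when
# learning Kraus-operator parameterizations of quantum graphical models.

def project_tangent(W, G):
    """Project a Euclidean gradient G onto the tangent space at W."""
    WtG = W.T @ G
    sym = 0.5 * (WtG + WtG.T)
    return G - W @ sym

def qr_retraction(X):
    """Map an arbitrary matrix back onto the Stiefel manifold via QR."""
    Q, R = np.linalg.qr(X)
    # Fix column signs so the retraction is continuous in X.
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def stiefel_step(W, grad, lr=0.1):
    xi = project_tangent(W, grad)
    return qr_retraction(W - lr * xi)

rng = np.random.default_rng(1)
W = qr_retraction(rng.standard_normal((5, 2)))   # start on the manifold
G = rng.standard_normal((5, 2))                  # some loss gradient
W_new = stiefel_step(W, G)
```

The projection keeps the update in the tangent space, and the QR step maps the result back onto the manifold, so the orthonormality constraint W^T W = I holds exactly after every step.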
Learning Hidden Quantum Markov Models
This work presents a learning algorithm to estimate the parameters of an HQMM from data and shows that, on HQMM-generated data, the algorithm learns HQMMs with the same number of hidden states and predictive accuracy as the true HQMMs, while HMMs learned with the Baum-Welch algorithm require more states to match the predictive accuracy.
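For reference, the Baum-Welch baseline mentioned above scores sequences through HMM likelihoods computed with the forward algorithm; a minimal sketch with toy parameters (not the paper's models):

```python
import numpy as np

# Minimal forward-algorithm sketch for an HMM likelihood.
# T[i, j] = P(next state = i | current = j); E[s, o] = P(obs = o | state s).
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])           # column-stochastic transitions
E = np.array([[0.7, 0.3],
              [0.1, 0.9]])           # emission distributions per state (rows)
pi = np.array([0.5, 0.5])            # initial state distribution

def likelihood(obs):
    alpha = pi * E[:, obs[0]]        # initialize with the first emission
    for o in obs[1:]:
        alpha = E[:, o] * (T @ alpha)  # propagate, then weight by emission
    return alpha.sum()

# Likelihoods over all length-2 observation sequences sum to 1.
total = sum(likelihood([a, b]) for a in (0, 1) for b in (0, 1))
```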
Learning and Inference in Hilbert Space with Quantum Graphical Models
Experimental results are presented showing that HSE-HQMMs are competitive with state-of-the-art models like LSTMs and PSRNNs on several datasets, while also providing a nonparametric method for maintaining a probability distribution over continuous-valued features.
Hidden Quantum Markov Models and Open Quantum Systems with Instantaneous Feedback
Hidden Markov Models are widely used in classical computer science to model stochastic processes with a wide range of applications. This paper concerns the quantum analogues of these machines…
Hidden Quantum Markov Models and non-adaptive read-out of many-body states
Stochastic finite-state generators are compressed descriptions of infinite time series. Alternatively, compressed descriptions are given by quantum finite-state generators [K. Wiesner and J. P.
A Probabilistic Graphical Model of Quantum Systems
  • C. Yeang
  • Computer Science
    2010 Ninth International Conference on Machine Learning and Applications
  • 2010
This work proposes algorithms for three machine learning tasks in quantum probabilistic graphical models: a belief propagation algorithm for inference of unknown states, an iterative algorithm for simultaneous estimation of parameter values and hidden states, and an active learning algorithm to select measurement operators based on observed evidence.
An introduction to quantum machine learning
This contribution gives a systematic overview of the emerging field of quantum machine learning and presents the approaches as well as technical details in an accessible way, and discusses the potential of a future theory of quantum learning.
Norm-Observable Operator Models
A novel variant of OOMs, called norm-observable operator models (NOOMs), is proposed, which avoids the negative probability problem (NPP) by design; it is proved that NOOMs capture all Markov chain (MC) describable processes.
On Duality between Quantum Maps and Quantum States
The concept of the dynamical matrix and the Jamiołkowski isomorphism are explored and an analogous relation is established between the classical maps and an extended space of the discrete probability distributions.
Tensor networks and graphical calculus for open quantum systems
A graphical calculus for completely positive maps is described and the theory of open quantum systems and other fundamental primitives of quantum information theory using the language of tensor networks is reviewed.