• Corpus ID: 201070845

Quantum Expectation-Maximization for Gaussian Mixture Models

@article{Kerenidis2020QuantumEF,
  title={Quantum Expectation-Maximization for Gaussian Mixture Models},
  author={Iordanis Kerenidis and Alessandro Luongo and Anupam Prakash},
  journal={ArXiv},
  year={2020},
  volume={abs/1908.06657}
}
The Expectation-Maximization (EM) algorithm is a fundamental tool in unsupervised machine learning. It is often used as an efficient way to solve Maximum Likelihood (ML) estimation problems, especially for models with latent variables. It is also the algorithm of choice to fit mixture models: generative models that represent unlabelled points originating from $k$ different processes, as samples from $k$ multivariate distributions. In this work we define and use a quantum version of EM to fit a… 
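
For readers unfamiliar with the classical algorithm being quantized, the sketch below shows a minimal EM loop for a Gaussian mixture model: the E-step computes soft cluster responsibilities, and the M-step re-estimates mixing weights, means, and covariances from them. This is an illustrative classical baseline only, not the paper's quantum algorithm, and all names (em_gmm, X, k) are ours.

import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, k, n_iter=100, seed=0):
    # Illustrative classical EM for a Gaussian mixture model
    # (sketch of the baseline, not the paper's quantum algorithm).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(k, 1.0 / k)                      # mixing weights
    mu = X[rng.choice(n, size=k, replace=False)]  # means seeded from data points
    sigma = np.array([np.cov(X, rowvar=False) + 1e-6 * np.eye(d) for _ in range(k)])
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | point i).
        r = np.column_stack([pi[j] * multivariate_normal.pdf(X, mu[j], sigma[j])
                             for j in range(k)])
        r /= r.sum(axis=1, keepdims=True)
        # M-step: maximum-likelihood re-estimates under the soft assignments.
        nj = r.sum(axis=0)
        pi = nj / n
        mu = (r.T @ X) / nj[:, None]
        for j in range(k):
            diff = X - mu[j]
            sigma[j] = (r[:, j, None] * diff).T @ diff / nj[j] + 1e-6 * np.eye(d)
    return pi, mu, sigma

With full covariances, each classical iteration costs on the order of N * k * d^2 operations for the density evaluations alone; this per-iteration cost is the kind of quantity the quantum version targets.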


Quantum machine learning with subspace states
TLDR
A new approach to quantum linear algebra based on quantum subspace states is introduced, and three new quantum machine learning algorithms are presented that exponentially reduce the depth of circuits used in quantum topological data analysis from O(n) to O(log n).
Quantum Perceptron Revisited: Computational-Statistical Tradeoffs
TLDR
This paper introduces a hybrid quantum-classical perceptron algorithm with lower complexity and better generalization ability than the classical perceptron, and derives a bound on the expected error of the hypothesis returned by the algorithm.
Large-sample properties of unsupervised estimation of the linear discriminant using projection pursuit
We study the estimation of the linear discriminant with projection pursuit, a method that is unsupervised in the sense that it does not use the class labels in the estimation. Our viewpoint is…
Prospects and challenges of quantum finance
TLDR
Three potential applications of quantum computing to finance are described, starting with the state-of-the-art and focusing in particular on recent works by the QC Ware team, which consider quantum speedups for Monte Carlo methods, portfolio optimization, and machine learning.
GenQu: A Hybrid System for Learning Classical Data in Quantum States
  • Computer Science
  • 2020
TLDR
GenQu, a hybrid and general-purpose quantum framework for learning classical data through quantum states, is proposed; compared with classical solutions, the proposed models running on the GenQu framework are demonstrated to achieve similar accuracy with a much smaller number of qubits while significantly reducing the parameter size.
The prospects of quantum computing in computational molecular biology
TLDR
The aim of this review is to introduce the promise and limitations of emerging quantum computing technologies in the areas of computational molecular biology and bioinformatics.
Using Mixture of Normal Distributions to Detect Treatment Effects when the Frequentist Method Fails
TLDR
The use of MLM adds value because it can be used to understand the disease experience or the value of treatment when traditional statistical methods cannot.
Quantum computing at the quantum advantage threshold: a down-to-business review
A. K. Fedorov, N. Gisin, S. M. Beloussov, and A. I. Lvovsky. Schaffhausen Institute of Technology, Schaffhausen 8200, Switzerland; Russian Quantum Center, Skolkovo, Moscow 143025, Russia; National …

References

SHOWING 1-10 OF 77 REFERENCES
Quantum Expectation-Maximization for Gaussian Mixture Models
  • Computer Science
  • 2019
TLDR
This work defines and uses a quantum version of EM to fit a Gaussian Mixture Model, and discusses the performance of the algorithm on datasets that such algorithms are expected to classify successfully, arguing that in those cases strong guarantees can be given on the runtime.
Settling the Polynomial Learnability of Mixtures of Gaussians
  • Ankur Moitra, G. Valiant
  • Computer Science
    2010 IEEE 51st Annual Symposium on Foundations of Computer Science
  • 2010
TLDR
This paper gives the first polynomial-time algorithm for proper density estimation for mixtures of k Gaussians that needs no assumptions on the mixture, and proves that the exponential dependence on the number of Gaussians is necessary.
Simple Methods for Initializing the EM Algorithm for Gaussian Mixture Models
TLDR
New initialization methods based on the well-known K-means++ algorithm and the Gonzalez algorithm are presented; they close the gap between simple uniform initialization techniques and complex methods that have been specifically designed for Gaussian mixture models and depend on the right choice of hyperparameters. A sketch of the seeding idea follows below.
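
To make the seeding concrete, here is a minimal sketch of K-means++ seeding, whose output can serve as the initial means of a Gaussian mixture. This is our illustrative reconstruction, not code from the cited paper, and the names (kmeans_pp_seeds, X, k) are ours.

import numpy as np

def kmeans_pp_seeds(X, k, seed=0):
    # K-means++ seeding (illustrative): pick each new center with probability
    # proportional to its squared distance from the nearest existing center.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    centers = [X[rng.integers(n)]]  # first center chosen uniformly at random
    for _ in range(k - 1):
        # Squared distance from every point to its nearest chosen center.
        d2 = np.min([np.sum((X - c) ** 2, axis=1) for c in centers], axis=0)
        probs = d2 / d2.sum()
        centers.append(X[rng.choice(n, p=probs)])
    return np.array(centers)

The returned seeds can be used directly as initial means (with, e.g., uniform weights and the data covariance for each component), which is the spirit of the K-means++-based GMM initialization described above.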
Disentangling Gaussians
TLDR
The conclusion is that the statistical and computational complexity of this general problem are polynomial in every respect except for the dependence on the number of Gaussians, which is necessarily exponential.
A gentle tutorial of the em algorithm and its application to parameter estimation for Gaussian mixture and hidden Markov models
TLDR
The abstract form of the EM algorithm as it is often given in the literature is described, and the EM parameter estimation procedure is developed for two applications: 1) finding the parameters of a mixture of Gaussian densities, and 2) finding a hidden Markov model (HMM) for both discrete and Gaussian-mixture observation models.
q-means: A quantum algorithm for unsupervised machine learning
TLDR
The q-means algorithm is introduced, a new quantum algorithm for clustering, a canonical problem in unsupervised machine learning; it provides substantial savings compared to the classical $k$-means algorithm, which runs in time $O(kdN)$ per iteration, particularly for the case of large datasets.
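
For context on the $O(kdN)$-per-iteration figure, below is a minimal sketch of one classical Lloyd iteration of k-means. It shows the baseline being compared against, not the q-means algorithm itself, and all names (kmeans_step, X, centroids) are ours.

import numpy as np

def kmeans_step(X, centroids):
    # One Lloyd iteration of classical k-means: the assignment pass touches
    # all N points, k centroids, and d coordinates, i.e. O(k*d*N) work.
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)  # (N, k)
    labels = d2.argmin(axis=1)
    # Recompute each centroid as the mean of its assigned points
    # (keeping the old centroid if a cluster ends up empty).
    new_centroids = np.array([
        X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
        for j in range(centroids.shape[0])
    ])
    return labels, new_centroids

The distance computation dominates the iteration cost, which is exactly the $O(kdN)$ term the quantum algorithm aims to improve.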
Quantum Machine Learning
TLDR
This review focuses on the quantum nearest-centroid algorithm for supervised classification, presented in [11], which helps to overcome the algorithm's main bottleneck: calculating the distances between vectors in high-dimensional space.
A quantum-inspired classical algorithm for recommendation systems
  • Ewin Tang
  • Computer Science
    Electron. Colloquium Comput. Complex.
  • 2018
TLDR
A classical analogue to Kerenidis and Prakash's quantum recommendation system, previously believed to be one of the strongest candidates for a provably exponential speedup in quantum machine learning, is given; it produces recommendations exponentially faster than previous classical systems, which run in time linear in m and n.
Parameter Estimation in Gaussian Mixture Models with Malicious Noise, without Balanced Mixing Coefficients
  • J. Xu, Jakub Marecek
  • Computer Science
    2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • 2018
TLDR
A robust algorithm is presented to estimate the parameters of a noisy 2-Gaussian mixture model without balanced weights, where the noise is of arbitrary distribution; it outperforms the vanilla Expectation-Maximisation (EM) algorithm by orders of magnitude in terms of estimation error.