Tangible reduction in learning sample complexity with large classical samples and small quantum system

@article{Song2021TangibleRI,
  title={Tangible reduction in learning sample complexity with large classical samples and small quantum system},
  author={Wooyeong Song and Marcin Wieśniak and Nana Liu and Marcin Pawłowski and Jin-Hyoung Lee and Jaewan Kim and Jeongho Bang},
  journal={Quantum Inf. Process.},
  year={2021},
  volume={20},
  pages={1-18}
}
Quantum computation requires large classical datasets to be embedded into quantum states in order to exploit quantum parallelism. However, this embedding generally requires considerable resources, so it would be desirable to avoid it, where possible, in noisy intermediate-scale quantum (NISQ) implementations. Accordingly, we consider a classical-quantum hybrid architecture, which allows large classical input data, with a relatively small-scale …
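The embedding cost the abstract alludes to can be made concrete with amplitude encoding, the standard way of loading a classical vector into a quantum state. The sketch below is illustrative only and not from the paper; the function name `amplitude_encode` is a hypothetical helper. It shows the trade-off: a length-2^n vector fits into just n qubits, but preparing such a state on hardware generally requires circuits whose size scales with 2^n.

```python
import numpy as np

def amplitude_encode(x):
    """Illustrative sketch: embed a classical vector into the amplitudes
    of an n-qubit state.

    A length-2^n vector is stored in only n qubits, but preparing the
    resulting state on a quantum device generally needs a circuit whose
    size grows with 2^n -- the embedding overhead the abstract refers to.
    """
    x = np.asarray(x, dtype=float)
    dim = len(x)
    n_qubits = int(np.log2(dim))
    if 2 ** n_qubits != dim:
        raise ValueError("input length must be a power of two")
    norm = np.linalg.norm(x)
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    # Quantum states must have unit norm, so the data is rescaled.
    return x / norm, n_qubits

# Four classical numbers fit in the amplitudes of a two-qubit state.
state, n = amplitude_encode([3.0, 4.0, 0.0, 0.0])
```

The hybrid architectures surveyed below sidestep exactly this step by keeping the large input channels classical.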

References

Showing 1-10 of 39 references

Experimental demonstration of quantum learning speedup with classical input data

TLDR
This work considers quantum-classical hybrid machine learning in which large-scale input channels remain classical and small-scale working channels perform quantum operations conditioned on the classical input data, in contrast to recently developed approaches to quantum machine learning.

The theory of variational hybrid quantum-classical algorithms

TLDR
This work develops a variational adiabatic ansatz, explores unitary coupled cluster, and introduces quantum variational error suppression, which allows some errors to be suppressed naturally on a pre-threshold quantum device; it also shows how modern derivative-free optimization techniques can offer computational savings of up to three orders of magnitude over previously used optimization techniques.

Supervised learning with quantum-enhanced feature spaces

TLDR
Two classification algorithms that use the quantum state space to produce feature maps are demonstrated on a superconducting processor, enabling the solution of problems when the feature space is large and the kernel functions are computationally expensive to estimate.

Training of quantum circuits on a hybrid quantum computer

TLDR
This study trains generative-modeling circuits on a hybrid quantum computer, demonstrating an optimization strategy and a resource trade-off, and shows that convergence of the quantum circuit to the target distribution depends critically on both the quantum hardware and the classical optimization strategy.

Quantum learning robust against noise

TLDR
This work considers the problem of learning the class of $n$-bit parity functions by making queries to a quantum example oracle, and shows that in the presence of noise the classical learning problem is believed to be intractable, while the quantum version remains efficient.

Quantum machine learning: a classical perspective

TLDR
The literature in quantum ML is reviewed and perspectives for a mixed readership of classical ML and quantum computation experts are discussed, with particular emphasis on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems.

Small quantum computers and large classical data sets

TLDR
This work introduces hybrid classical-quantum algorithms for problems involving a large classical data set X and a space of models Y, where a quantum computer has superposition access to Y but not to X; these algorithms yield quantum speedups for maximum-likelihood estimation, Bayesian inference, and saddle-point optimization.

Quantum supremacy using a programmable superconducting processor

TLDR
Quantum supremacy is demonstrated using a programmable superconducting processor known as Sycamore, taking approximately 200 seconds to sample one instance of a quantum circuit a million times, which would take a state-of-the-art supercomputer around ten thousand years to compute.

A quantum speedup in machine learning: finding an N-bit Boolean function for a classification

TLDR
It is shown that quantum superposition enables quantum learning that is faster than classical learning by expanding the approximate solution regions, i.e., the acceptable regions.