• Corpus ID: 231718907

Quantum machine learning models are kernel methods

@inproceedings{Schuld2021QuantumML,
  title={Quantum machine learning models are kernel methods},
  author={Maria Schuld},
  year={2021}
}
  • M. Schuld
  • Published 26 January 2021
  • Computer Science, Physics
With near-term quantum devices available and the race for fault-tolerant quantum computers in full swing, researchers became interested in the question of what happens if we replace a machine learning model with a quantum circuit. While such “quantum models” are sometimes called “quantum neural networks”, it has been repeatedly noted that their mathematical structure is actually much more closely related to kernel methods: they analyse data in high-dimensional Hilbert spaces to which we only… 
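
To make the kernel-method reading concrete, the following is a minimal sketch of how such a quantum model collapses to a classical kernel machine: a data-encoding circuit prepares a feature state, pairwise state fidelities form a kernel matrix, and a standard support vector machine is trained on that matrix. The product RY angle encoding, the toy dataset, and all names below are illustrative assumptions rather than the paper's construction.

    import numpy as np
    from sklearn.svm import SVC

    def feature_state(x):
        """Angle-encode a data vector into the product state |phi(x)> = RY(x_1)|0> x ... x RY(x_n)|0>.
        This is just one illustrative feature map; any data-encoding circuit defines a kernel the same way."""
        state = np.array([1.0])
        for xi in x:
            qubit = np.array([np.cos(xi / 2), np.sin(xi / 2)])  # RY(x_i)|0>
            state = np.kron(state, qubit)
        return state

    def quantum_kernel(X1, X2):
        """Kernel matrix K[i, j] = |<phi(x'_j)|phi(x_i)>|^2 (state fidelity)."""
        S1 = np.array([feature_state(x) for x in X1])
        S2 = np.array([feature_state(x) for x in X2])
        return np.abs(S1 @ S2.T) ** 2

    # Toy data: two noisy classes in 2D (purely illustrative).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.5, 0.2, (20, 2)), rng.normal(2.0, 0.2, (20, 2))])
    y = np.array([0] * 20 + [1] * 20)

    # The "quantum model" reduces to a classical kernel machine over the quantum kernel.
    svm = SVC(kernel="precomputed").fit(quantum_kernel(X, X), y)
    print("train accuracy:", svm.score(quantum_kernel(X, X), y))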

Citations

Quantum machine learning beyond kernel methods
TLDR
This work identifies the first unifying framework that captures all standard models based on parametrized quantum circuits, that of linear quantum models, and shows how data re-uploading circuits, a generalization of linear models, can be efficiently mapped into equivalent linear quantum models.
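
For orientation, the two standard forms of a linear quantum model referred to above can be written as follows (notation paraphrased from this literature: rho(x) is the data-encoding state, M_theta a trainable observable, {x_m} the training inputs, alpha_m real coefficients):

    % Linear quantum models (notation paraphrased from this literature).
    \[
      f_\theta(x) = \mathrm{Tr}\!\left[\rho(x)\, M_\theta\right]
      \qquad \text{(explicit model, trainable observable } M_\theta\text{)},
    \]
    \[
      f_\alpha(x) = \sum_{m=1}^{M} \alpha_m\, k(x, x_m),
      \qquad
      k(x, x') = \mathrm{Tr}\!\left[\rho(x)\,\rho(x')\right]
      \qquad \text{(implicit / kernel model)}.
    \]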
Structural risk minimization for quantum linear classifiers
TLDR
This paper investigates capacity measures of two closely related QML models called explicit and implicit quantum linear classifiers with the objective of identifying new ways to implement structural risk minimization.
Universal Approximation Property of Quantum Machine Learning Models in Quantum-Enhanced Feature Spaces.
TLDR
This work proves that the machine learning models induced from the quantum-enhanced feature space are universal approximators of continuous functions under typical quantum feature maps, and enables an important theoretical analysis to ensure that machine learning algorithms based on quantum feature maps can handle a broad class of machine learning tasks.
Bandwidth Enables Generalization in Quantum Kernel Models
TLDR
Evidence is provided that quantum machine learning methods can generalize well on challenging datasets, including those far outside of the theoretical assumptions.
Towards understanding the power of quantum kernels in the NISQ era
TLDR
This work proves that the advantage of quantum kernels vanishes for large dataset sizes, few measurements, and large system noise, and provides theoretical guidance for exploring advanced quantum kernels to attain quantum advantages on NISQ devices.
Large-scale quantum machine learning
TLDR
This work measures quantum kernels using randomized measurements to gain a quadratic speedup in computation time and quickly process large datasets, and efficiently encodes high-dimensional data into quantum computers with the number of features scaling linearly in the circuit depth.
The Inductive Bias of Quantum Kernels
TLDR
It is conjectured that quantum machine learning models can offer speed-ups only if they manage to encode knowledge about the problem at hand into quantum circuits, while encoding the same bias into a classical model would be hard.
Importance of Kernel Bandwidth in Quantum Machine Learning
TLDR
This work identifies the hyperparameter controlling the bandwidth of a quantum kernel and shows that it controls the expressivity of the resulting model, and indicates that optimizing the bandwidth can help mitigate the exponential decay of kernel values with qubit count.
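
A toy numerical sketch of the bandwidth effect follows, assuming a product angle-encoding kernel k(x, x') = prod_i cos^2(c (x_i - x'_i) / 2); the input scaling c used here is one of the bandwidth hyperparameters discussed in this line of work, and the numbers are illustrative only.

    import numpy as np

    rng = np.random.default_rng(1)

    def product_kernel(x, xp, c):
        """Fidelity kernel of a product RY feature map with bandwidth scaling c:
        k(x, x') = prod_i cos^2(c * (x_i - x'_i) / 2)."""
        return np.prod(np.cos(c * (x - xp) / 2) ** 2)

    # Typical off-diagonal kernel value vs. qubit count, for two bandwidth choices:
    # c = 1 decays exponentially with n, c = 1/sqrt(n) stays of order one.
    for n in (2, 8, 32):
        X = rng.uniform(-np.pi, np.pi, (200, n))
        pairs = [(X[i], X[i + 1]) for i in range(0, 200, 2)]
        for c in (1.0, 1.0 / np.sqrt(n)):
            mean_k = np.mean([product_kernel(x, xp, c) for x, xp in pairs])
            print(f"n={n:2d}  c={c:.2f}  mean k={mean_k:.4f}")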
Is quantum advantage the right goal for quantum machine learning?
A similar trend is found for usage of the term “neural network”. Some of the earliest papers in quantum machine learning stem from research teams at Google [1] and D-Wave [2].
Optimal quantum kernels for small data classification
TLDR
An algorithm is presented for constructing quantum kernels for support vector machines that adapts quantum gate sequences to the data; the performance of the resulting quantum models for classification problems with a small number of training points significantly exceeds that of optimized classical models with conventional kernels.
...

References

SHOWING 1-10 OF 36 REFERENCES
Quantum classifier with tailored quantum kernel
TLDR
A distance-based quantum classifier is introduced whose kernel is based on the quantum state fidelity between training and test data; the classifier is shown to be equivalent to measuring the expectation value of a Helstrom operator, from which the well-known optimal quantum state discrimination can be derived.
Power of data in quantum machine learning
TLDR
This work shows that some problems that are classically hard to compute can be easily predicted by classical machines learning from data and proposes a projected quantum model that provides a simple and rigorous quantum speed-up for a learning problem in the fault-tolerant regime.
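
For orientation, the projected quantum kernel proposed in this work can be written (up to notation and conventions, which may differ from the paper's exact definition) as a Gaussian kernel over the one-qubit reduced density matrices rho_k(x) obtained after encoding x, with a hyperparameter gamma > 0:

    % Projected quantum kernel, restated up to notation.
    \[
      k^{\mathrm{PQ}}(x_i, x_j)
        = \exp\!\Big( -\gamma \sum_{k} \mathrm{Tr}\!\big[ \big( \rho_k(x_i) - \rho_k(x_j) \big)^{2} \big] \Big).
    \]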
Quantum embeddings for machine learning
TLDR
This work proposes to train the first part of the circuit with the objective of maximally separating data classes in Hilbert space, a strategy it calls quantum metric learning, which provides a powerful analytic framework for quantum machine learning.
A rigorous and robust quantum speed-up in supervised machine learning
TLDR
A rigorous quantum speed-up for supervised classification is established using a quantum learning algorithm that only requires classical access to data, achieves high accuracy, and is robust against additive errors in the kernel entries that arise from finite sampling statistics.
Quantum Machine Learning in Feature Hilbert Spaces.
TLDR
This Letter interprets the process of encoding inputs in a quantum state as a nonlinear feature map that maps data to quantum Hilbert space and shows how it opens up a new avenue for the design of quantum machine learning algorithms.
Effect of data encoding on the expressive power of variational quantum-machine-learning models
TLDR
It is shown that there exist quantum models which can realise all possible sets of Fourier coefficients, and therefore, if the accessible frequency spectrum is asymptotically rich enough, such models are universal function approximators.
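
The Fourier-series structure can be checked numerically. The sketch below is a toy illustration, not the paper's setup: a single qubit, Pauli-Z data encoding repeated reps times, and random unitaries standing in for the trainable blocks; the discrete Fourier transform of the resulting model is supported only on integer frequencies with absolute value at most reps.

    import numpy as np

    rng = np.random.default_rng(2)
    Z = np.diag([1.0, -1.0])

    def random_unitary():
        """Random 2x2 unitary standing in for a trainable circuit block."""
        a = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
        q, _ = np.linalg.qr(a)
        return q

    def model(x, blocks, reps):
        """f(x) = <0| U(x)^dag Z U(x) |0> with U(x) = W_reps RZ(x) ... W_1 RZ(x) W_0."""
        rz = np.diag([np.exp(-1j * x / 2), np.exp(1j * x / 2)])
        u = blocks[0]
        for l in range(reps):
            u = blocks[l + 1] @ rz @ u
        psi = u @ np.array([1.0, 0.0])
        return np.real(psi.conj() @ Z @ psi)

    reps = 3                                    # number of data re-uploads
    blocks = [random_unitary() for _ in range(reps + 1)]
    xs = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    fvals = np.array([model(x, blocks, reps) for x in xs])

    # The spectrum contains only frequencies |omega| <= reps.
    coeffs = np.fft.fft(fvals) / len(xs)
    freqs = np.rint(np.fft.fftfreq(len(xs), d=2 * np.pi / len(xs)) * 2 * np.pi).astype(int)
    for w, c in zip(freqs, np.abs(coeffs)):
        if c > 1e-10:
            print(f"omega = {int(w):+d}, |c_omega| = {c:.4f}")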
Supervised learning with quantum-enhanced feature spaces
TLDR
Two classification algorithms that use the quantum state space to produce feature maps are demonstrated on a superconducting processor, enabling the solution of problems when the feature space is large and the kernel functions are computationally expensive to estimate.
A generative modeling approach for benchmarking and training shallow quantum circuits
TLDR
A quantum circuit learning algorithm that can be used to assist the characterization of quantum devices and to train shallow circuits for generative tasks is proposed and it is demonstrated that this approach can learn an optimal preparation of the Greenberger-Horne-Zeilinger states.
Barren plateaus in quantum neural network training landscapes
TLDR
It is shown that for a wide class of reasonable parameterized quantum circuits, the probability that the gradient along any reasonable direction is non-zero to some fixed precision is exponentially small as a function of the number of qubits.
Differentiable Learning of Quantum Circuit Born Machine
TLDR
This work devises an efficient gradient-based learning algorithm for the quantum circuit Born machine by minimizing the kernelized maximum mean discrepancy loss, and simulates generative modeling of the Bars-and-Stripes dataset and Gaussian mixture distributions using deep quantum circuits.
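
For context on the training objective, here is a minimal sketch of a squared maximum mean discrepancy (MMD) estimator of the kind used as the Born machine's loss; the Gaussian kernel, sample shapes, and toy check are illustrative assumptions and may differ from the paper's exact choices.

    import numpy as np

    def gaussian_kernel(a, b, sigma=1.0):
        """Gaussian kernel matrix between sample sets a and b (rows are samples)."""
        d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    def mmd_squared(model_samples, data_samples, sigma=1.0):
        """Squared MMD between samples from the model and from the data distribution."""
        kxx = gaussian_kernel(model_samples, model_samples, sigma)
        kyy = gaussian_kernel(data_samples, data_samples, sigma)
        kxy = gaussian_kernel(model_samples, data_samples, sigma)
        return kxx.mean() + kyy.mean() - 2 * kxy.mean()

    # Toy check: samples from the same Gaussian give a small MMD^2,
    # samples from shifted Gaussians give a larger one.
    rng = np.random.default_rng(3)
    x = rng.normal(0.0, 1.0, (500, 1))
    y_same = rng.normal(0.0, 1.0, (500, 1))
    y_shift = rng.normal(2.0, 1.0, (500, 1))
    print("same distribution:   ", round(mmd_squared(x, y_same), 4))
    print("shifted distribution:", round(mmd_squared(x, y_shift), 4))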
...