Corpus ID: 239885740

Quantum machine learning beyond kernel methods

@article{Jerbi2021QuantumML,
  title={Quantum machine learning beyond kernel methods},
  author={Sofi{\`e}ne Jerbi and Lukas J. Fiderer and Hendrik Poulsen Nautrup and Jonas M. K{\"u}bler and Hans J. Briegel and Vedran Dunjko},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.13162}
}
Machine learning algorithms based on parametrized quantum circuits are a prime candidate for near-term applications on noisy quantum computers. Yet, our understanding of how these quantum machine learning models compare, both mutually and to classical models, remains limited. Previous works achieved important steps in this direction by showing a close connection between some of these quantum models and kernel methods, well-studied in classical machine learning. In this work, we identify the… 
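
As a toy illustration of the explicit-versus-implicit comparison the abstract alludes to, the sketch below (my own single-qubit NumPy example, not the authors' code) shows that a model measuring a trainable observable on the encoded state, f(x) = Tr[rho(x) O_theta], and a kernel expansion over training data with k(x, x') = Tr[rho(x) rho(x')] are both linear functionals of the encoded density matrix rho(x).

```python
import numpy as np

def rho(x):
    """Density matrix of the angle-encoded single-qubit state RY(x)|0>."""
    psi = np.array([np.cos(x / 2), np.sin(x / 2)])
    return np.outer(psi, psi)

# Explicit model: measure a (here fixed, in general trainable) observable O_theta.
O_theta = np.array([[0.5, 0.3], [0.3, -0.5]])

def f_explicit(x):
    return np.trace(rho(x) @ O_theta)

# Implicit (kernel) model: expansion over two "training" points with weights alpha.
x_train, alpha = np.array([0.4, 2.0]), np.array([1.0, -1.0])

def kernel(x, xp):
    return np.trace(rho(x) @ rho(xp))

def f_implicit(x):
    return sum(a * kernel(x, xi) for a, xi in zip(alpha, x_train))

# The implicit model corresponds to the effective observable sum_i alpha_i rho(x_i),
# so both model families are linear in the encoded state rho(x).
O_implicit = sum(a * rho(xi) for a, xi in zip(alpha, x_train))
x = 1.3
print(f"explicit model:   {f_explicit(x):.4f}")
print(f"implicit model:   {f_implicit(x):.4f}")
print(f"Tr[rho(x) O_eff]: {np.trace(rho(x) @ O_implicit):.4f}")  # equals the implicit model
```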

Exponential concentration and untrainability in quantum kernel methods

This work shows that, under certain conditions, values of quantum kernels over different input data can be exponentially concentrated towards some value, leading to an exponential scaling of the number of measurements required for successful training.
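
A small numerical sketch of this concentration effect (using Haar-random states as a stand-in for highly expressive data encodings, an illustrative assumption rather than the paper's setting): fidelity-kernel values |<psi|phi>|^2 have mean and spread of order 2^-n, so resolving them requires exponentially many measurement shots.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_state(dim, rng):
    """Sample a Haar-random pure state of the given dimension."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

for n_qubits in (2, 4, 6, 8):
    dim = 2 ** n_qubits
    overlaps = [
        np.abs(np.vdot(haar_state(dim, rng), haar_state(dim, rng))) ** 2
        for _ in range(2000)
    ]
    # Both the mean and the standard deviation of the kernel values shrink as 2^-n.
    print(f"n={n_qubits}: mean kernel value {np.mean(overlaps):.2e} "
          f"(1/2^n = {1 / dim:.2e}), std {np.std(overlaps):.2e}")
```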

Noisy quantum kernel machines

It is shown that decoherence and dissipation can be seen as an implicit regularization for quantum kernel machines, and an upper bound on the model's generalization error involving the average purity of the encoded states is derived.
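
A minimal sketch of the quantity entering that bound (a hypothetical single-qubit angle encoding under a depolarizing channel, chosen purely for illustration): the average purity Tr[rho^2] of the encoded states drops as the noise strength grows.

```python
import numpy as np

def encode(x):
    """Toy angle encoding |psi(x)> = RY(x)|0>."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def depolarize(rho, p):
    """Single-qubit depolarizing channel with noise strength p."""
    return (1 - p) * rho + p * np.eye(2) / 2

xs = np.linspace(0, np.pi, 50)
for p in (0.0, 0.2, 0.5):
    purities = []
    for x in xs:
        psi = encode(x)
        rho = depolarize(np.outer(psi, psi), p)
        purities.append(np.trace(rho @ rho))
    # Lower average purity tightens the purity-dependent generalization bound.
    print(f"p={p}: average purity of the encoded states = {np.mean(purities):.3f}")
```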

Parameterized Quantum Circuits with Quantum Kernels for Machine Learning: A Hybrid Quantum-Classical Approach

It is concluded that quantum kernels with hybrid kernel methods, a.k.a. quantum kernel PQCs, offer distinct advantages as a hybrid approach to QML.

Concentration of Data Encoding in Parameterized Quantum Circuits

It is proved that, under reasonable assumptions, the distance between the average encoded state and the maximally mixed state could be explicitly upper-bounded with respect to the width and depth of the encoding circuit.
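
A toy numerical illustration of this effect (a product data-re-uploading encoding of my own choosing, alternating RY/RX layers with inputs drawn uniformly from [0, pi]; not the paper's general circuit family): as the encoding depth grows, the data-averaged encoded state moves closer to the maximally mixed state in trace distance.

```python
import numpy as np

rng = np.random.default_rng(1)
n_qubits, dim, n_samples = 3, 8, 4000

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def rx(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def layer(angles, gate):
    """Tensor product of single-qubit rotations, one per qubit."""
    u = np.array([[1.0]])
    for a in angles:
        u = np.kron(u, gate(a))
    return u

def encode(x_layers):
    """Alternating RY/RX data-encoding layers applied to |0...0>."""
    state = np.zeros(dim, dtype=complex)
    state[0] = 1.0
    for l, angles in enumerate(x_layers):
        state = layer(angles, ry if l % 2 == 0 else rx) @ state
    return state

def trace_distance(rho, sigma):
    return 0.5 * np.abs(np.linalg.eigvalsh(rho - sigma)).sum()

for depth in (1, 2, 4, 8):
    rho_avg = np.zeros((dim, dim), dtype=complex)
    for _ in range(n_samples):
        x = rng.uniform(0, np.pi, size=(depth, n_qubits))
        rho_avg += np.outer(encode(x), encode(x).conj())
    rho_avg /= n_samples
    print(f"depth={depth}: trace distance to I/d = "
          f"{trace_distance(rho_avg, np.eye(dim) / dim):.3f}")
```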

Generalization despite overfitting in quantum machine learning models

This work derives the behavior of classical interpolating Fourier features models for regression on noisy signals, and shows how a class of quantum models exhibits analogous features, thereby linking the structure of quantum circuits to overparameterization and overfitting in quantum models.

Hyperparameter Importance of Quantum Neural Networks Across Small Datasets

This work applies the functional ANOVA framework to quantum neural networks to analyze which hyperparameters are most influential for their predictive performance, introduces new methodologies to study quantum machine learning models, and provides new insights toward quantum model selection.

Power and limitations of single-qubit native quantum neural networks

It is proved that single-qubit quantum neural networks can approximate any univariate function by mapping the model to a partial Fourier series, and the exact correlations between the parameters of the trainable gates and the working Fourier coefficients are established.
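
A minimal sketch of that mapping (the standard data re-uploading construction with RZ(x) encoding gates and RY trainable gates, assumed here for illustration rather than taken from the paper): with L encoding gates, the model output is a degree-L Fourier series in x, which an FFT over one period makes explicit.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 3  # number of data-encoding layers

def rz(theta):
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

thetas = rng.uniform(0, 2 * np.pi, L + 1)  # trainable angles

def model(x):
    """f(x) = <0| U(x)^dag Z U(x) |0> with U(x) = RY(th_L) RZ(x) ... RZ(x) RY(th_0)."""
    state = np.array([1.0, 0.0], dtype=complex)
    state = ry(thetas[0]) @ state
    for l in range(1, L + 1):
        state = ry(thetas[l]) @ rz(x) @ state
    z = np.diag([1.0, -1.0])
    return (state.conj() @ z @ state).real

N = 64  # grid points over one 2*pi period
xs = 2 * np.pi * np.arange(N) / N
coeffs = np.fft.fft([model(x) for x in xs]) / N
for k, c in zip(np.fft.fftfreq(N, d=1 / N).astype(int), coeffs):
    if abs(c) > 1e-10:
        print(f"frequency {int(k):+d}: |c_k| = {abs(c):.4f}")
# Only frequencies |k| <= L appear: the encoding fixes the accessible spectrum,
# while the trainable angles set the Fourier coefficients.
```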

Provable Advantage in Quantum Phase Learning via Quantum Kernel Alphatron

It is proved that, under widely believed complexity-theoretic assumptions, the quantum phase learning problem cannot be efficiently solved by machine learning algorithms using classical resources and classical data, and the universality of the quantum kernel Alphatron in efficiently predicting quantum phases is also proved.

Learning quantum processes without input control

A general statistical learning theory is introduced for processes that take a classical random variable as input and output a quantum state, and an algorithm is provided for learning with high probability in this setting from a finite number of samples, even if the concept class is infinite.

The effect of the processing and measurement operators on the expressive power of quantum models

This work sketches the determining role that the processing and measurement operators play in the expressive power of simple quantum circuits and observes that increasing the number of parameterized and entangling gates leads to a more expressive model for certain circuit structures.

References

Showing 1-10 of 51 references

Quantum machine learning models are kernel methods

It is shown that most near-term and fault-tolerant quantum models can be replaced by a general support vector machine whose kernel computes distances between data-encoding quantum states, and kernel-based training is guaranteed to find better or equally good quantum models than variational circuit training.
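
A hedged sketch of this kernel-based replacement (a toy two-qubit angle encoding simulated classically with NumPy and scikit-learn; the dataset and encoding are illustrative assumptions, not the paper's): a support vector machine is trained with the fidelity kernel k(x, x') = |<phi(x)|phi(x')>|^2 between data-encoding states.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def encode(x):
    """Product angle encoding of a 2-feature point into a 2-qubit state."""
    q0 = np.array([np.cos(x[0] / 2), np.sin(x[0] / 2)])
    q1 = np.array([np.cos(x[1] / 2), np.sin(x[1] / 2)])
    return np.kron(q0, q1)

def fidelity_kernel(A, B):
    """Gram matrix of |<phi(a)|phi(b)>|^2 between two sets of points."""
    FA = np.array([encode(a) for a in A])
    FB = np.array([encode(b) for b in B])
    return np.abs(FA @ FB.T) ** 2

# Toy dataset labelled by the sign of sin(x0) * sin(x1).
X = rng.uniform(0, 2 * np.pi, size=(200, 2))
y = np.sign(np.sin(X[:, 0]) * np.sin(X[:, 1]))
X_train, X_test, y_train, y_test = X[:150], X[150:], y[:150], y[150:]

clf = SVC(kernel="precomputed").fit(fidelity_kernel(X_train, X_train), y_train)
acc = clf.score(fidelity_kernel(X_test, X_train), y_test)
print(f"test accuracy of the kernelized model: {acc:.2f}")
```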

Quantum embeddings for machine learning

This work proposes to train the first part of the circuit with the objective of maximally separating data classes in Hilbert space, a strategy it calls quantum metric learning, which provides a powerful analytic framework for quantum machine learning.

Power of data in quantum machine learning

This work shows that some problems that are classically hard to compute can be easily predicted by classical machines that learn from data, and proposes a projected quantum model that provides a simple and rigorous quantum speed-up for a learning problem in the fault-tolerant regime.
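
A rough sketch of a projected quantum kernel in the spirit described above (simulated classically on a small, entangled two-qubit encoding; the encoding and the bandwidth gamma are illustrative choices, not the paper's construction): each encoded state is projected onto its single-qubit reduced density matrices, and a Gaussian kernel is evaluated on those.

```python
import numpy as np

gamma = 1.0  # illustrative kernel bandwidth

def encode(x):
    """Toy 2-qubit encoding: RY angle encoding followed by a CNOT to entangle."""
    q0 = np.array([np.cos(x[0] / 2), np.sin(x[0] / 2)])
    q1 = np.array([np.cos(x[1] / 2), np.sin(x[1] / 2)])
    cnot = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])
    return cnot @ np.kron(q0, q1)

def reduced_density_matrices(psi):
    """Single-qubit reduced density matrices of a 2-qubit pure state."""
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    rho_0 = np.trace(rho, axis1=1, axis2=3)  # trace out qubit 1
    rho_1 = np.trace(rho, axis1=0, axis2=2)  # trace out qubit 0
    return rho_0, rho_1

def projected_kernel(x, xp):
    """Gaussian kernel on the Frobenius distance between reduced density matrices."""
    d2 = sum(np.linalg.norm(a - b) ** 2
             for a, b in zip(reduced_density_matrices(encode(x)),
                             reduced_density_matrices(encode(xp))))
    return np.exp(-gamma * d2)

print(projected_kernel(np.array([0.3, 1.2]), np.array([0.5, 0.7])))
```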

Quantum Machine Learning in Feature Hilbert Spaces.

This Letter interprets the process of encoding inputs in a quantum state as a nonlinear feature map that maps data to quantum Hilbert space and shows how it opens up a new avenue for the design of quantum machine learning algorithms.

Large-scale quantum machine learning

This work measures quantum kernels using randomized measurements to gain a quadratic speedup in computation time and quickly process large datasets, and efficiently encodes high-dimensional data into quantum computers with a number of features that scales linearly with the circuit depth.

Universal Approximation Property of Quantum Machine Learning Models in Quantum-Enhanced Feature Spaces.

This work proves that the machine learning models induced from the quantum-enhanced feature space are universal approximators of continuous functions under typical quantum feature maps, enabling an important theoretical analysis to ensure that machine learning algorithms based on quantum feature maps can handle a broad class of machine learning tasks.

Effect of data encoding on the expressive power of variational quantum-machine-learning models

It is shown that there exist quantum models which can realise all possible sets of Fourier coefficients, and therefore, if the accessible frequency spectrum is asymptotically rich enough, such models are universal function approximators.

Experimental kernel-based quantum machine learning in finite feature space

An all-optical setup demonstrating kernel-based quantum machine learning for two-dimensional classification problems using specialized multiphoton quantum optical circuits exhibits exponentially better scaling in the required number of qubits than a direct generalization of kernels described in the literature.

Encoding-dependent generalization bounds for parametrized quantum circuits

These results facilitate the selection of optimal data-encoding strategies via structural risk minimization, a mathematically rigorous framework for model selection, by bounding the complexity of PQC-based models as measured by the Rademacher complexity and the metric entropy, two complexity measures from statistical learning theory.

Differentiable Learning of Quantum Circuit Born Machine

This work devises an efficient gradient-based learning algorithm for the quantum circuit Born machine by minimizing the kernelized maximum mean discrepancy loss, and simulates generative modeling of the Bars-and-Stripes dataset and Gaussian mixture distributions using deep quantum circuits.
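
A small sketch of the kernelized MMD objective mentioned above (explicit probability vectors over three-bit strings and a Gaussian kernel stand in for the Born machine and its samples; all choices here are illustrative): the squared MMD between the model and data distributions is the quantity being minimized.

```python
import numpy as np

n_bits = 3
# All bitstrings of length n_bits, as integer vectors.
states = np.array([[int(b) for b in format(i, f"0{n_bits}b")] for i in range(2 ** n_bits)])

def gaussian_kernel_matrix(sigma=1.0):
    d2 = ((states[:, None, :] - states[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

K = gaussian_kernel_matrix()

def mmd2(p_model, p_data):
    """Squared MMD between two distributions over bitstrings: (p - q)^T K (p - q)."""
    diff = p_model - p_data
    return diff @ K @ diff

p_data = np.zeros(2 ** n_bits)
p_data[[0, 7]] = 0.5                              # target: equal weight on 000 and 111
p_model = np.full(2 ** n_bits, 1 / 2 ** n_bits)   # uniform initial model distribution
print(f"MMD^2(uniform, target) = {mmd2(p_model, p_data):.4f}")
print(f"MMD^2(target, target)  = {mmd2(p_data, p_data):.4f}")
```
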
...