Corpus ID: 220424468

Reformulation of the No-Free-Lunch Theorem for Entangled Data Sets

@article{Sharma2020ReformulationOT,
  title={Reformulation of the No-Free-Lunch Theorem for Entangled Data Sets},
  author={Kunal Sharma and M. Cerezo and Z. Holmes and L. Cincio and A. Sornborger and Patrick J. Coles},
  journal={arXiv preprint arXiv:2007.04900},
  year={2020}
}
The No-Free-Lunch (NFL) theorem is a celebrated result in learning theory that limits one's ability to learn a function with a training data set. With the recent rise of quantum machine learning, it is natural to ask whether there is a quantum analog of the NFL theorem, which would restrict a quantum computer's ability to learn a unitary process (the quantum analog of a function) with quantum training data. However, in the quantum setting, the training data can possess entanglement, a strong …
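As context for the abstract above, the risk of approximating an unknown unitary U with a hypothesis unitary V is commonly defined via the average fidelity over Haar-random input states; a standard sketch (our notation, not necessarily the paper's exact convention) is:

```latex
R_U(V) \;=\; \int d\psi \,\Big(1 - \big|\langle \psi | V^\dagger U | \psi \rangle\big|^2\Big)
       \;=\; 1 - \frac{d + |\operatorname{Tr}(V^\dagger U)|^2}{d(d+1)},
```

where d is the Hilbert-space dimension and the closed form follows from the standard Haar integral $\int d\psi\, |\langle\psi|W|\psi\rangle|^2 = \bigl(d + |\operatorname{Tr} W|^2\bigr)/\bigl(d(d+1)\bigr)$. NFL-style theorems lower-bound this risk, averaged over target unitaries, in terms of the size of the training set.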
Experimental Quantum Learning of a Spectral Decomposition
Currently available quantum hardware allows for small scale implementations of quantum machine learning algorithms. Such experiments aid the search for applications of quantum computers by …
Variational Hamiltonian Diagonalization for Dynamical Quantum Simulation
Dynamical quantum simulation may be one of the first applications to see quantum advantage. However, the circuit depth of standard Trotterization methods can rapidly exceed the coherence time of …
Universal Adversarial Examples and Perturbations for Quantum Classifiers
This paper proves that for a set of k classifiers, each receiving input data of n qubits, an O(ln k / 2^n) increase of the perturbation strength is enough to ensure a moderate universal adversarial risk.
Barren Plateaus Preclude Learning Scramblers.
A no-go theorem for learning an unknown scrambling process with QML is proved, showing that it is highly probable for any variational Ansatz to have a barren plateau landscape, i.e., cost gradients that vanish exponentially in the system size.
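The concentration phenomenon behind barren plateaus can be illustrated numerically: for states drawn Haar-randomly from the full Hilbert space, a local cost such as ⟨ψ|Z₀|ψ⟩ concentrates around its mean with variance 1/(d+1), i.e. exponentially small in the number of qubits. A minimal NumPy sketch (Haar-random states stand in for the outputs of a deep random circuit; `cost_variance` is our own illustrative helper, not code from any of the papers above):

```python
import numpy as np

def haar_state(d, rng):
    # Sample a Haar-random pure state of dimension d by normalizing
    # a complex Gaussian vector.
    v = rng.standard_normal(d) + 1j * rng.standard_normal(d)
    return v / np.linalg.norm(v)

def cost_variance(n_qubits, n_samples=2000, seed=0):
    # Variance over Haar-random states of the cost <psi|Z_0|psi>,
    # where Z_0 is Pauli-Z on the first qubit (diagonal, eigenvalues +-1).
    d = 2 ** n_qubits
    z0 = np.where(np.arange(d) < d // 2, 1.0, -1.0)
    rng = np.random.default_rng(seed)
    costs = [np.sum(z0 * np.abs(haar_state(d, rng)) ** 2)
             for _ in range(n_samples)]
    return np.var(costs)

# Analytically Var = 1/(d+1), so the spread halves with each added qubit.
for n in (2, 4, 6):
    print(n, cost_variance(n))
```

Sampling full Haar-random states is the simplest stand-in for the "2-design" behavior of deep random circuits that the barren plateau results invoke; simulating actual parameterized circuits would show the same exponential decay of gradient variance.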
Absence of Barren Plateaus in Quantum Convolutional Neural Networks
This work rigorously analyzes the gradient scaling for the parameters in the QCNN architecture and finds that the variance of the gradient vanishes no faster than polynomially, implying that QCNNs do not exhibit barren plateaus.
Variational Quantum Algorithms
An overview of the field of Variational Quantum Algorithms is presented, and strategies to overcome their challenges, as well as the exciting prospects for using them as a means to obtain quantum advantage, are discussed.
Quantum machine learning of graph-structured data
Graph structures are ubiquitous throughout the natural sciences. Here we consider graph-structured quantum data and describe how to carry out its quantum machine learning via quantum neural networks. …
Entangled Datasets for Quantum Machine Learning
Louis Schatzki, Andrew Arrasmith, Patrick J. Coles, and M. Cerezo (Theoretical Division, Los Alamos National Laboratory, Los Alamos, NM 87545, USA; Department of Electrical and …)
Long-time simulations with high fidelity on quantum hardware
Connecting ansatz expressibility to gradient magnitudes and barren plateaus

References

Showing 1-10 of 65 references
No Free Lunch for Quantum Machine Learning
This work establishes a lower bound on the quantum risk of a quantum learning algorithm trained via pairs of input and output states, averaged over training pairs and unitaries.
Entanglement-assisted guessing of complementary measurement outcomes
Heisenberg's uncertainty principle implies that if one party (Alice) prepares a system and randomly measures one of two incompatible observables, then another party (Bob) cannot perfectly predict the …
Variational quantum state diagonalization
Variational hybrid quantum-classical algorithms are promising candidates for near-term implementation on quantum computers. In these algorithms, a quantum computer evaluates the cost of a gate …
Hybrid quantum-classical approach to correlated materials
This work shows that by using a hybrid quantum-classical algorithm that incorporates the power of a small quantum computer into a framework of classical embedding algorithms, the electronic structure of complex correlated materials can be efficiently tackled.
Quantum assisted quantum compiling
This work proposes a variational hybrid quantum-classical algorithm called quantum-assisted quantum compiling (QAQC) and presents both gradient-free and gradient-based approaches to minimizing its cost function.
The quest for a Quantum Neural Network
This article presents a systematic approach to QNN research, concentrating on Hopfield-type networks and the task of associative memory, and outlines the challenge of combining the nonlinear, dissipative dynamics of neural computing and the linear, unitary dynamics of quantum computing.
General teleportation channel, singlet fraction and quasi-distillation
We prove a theorem on the direct relation between the optimal fidelity $f_{max}$ of teleportation and the maximal singlet fraction $F_{max}$ attainable by means of trace-preserving LQCC action (local …
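The relation this reference proves links the two quantities directly; for a d-dimensional system it takes the form (restated here as the standard result, consult the paper for the precise statement and conditions):

```latex
f_{\max} \;=\; \frac{F_{\max}\, d + 1}{d + 1},
```

so maximizing the singlet fraction of a shared state under trace-preserving LQCC is equivalent to maximizing the fidelity of the teleportation channel it supports.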
Barren plateaus in quantum neural network training landscapes
It is shown that for a wide class of reasonable parameterized quantum circuits, the probability that the gradient along any reasonable direction is non-zero to some fixed precision is exponentially small as a function of the number of qubits.
The theory of variational hybrid quantum-classical algorithms
Many quantum algorithms have daunting resource requirements when compared to what is available today. To address this discrepancy, a quantum-classical hybrid optimization scheme known as ‘the …
Trainability of Dissipative Perceptron-Based Quantum Neural Networks
This work analyzes the gradient scaling (and hence the trainability) of a recently proposed architecture called dissipative QNNs (DQNNs), in which the input qubits of each layer are discarded at the layer's output, and finds that DQNNs can exhibit barren plateaus.