Stochastic gradient descent for hybrid quantum-classical optimization

@article{Sweke2020StochasticGD,
  title={Stochastic gradient descent for hybrid quantum-classical optimization},
  author={Ryan Sweke and Frederik Wilde and Johannes Jakob Meyer and Maria Schuld and Paul K. F{\"a}hrmann and Barth{\'e}l{\'e}my Meynard-Piganeau and Jens Eisert},
  journal={ArXiv},
  year={2020},
  volume={abs/1910.01155}
}
Within the context of hybrid quantum-classical optimization, gradient-descent-based optimizers typically require the evaluation of expectation values with respect to the outcomes of parameterized quantum circuits. In this work, we explore the consequences of the prior observation that estimating these quantities on quantum hardware results in a form of stochastic gradient descent optimization. We formalize this notion, which allows us to show that in many relevant cases, including VQE, QAOA…
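The observation in the abstract can be made concrete with a toy model. Below is a minimal, hypothetical sketch (plain NumPy, not code from the paper): the single-qubit circuit RX(θ)|0⟩ has exact expectation ⟨Z⟩ = cos θ, and estimating it from a finite number of measurement shots turns the parameter-shift gradient into an unbiased but noisy estimator, so the resulting optimization is stochastic gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

def expval_z(theta, shots=None):
    """<Z> for the state RX(theta)|0>, which is exactly cos(theta).
    With finite `shots`, return a sample-mean estimate (shot noise)."""
    exact = np.cos(theta)
    if shots is None:
        return exact
    p0 = (1.0 + exact) / 2.0           # P(measure |0>)
    n0 = rng.binomial(shots, p0)
    return (2.0 * n0 - shots) / shots  # empirical <Z> from counts

def shift_grad(theta, shots=None):
    """Parameter-shift gradient of <Z>; with finite shots this is an
    unbiased but noisy estimate, so SGD convergence theory applies."""
    return 0.5 * (expval_z(theta + np.pi / 2, shots)
                  - expval_z(theta - np.pi / 2, shots))

# Minimize the cost <Z> = cos(theta) with stochastic gradient descent.
theta, lr = 0.4, 0.2
for _ in range(200):
    theta -= lr * shift_grad(theta, shots=100)

print(theta)  # approaches pi, where cos(theta) = -1
```

With `shots=None` the same loop is ordinary gradient descent; the shot count is the knob that controls the variance of each gradient estimate.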

Noise-resilient variational hybrid quantum-classical optimization
TLDR
This work considers a minimization problem with respect to a variational state, iteratively obtained via a parametric quantum circuit, taking into account both the role of noise and the stochastic nature of quantum measurement outcomes, and shows the robustness of the algorithm against different noise strengths.
Large gradients via correlation in random parameterized quantum circuits
TLDR
It is proved that reducing the dimensionality of the parameter space by utilizing circuit modules containing spatially or temporally correlated gate layers can allow one to circumvent the vanishing gradient phenomenon.
Estimating the gradient and higher-order derivatives on quantum hardware
The authors show how to evaluate, with near-term quantum computers, higher-order derivatives of expectation values with respect to the variational parameters of quantum circuits.
Foundations for Bayesian inference with engineered likelihood functions for robust amplitude estimation
TLDR
This work shows how the ELF formalism enhances the rate of information gain in sampling as the physical hardware transitions from the regime of noisy intermediate-scale quantum computers into that of quantum error corrected ones.
On the learnability of quantum neural networks
TLDR
This paper derives utility bounds of QNNs for empirical risk minimization, shows that large gate noise, few quantum measurements, and deep circuits lead to poor utility bounds, and proves that a QNN can be treated as a differentially private model.
Optimizing parametrized quantum circuits via noise-induced breaking of symmetries
TLDR
An optimization method called Symmetry-based Minima Hopping (SYMH) is introduced, which exploits the underlying symmetries in PQCs to hop between local minima in the cost landscape; SYMH is shown to improve overall optimizer performance.
Space-efficient binary optimization for variational computing
TLDR
This paper shows that it is possible to greatly reduce the number of qubits needed for the Traveling Salesman problem, a paradigmatic optimization task, at the cost of having deeper variational circuits, and claims that the approach can be generalized for other problems where the standard bit-encoding is highly inefficient.
The power of quantum neural networks
TLDR
This work is the first to demonstrate that well-designed quantum neural networks offer an advantage over classical neural networks through a higher effective dimension and faster training, which is verified on real quantum hardware.
Equivalence of quantum barren plateaus to cost concentration and narrow gorges
TLDR
This work analytically proves the connection between three different landscape features that have been observed for PQCs: exponentially vanishing gradients, exponential cost concentration about the mean, and the exponential narrowness of minima.
Variational Quantum Linear Solver.
TLDR
A hybrid quantum-classical algorithm called the Variational Quantum Linear Solver (VQLS) is proposed for solving linear systems on near-term quantum computers, and an operationally meaningful termination condition is derived that guarantees a desired solution precision $\epsilon$ is achieved.

References

Showing 1-10 of 68 references
Evaluating analytic gradients on quantum hardware
TLDR
This paper shows how gradients of expectation values of quantum measurements can be estimated using the same, or almost the same, architecture that executes the original circuit, and proposes recipes for the computation of gradients for continuous-variable circuits.
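To see why such gradients are called "analytic": an expectation value generated by a single Pauli rotation takes the form E(θ) = a cos θ + b sin θ + c, and the parameter-shift rule recovers its exact derivative from two circuit evaluations, whereas finite differences incur truncation error. A small sketch (a, b, c are arbitrary illustrative constants, not values from the cited paper):

```python
import numpy as np

# Any expectation value generated by a single Pauli rotation has the
# form E(theta) = a*cos(theta) + b*sin(theta) + c.
a, b, c = 0.7, -0.3, 0.1
E = lambda t: a * np.cos(t) + b * np.sin(t) + c
dE = lambda t: -a * np.sin(t) + b * np.cos(t)   # true derivative

theta = 0.9

# Parameter-shift rule: two evaluations at macroscopic shifts, yet EXACT.
ps = 0.5 * (E(theta + np.pi / 2) - E(theta - np.pi / 2))

# Central finite difference: only approximate, with O(h^2) error.
h = 0.1
fd = (E(theta + h) - E(theta - h)) / (2 * h)

print(ps - dE(theta))   # zero up to float rounding
print(fd - dE(theta))   # nonzero truncation error
```

The macroscopic π/2 shift also matters on hardware: unlike a finite difference with small h, the two evaluations are not differences of nearly equal noisy numbers.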
Noise-resilient variational hybrid quantum-classical optimization
TLDR
This work considers a minimization problem with respect to a variational state, iteratively obtained via a parametric quantum circuit, taking into account both the role of noise and the stochastic nature of quantum measurement outcomes, and shows the robustness of the algorithm against different noise strengths.
Robust implementation of generative modeling with parametrized quantum circuits
TLDR
The gradient-free optimization algorithms show outstanding performance compared to the gradient-based solver, and one of them performs better when handling the unavoidably noisy objective function minimized under experimental conditions.
A Universal Training Algorithm for Quantum Deep Learning
We introduce the Backwards Quantum Propagation of Phase errors (Baqprop) principle, a central theme upon which we construct multiple universal optimization heuristics for training both parametrized quantum circuits and classical neural networks.
Optimizing quantum optimization algorithms via faster quantum gradient computation
TLDR
A quantum algorithm that computes the gradient of a multi-variate real-valued function by evaluating it at only a logarithmic number of points in superposition is developed, and it is shown that for low-degree multivariate polynomials the algorithm can provide exponential speedups compared to Jordan's algorithm in terms of the dimension $d$.
Training of quantum circuits on a hybrid quantum computer
TLDR
This study trains generative modeling circuits on a quantum hybrid computer showing an optimization strategy and a resource trade-off and shows that the convergence of the quantum circuit to the target distribution depends critically on both the quantum hardware and classical optimization strategy.
Self-verifying variational quantum simulation of lattice models
TLDR
Experiments are presented that demonstrate self-verifying, hybrid, variational quantum simulation of lattice models in condensed matter and high-energy physics, enabling the study of a wide variety of previously intractable target models.
Architectures for quantum simulation showing a quantum speedup
TLDR
This work shows that benchmark settings exhibiting a quantum speedup may require little control in contrast to universal quantum computing, and proposes versatile and feasible schemes of two-dimensional dynamical quantum simulators showing such a quantum speedup.
PennyLane: Automatic differentiation of hybrid quantum-classical computations
TLDR
PennyLane's core feature is the ability to compute gradients of variational quantum circuits in a way that is compatible with classical techniques such as backpropagation, and it extends the automatic differentiation algorithms common in optimization and machine learning to include quantum and hybrid computations.
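The hybrid differentiation described here can be sketched in a few lines: a classical pre-processing step, a quantum expectation value (here replaced by its closed form cos θ), and a classical loss are glued together by the ordinary chain rule, with the quantum piece differentiated via the parameter shift. This is a toy illustration of the mechanism such a framework automates, not PennyLane's actual implementation:

```python
import numpy as np

# Hybrid pipeline:
#   x --(classical: theta = w*x)--> quantum <Z> = cos(theta)
#     --(classical: loss = (<Z> - target)^2)
expval = lambda theta: np.cos(theta)   # stand-in for hardware/simulator

def hybrid_loss_and_grad(w, x=0.5, target=0.0):
    theta = w * x                      # classical pre-processing
    e = expval(theta)                  # quantum node
    loss = (e - target) ** 2           # classical post-processing
    # chain rule: dloss/dw = dloss/de * de/dtheta * dtheta/dw,
    # with the quantum factor from the parameter-shift rule
    dloss_de = 2 * (e - target)
    de_dtheta = 0.5 * (expval(theta + np.pi / 2)
                       - expval(theta - np.pi / 2))
    dtheta_dw = x
    return loss, dloss_de * de_dtheta * dtheta_dw

# Train the classical weight w end-to-end through the quantum node.
w = 1.0
for _ in range(100):
    loss, g = hybrid_loss_and_grad(w)
    w -= 0.5 * g
```

Because the quantum factor is obtained from circuit evaluations alone, the same chain-rule bookkeeping works whether the expectation comes from a simulator (where backpropagation is possible) or from hardware (where it is not).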
Quantum Approximate Optimization Algorithm: Performance, Mechanism, and Implementation on Near-Term Devices
TLDR
An in-depth study of the performance of QAOA on MaxCut problems is provided by developing an efficient parameter-optimization procedure and revealing its ability to exploit non-adiabatic operations, illustrating that parameter optimization will matter only for problem sizes beyond the reach of numerical simulation yet accessible on near-term devices.