Analyzing the barren plateau phenomenon in training quantum neural network with the ZX-calculus

@article{Zhao2021AnalyzingTB,
  title={Analyzing the barren plateau phenomenon in training quantum neural network with the ZX-calculus},
  author={Chen Zhao and Xiao-Shan Gao},
  journal={Quantum},
  year={2021},
  volume={5},
  pages={466}
}
In this paper, we propose a general scheme to analyze the gradient vanishing phenomenon, also known as the barren plateau phenomenon, in training quantum neural networks with the ZX-calculus. More precisely, we extend the barren plateaus theorem from unitary 2-design circuits to any parameterized quantum circuits under certain reasonable assumptions. The main technical contribution of this paper is representing certain integrations as ZX-diagrams and computing them with the ZX-calculus. The… 
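The barren plateau phenomenon the abstract refers to is the exponential decay, with qubit count, of the variance of cost-function gradients for sufficiently random circuits. A minimal numerical sketch (not from the paper, and not using the ZX-calculus — just Haar-random unitaries as a stand-in for a 2-design ansatz) can make this concrete; `haar_unitary` and `grad_variance` are illustrative helpers, not functions from the paper:

```python
# Illustrative sketch: estimate Var[dE/dθ] for E(θ) = <0|W† R(θ)† V† H V R(θ) W|0>
# with V, W Haar-random, R(θ) = exp(-iθ Z_0 / 2), and H = Z on qubit 0.
# The variance shrinks roughly exponentially in the number of qubits.
import numpy as np

def haar_unitary(dim, rng):
    """Sample a Haar-random unitary via QR decomposition of a Ginibre matrix."""
    z = (rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # fix column phases so the distribution is Haar

def grad_variance(n_qubits, n_samples, rng):
    """Sample dE/dθ at θ = 0 over random (V, W) and return its variance."""
    dim = 2 ** n_qubits
    # H = Z ⊗ I ⊗ ... ⊗ I is diagonal: +1 on the first half, -1 on the second.
    h = np.kron(np.array([1.0, -1.0]), np.ones(dim // 2))
    z0 = h  # generator Z_0 of the rotation has the same diagonal
    grads = []
    for _ in range(n_samples):
        v, w = haar_unitary(dim, rng), haar_unitary(dim, rng)
        psi0 = w[:, 0]                    # W|0>
        psi = v @ psi0                    # V R(0) W|0>, since R(0) = I
        dpsi = v @ (-0.5j * z0 * psi0)    # V (dR/dθ)|θ=0 W|0>
        grads.append(2 * np.real(np.conj(dpsi) @ (h * psi)))  # 2 Re <dψ|H|ψ>
    return np.var(grads)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for n in (2, 4, 6):
        print(n, grad_variance(n, 500, rng))  # variance drops sharply with n
```

The decay of these sample variances is exactly the trainability obstruction that the paper's ZX-calculus technique analyzes symbolically, for circuits beyond the unitary 2-design case simulated here.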

Figures from this paper

Diagrammatic Differentiation for Quantum Machine Learning
TLDR
This work introduces diagrammatic differentiation for tensor calculus by generalising the dual number construction from rigs to monoidal categories and extending the method to the automatic differentiation of hybrid classical-quantum circuits.
A semi-agnostic ansatz with variable structure for quantum machine learning
Variational inference with a quantum computer
TLDR
This work uses quantum Born machines as variational distributions over discrete variables and adopts two specific realizations: one with an adversarial objective and one based on the kernelized Stein discrepancy to enable efficient variational inference with distributions beyond those that are efficiently representable on a classical computer.
Diagnosing barren plateaus with tools from quantum optimal control
Martín Larocca, Piotr Czarnik, Kunal Sharma, Gopikrishnan Muraleedharan, Patrick J. Coles, and M. Cerezo
Variational Power of Quantum Circuit Tensor Networks
Reza Haghshenas, Johnnie Gray, Andrew C. Potter, and Garnet Kin-Lic Chan
Diagrammatic Analysis for Parameterized Quantum Circuits
TLDR
Extensions of the ZX-calculus especially suitable for parameterized quantum circuits are described, in particular for computing observable expectation values as functions of the circuit parameters, which are important algorithmic quantities in applications ranging from combinatorial optimization to quantum chemistry.
QDNN: deep neural networks with quantum layers
TLDR
It is proved that the QDNN can uniformly approximate any continuous function and has more representation power than the classical DNN.
Classical Splitting of Parametrized Quantum Circuits
Cenk Tüysüz, Giuseppe Clemente, Arianna Crippa, Tobias Hartung, Stefan Kühn, and Karl Jansen (Deutsches Elektronen-Synchrotron, DESY)
Unsupervised quantum machine learning for fraud detection
TLDR
This work develops quantum protocols for anomaly detection and applies them to the task of credit card fraud detection, observing that quantum fraud detection can challenge equivalent classical protocols as the number of features (equal to the number of qubits used for data embedding) increases.
...

References

SHOWING 1-10 OF 58 REFERENCES
Absence of Barren Plateaus in Quantum Convolutional Neural Networks
TLDR
This work rigorously analyzes the gradient scaling for the parameters in the QCNN architecture and finds that the variance of the gradient vanishes no faster than polynomially, implying that QCNNs do not exhibit barren plateaus.
Toward Trainability of Quantum Neural Networks.
TLDR
It is proved that QNNs with tree tensor architectures have gradients that vanish polynomially with the qubit number, and this result holds irrespective of which encoding methods are employed.
Entanglement Induced Barren Plateaus
TLDR
It is shown that for any bounded objective function on the visible layers, the Lipschitz constants of the expectation value of that objective function will scale inversely with the dimension of the hidden subsystem with high probability, and how this can cause both gradient descent and gradient-free methods to fail.
Barren plateaus in quantum neural network training landscapes
TLDR
It is shown that for a wide class of reasonable parameterized quantum circuits, the probability that the gradient along any reasonable direction is non-zero to some fixed precision is exponentially small as a function of the number of qubits.
Noise-induced barren plateaus in variational quantum algorithms
TLDR
This work rigorously proves a serious limitation for noisy VQAs, in that the noise causes the training landscape to have a barren plateau, and proves that the gradient vanishes exponentially in the number of qubits n if the depth of the ansatz grows linearly with n.
Trainability of Dissipative Perceptron-Based Quantum Neural Networks
TLDR
This work represents the first rigorous analysis of the scalability of a perceptron-based QNN and provides quantitative bounds on the scaling of the gradient for DQNNs under different conditions, such as different cost functions and circuit depths.
Cost-Function-Dependent Barren Plateaus in Shallow Quantum Neural Networks
TLDR
Two results are rigorously proved that establish a connection between locality and trainability in VQAs and illustrate these ideas with large-scale simulations of a particular VQA known as quantum autoencoders.
Entanglement devised barren plateau mitigation
TLDR
This work defines barren plateaus in terms of random entanglement and proposes and demonstrates a number of barren plateau ameliorating techniques, including initial partitioning of cost-function and non-cost-function registers, meta-learning of low-entanglement circuit initializations, selective inter-register interaction, entanglement regularization, and rotation into preferred cost-function eigenbases.
QDNN: DNN with Quantum Neural Network Layers
TLDR
This paper introduces a general quantum DNN, which consists of fully quantum structured layers with better representation power than the classical DNN while still keeping the advantages of the classical DNN, such as non-linear activation, the multi-layer structure, and the efficient backpropagation training algorithm.
Diagrammatic Design and Study of Ansätze for Quantum Machine Learning
TLDR
This thesis pioneers the use of diagrammatic techniques to reason with QML ansätze, taking commonly used QML ansatz circuits and converting them to diagrammatic form and giving a full description of how their gates commute, making the circuits much easier to analyse and simplify.
...