Analyzing the barren plateau phenomenon in training quantum neural network with the ZX-calculus

Chen Zhao and Xiao-Shan Gao
In this paper, we propose a general scheme to analyze the gradient vanishing phenomenon, also known as the barren plateau phenomenon, in training quantum neural networks with the ZX-calculus. More precisely, we extend the barren plateaus theorem from unitary 2-design circuits to any parameterized quantum circuits under certain reasonable assumptions. The main technical contribution of this paper is representing certain integrations as ZX-diagrams and computing them with the ZX-calculus. The… 
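The gradient-vanishing effect the abstract analyzes can be observed numerically. The following sketch (an illustration of the phenomenon, not the paper's ZX-calculus method; all function names and the ansatz choice are our own assumptions) uses the parameter-shift rule to estimate the variance, over random initializations, of one gradient component of a hardware-efficient RY/CZ ansatz, and shows that variance shrinking as the qubit count grows:

```python
# Illustrative sketch only: numpy estimate of gradient variance for a
# hardware-efficient RY/CZ ansatz, showing the barren plateau trend.
import numpy as np

rng = np.random.default_rng(0)

I2 = np.eye(2, dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_all(mats):
    out = np.array([[1.0 + 0j]])
    for m in mats:
        out = np.kron(out, m)
    return out

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def cz_chain(n):
    # Diagonal of the product of CZ gates on neighbouring qubit pairs.
    diag = np.ones(2 ** n, dtype=complex)
    for q in range(n - 1):
        for i in range(2 ** n):
            if (i >> (n - 1 - q)) & 1 and (i >> (n - 2 - q)) & 1:
                diag[i] *= -1
    return diag

def circuit_state(n, layers, thetas):
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0
    ent = cz_chain(n)
    k = 0
    for _ in range(layers):
        psi = kron_all([ry(thetas[k + q]) for q in range(n)]) @ psi
        psi = ent * psi
        k += n
    return psi

def grad_first_param(n, layers, thetas):
    # Parameter-shift rule for d<Z_0>/d(theta_0).
    obs = kron_all([Z] + [I2] * (n - 1)).diagonal().real
    def expval(t):
        psi = circuit_state(n, layers, t)
        return float(obs @ np.abs(psi) ** 2)
    plus, minus = thetas.copy(), thetas.copy()
    plus[0] += np.pi / 2
    minus[0] -= np.pi / 2
    return 0.5 * (expval(plus) - expval(minus))

# Variance over random initialisations shrinks as the register grows.
variances = {}
for n in (2, 4, 6):
    layers = 2 * n  # deeper circuits for wider registers
    grads = [grad_first_param(n, layers,
                              rng.uniform(0, 2 * np.pi, layers * n))
             for _ in range(50)]
    variances[n] = float(np.var(grads))
    print(n, variances[n])
```

This RY/CZ family is not an exact unitary 2-design, so the exponential rates proved for 2-designs do not apply verbatim, but the concentration trend is visible already at small sizes.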


Diagrammatic Differentiation for Quantum Machine Learning

This work introduces diagrammatic differentiation for tensor calculus by generalising the dual number construction from rigs to monoidal categories and extending the method to the automatic differentiation of hybrid classical-quantum circuits.

A semi-agnostic ansatz with variable structure for quantum machine learning

This work introduces a semi-agnostic ansatz with variable structure that iteratively grows and removes gates during optimization, keeping circuits shallow and mitigating trainability problems such as barren plateaus.

Variational inference with a quantum computer

This work uses quantum Born machines as variational distributions over discrete variables and adopts two specific realizations: one with an adversarial objective and one based on the kernelized Stein discrepancy to enable efficient variational inference with distributions beyond those that are efficiently representable on a classical computer.
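As a toy illustration of the Born machine idea (a minimal sketch of our own, not the paper's construction), the measurement probabilities |⟨x|ψ(θ)⟩|² of a parameterized state serve as a variational distribution over bitstrings from which samples can be drawn:

```python
# Minimal Born-machine sketch: sample bitstrings from the Born-rule
# distribution of a small parameterized state (illustration only).
import numpy as np

rng = np.random.default_rng(1)

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def born_probs(thetas):
    # Three qubits: RY layer, chain of CZs, second RY layer.
    psi = np.zeros(8)
    psi[0] = 1.0
    for layer in (thetas[:3], thetas[3:]):
        U = np.kron(np.kron(ry(layer[0]), ry(layer[1])), ry(layer[2]))
        psi = U @ psi
        for q in (0, 1):  # CZ on qubit pairs (0,1) and (1,2)
            for i in range(8):
                if (i >> (2 - q)) & 1 and (i >> (1 - q)) & 1:
                    psi[i] *= -1
    return psi ** 2  # Born rule; the state is real-valued here

thetas = rng.uniform(0, 2 * np.pi, 6)
probs = born_probs(thetas)
samples = rng.choice(8, size=5, p=probs)
print([format(int(s), "03b") for s in samples])
```

Training such a model then amounts to optimizing θ against an objective on the samples, e.g. an adversarial loss or the kernelized Stein discrepancy mentioned above.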

Diagnosing Barren Plateaus with Tools from Quantum Optimal Control

Variational Quantum Algorithms (VQAs) have received considerable attention due to their potential for achieving near-term quantum advantage. However, more work is needed to understand their…

Variational Power of Quantum Circuit Tensor Networks

Reza Haghshenas, Johnnie Gray, Andrew C. Potter, and Garnet Kin-Lic Chan, Division of Chemistry and Chemical Engineering, California Institute of Technology, Pasadena, California

Barren plateaus in quantum tensor network optimization

We analyze the barren plateau phenomenon in the variational optimization of quantum circuits inspired by matrix product states (qMPS), tree tensor networks (qTTN), and the multiscale entanglement renormalization ansatz (qMERA).

Diagrammatic Analysis for Parameterized Quantum Circuits

This work describes extensions of the ZX-calculus especially suited to parameterized quantum circuits, in particular for computing observable expectation values as functions of the circuit parameters, which are important algorithmic quantities in applications ranging from combinatorial optimization to quantum chemistry.
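For intuition about the quantity in question (a plain numpy check, not ZX-calculus itself): for the simplest one-qubit ansatz RY(θ)|0⟩, the expectation value of Z is the closed-form function cos θ, and the parameter-shift rule recovers its derivative −sin θ:

```python
# Expectation value of Z as a function of a circuit parameter, with its
# derivative via the parameter-shift rule (plain numpy check).
import numpy as np

def expval_z(theta):
    # <psi|Z|psi> for |psi> = RY(theta)|0> = (cos(theta/2), sin(theta/2))
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return c ** 2 - s ** 2  # equals cos(theta)

def shift_grad(theta):
    # Parameter-shift rule: f'(theta) = (f(theta+pi/2) - f(theta-pi/2)) / 2
    return 0.5 * (expval_z(theta + np.pi / 2) - expval_z(theta - np.pi / 2))

thetas = np.linspace(0, 2 * np.pi, 9)
print(np.allclose([expval_z(t) for t in thetas], np.cos(thetas)))
print(np.allclose([shift_grad(t) for t in thetas], -np.sin(thetas)))
```

Diagrammatic approaches aim to obtain such closed-form parameter dependence for much larger circuits, where the direct statevector computation above becomes infeasible.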

QDNN: deep neural networks with quantum layers

It is proved that the QDNN can uniformly approximate any continuous function and has more representation power than the classical DNN.

Classical Splitting of Parametrized Quantum Circuits

Barren plateaus appear to be a major obstacle to using variational quantum algorithms to simulate large-scale quantum systems or replace traditional machine learning algorithms. They can be caused by…

Absence of Barren Plateaus in Quantum Convolutional Neural Networks

This work rigorously analyzes the gradient scaling for the parameters in the QCNN architecture and finds that the variance of the gradient vanishes no faster than polynomially, implying that QCNNs do not exhibit barren plateaus.

Toward Trainability of Quantum Neural Networks.

It is proved that QNNs with tree tensor architectures have gradients that vanish only polynomially with the qubit number, and this result holds irrespective of which encoding method is employed.

Entanglement Induced Barren Plateaus

It is shown that for any bounded objective function on the visible layers, the Lipschitz constants of the expectation value of that objective function will scale inversely with the dimension of the hidden subsystem with high probability, and how this can cause both gradient descent and gradient-free methods to fail.

Barren plateaus in quantum neural network training landscapes

It is shown that for a wide class of reasonable parameterized quantum circuits, the probability that the gradient along any reasonable direction is non-zero to some fixed precision is exponentially small as a function of the number of qubits.

Noise-induced barren plateaus in variational quantum algorithms

This work rigorously proves a serious limitation for noisy VQAs: the noise causes the training landscape to have a barren plateau, with the gradient vanishing exponentially in the number of qubits n whenever the depth of the ansatz grows linearly with n.

Trainability of Dissipative Perceptron-Based Quantum Neural Networks

This work represents the first rigorous analysis of the scalability of a perceptron-based QNN and provides quantitative bounds on the scaling of the gradient for DQNNs under different conditions, such as different cost functions and circuit depths.

Cost-Function-Dependent Barren Plateaus in Shallow Quantum Neural Networks

Two results are rigorously proved that establish a connection between locality and trainability in VQAs, and these ideas are illustrated with large-scale simulations of a particular VQA known as the quantum autoencoder.

Entanglement devised barren plateau mitigation

This work defines barren plateaus in terms of random entanglement and proposes and demonstrates several barren plateau ameliorating techniques, including initial partitioning of cost-function and non-cost-function registers, meta-learning of low-entanglement circuit initializations, selective inter-register interaction, entanglement regularization, and rotation into preferred cost-function eigenbases.

QDNN: DNN with Quantum Neural Network Layers

This paper introduces a general quantum DNN, which consists of fully quantum structured layers with better representation power than the classical DNN while retaining its advantages, such as non-linear activation, the multi-layer structure, and the efficient backpropagation training algorithm.

Diagrammatic Design and Study of Ansätze for Quantum Machine Learning

This thesis pioneers the use of diagrammatic techniques to reason about QML ansätze, converting commonly used ansatz circuits to diagrammatic form and giving a full description of how their gates commute, making the circuits much easier to analyse and simplify.