Lattice gauge equivariant convolutional neural networks

  • Matteo Favoni, Andreas Ipp, David I. Müller, Daniel Schuh
  • Physical Review Letters, 128(3)
  • 2022
We propose lattice gauge equivariant convolutional neural networks (L-CNNs) for generic machine learning applications on lattice gauge theoretical problems. At the heart of this network structure is a novel convolutional layer that preserves gauge equivariance while forming arbitrarily shaped Wilson loops in successive bilinear layers. Together with topological information, for example, from Polyakov loops, such a network can, in principle, approximate any gauge covariant function on the… 
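The abstract's central objects, Wilson loops and gauge equivariance, can be made concrete with a small numerical sketch. The following NumPy code is illustrative only (it is not the paper's implementation, and all names are ours): it builds a random SU(2) lattice gauge field on a 2D periodic lattice, computes traced 1×1 Wilson loops (plaquettes), and checks that these traces are invariant under an arbitrary local gauge transformation, the property an L-CNN layer must preserve.

```python
import numpy as np

def random_su2(rng):
    # Random SU(2) matrix from a normalized quaternion (a0, a1, a2, a3):
    # U = [[a0 + i*a3, a2 + i*a1], [-a2 + i*a1, a0 - i*a3]], det U = |a|^2 = 1.
    a = rng.normal(size=4)
    a /= np.linalg.norm(a)
    return np.array([[a[0] + 1j * a[3],  a[2] + 1j * a[1]],
                     [-a[2] + 1j * a[1], a[0] - 1j * a[3]]])

def plaquette_traces(U):
    # U has shape (L, L, 2, 2, 2): site (x, y), direction mu, 2x2 link matrix.
    L = U.shape[0]
    P = np.empty((L, L), dtype=complex)
    for x in range(L):
        for y in range(L):
            xp, yp = (x + 1) % L, (y + 1) % L
            # 1x1 Wilson loop: U_x(x) U_y(x+e_x) U_x(x+e_y)^dag U_y(x)^dag
            P[x, y] = np.trace(U[x, y, 0] @ U[xp, y, 1]
                               @ U[x, yp, 0].conj().T @ U[x, y, 1].conj().T)
    return P

def gauge_transform(U, Omega):
    # Local gauge transformation: U_mu(x) -> Omega(x) U_mu(x) Omega(x + e_mu)^dag
    L = U.shape[0]
    V = np.empty_like(U)
    for x in range(L):
        for y in range(L):
            V[x, y, 0] = Omega[x, y] @ U[x, y, 0] @ Omega[(x + 1) % L, y].conj().T
            V[x, y, 1] = Omega[x, y] @ U[x, y, 1] @ Omega[x, (y + 1) % L].conj().T
    return V

rng = np.random.default_rng(0)
L = 4
U = np.array([[[random_su2(rng) for _ in range(2)] for _ in range(L)] for _ in range(L)])
Omega = np.array([[random_su2(rng) for _ in range(L)] for _ in range(L)])
# Traced closed loops are gauge invariant: the Omega factors cancel in the trace.
assert np.allclose(plaquette_traces(U), plaquette_traces(gauge_transform(U, Omega)))
```

Larger Wilson loops follow the same pattern (longer products of links around a closed path), which is what the paper's bilinear layers build up iteratively.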
Lattice gauge symmetry in neural networks
We review a novel neural network architecture called lattice gauge equivariant convolutional neural networks (L-CNNs), which can be applied to generic machine learning problems in lattice gauge theory.
Geometric Deep Learning and Equivariant Neural Networks
The mathematical foundations of geometric deep learning are surveyed, focusing on group equivariant and gauge equivariant neural networks and on the use of Fourier analysis involving Wigner matrices, spherical harmonics and Clebsch–Gordan coefficients for G = SO(3), illustrating the power of representation theory for deep learning.
Homogeneous vector bundles and G-equivariant convolutional neural networks
It is demonstrated that homogeneous vector bundles are the natural setting for GCNNs, and reproducing kernel Hilbert spaces are used to obtain a precise criterion for expressing G-equivariant layers as convolutional layers.
Generalization capabilities of translationally equivariant neural networks
This work focuses on complex scalar field theory on a two-dimensional lattice and investigates the benefits of group equivariant convolutional neural network architectures based on the translation group, demonstrating that in most of these tasks the best equivariant architectures perform and generalize significantly better than their non-equivariant counterparts.
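Translation equivariance, the property these architectures exploit, is easy to verify for a convolution with periodic boundary conditions: shifting the input field and then convolving gives the same result as convolving and then shifting. A minimal sketch (illustrative, not the paper's code):

```python
import numpy as np

def conv2d_periodic(field, kernel):
    # Naive 2D cross-correlation with periodic boundary conditions,
    # matching the translation symmetry of a field on a periodic lattice.
    L = field.shape[0]
    k = kernel.shape[0]
    out = np.zeros_like(field)
    for x in range(L):
        for y in range(L):
            for i in range(k):
                for j in range(k):
                    out[x, y] += kernel[i, j] * field[(x + i) % L, (y + j) % L]
    return out

rng = np.random.default_rng(1)
phi = rng.normal(size=(8, 8))   # toy scalar field configuration
w = rng.normal(size=(3, 3))     # filter weights
shifted = np.roll(phi, shift=(2, 5), axis=(0, 1))
# Equivariance: conv(T phi) == T conv(phi) for any lattice translation T.
assert np.allclose(conv2d_periodic(shifted, w),
                   np.roll(conv2d_periodic(phi, w), shift=(2, 5), axis=(0, 1)))
```

Stacking such layers (with pointwise nonlinearities) keeps the whole network translation equivariant, which is the starting point the gauge equivariant constructions above generalize.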
Generalization capabilities of neural networks in lattice applications
It is demonstrated that in most of these tasks the best equivariant architectures can perform and generalize significantly better than their non-equivariant counterparts, which applies not only to physical parameters beyond those represented in the training set, but also to different lattice sizes.
Gauge equivariant neural networks for quantum lattice gauge theories
Gauge equivariant neural-network quantum states are introduced, which exactly satisfy the local Hilbert space constraints necessary for the description of quantum lattice gauge theory with a Z_d gauge group on different geometries.
Applying machine learning methods to prediction problems of lattice observables
It is verified that the neural network constructs a gauge-invariant function and that this property holds over the entire range of the parameter space.
Deep Learning Hamiltonian Monte Carlo
The Hamiltonian Monte Carlo algorithm is generalized with a stack of neural network layers, and its ability to sample from different topologies in a two-dimensional lattice gauge theory is evaluated.
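The starting point of such generalizations is the standard leapfrog-based HMC update. A minimal Python sketch of plain HMC on a toy Gaussian target (illustrative only; the paper's neural-network layers are not reproduced here):

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, n_steps):
    # Standard leapfrog integrator for Hamiltonian dynamics;
    # generalizations replace parts of this update with learned layers.
    p = p - 0.5 * eps * grad_U(q)
    for _ in range(n_steps - 1):
        q = q + eps * p
        p = p - eps * grad_U(q)
    q = q + eps * p
    p = p - 0.5 * eps * grad_U(q)
    return q, p

def hmc_step(q, U, grad_U, rng, eps=0.1, n_steps=20):
    p = rng.normal(size=q.shape)                # resample momenta
    H0 = U(q) + 0.5 * np.sum(p**2)
    q_new, p_new = leapfrog(q, p, grad_U, eps, n_steps)
    H1 = U(q_new) + 0.5 * np.sum(p_new**2)
    # Metropolis accept/reject keeps the sampler statistically exact.
    if rng.uniform() < np.exp(H0 - H1):
        return q_new
    return q

# Toy target: standard Gaussian, U(q) = q^2 / 2, so grad_U(q) = q.
U = lambda q: 0.5 * np.sum(q**2)
grad_U = lambda q: q
rng = np.random.default_rng(3)
q = np.zeros(1)
samples = []
for _ in range(2000):
    q = hmc_step(q, U, grad_U, rng)
    samples.append(q[0])
print(np.var(samples))   # close to 1 for a standard Gaussian target
```

For lattice gauge theory, q would be the field configuration and U the lattice action; the sampling difficulty across topological sectors mentioned in the summary arises because leapfrog trajectories rarely cross the action barriers between sectors.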
Neural quantum states for supersymmetric quantum gauge theories
Supersymmetric quantum gauge theories are important mathematical tools in high energy physics. As an example, supersymmetric matrix models can be used as a holographic description of quantum black holes.
A method to challenge symmetries in data with self-supervised learning
This work proposes a practical and general method to test suspect symmetries in data, one that can handle sculpted data, which often arise from inefficiencies or deliberate selections and can give the illusion of asymmetry if mistreated.


Universality of Deep Convolutional Neural Networks
  • Ding-Xuan Zhou
  • Computer Science
    Applied and Computational Harmonic Analysis
  • 2020
Reducing autocorrelation times in lattice simulations with generative adversarial networks
This work uses a generative adversarial network (GAN) and proposes to address difficulties regarding its statistical exactness through the implementation of an overrelaxation step, realized by searching the latent space of the trained generator network.
Introduction to Quantum Fields on a Lattice
Contents: Preface; 1. Introduction; 2. Path integral and lattice regularisation; 3. O(n) models; 4. Gauge field on the lattice; 5. U(1) and SU(n) gauge theory; 6. Fermions on the lattice; 7. Low mass hadrons in QCD.
Implicit schemes for real-time lattice gauge theory
  • A. Ipp, D. Müller
  • Physics, Computer Science
    The European Physical Journal C: Particles and Fields
  • 2018
A new semi-implicit scheme is used to cure a numerical instability encountered in three-dimensional classical Yang-Mills simulations of heavy-ion collisions by allowing for wave propagation along one lattice direction free of numerical dispersion.
Comparison of topological charge definitions in Lattice QCD
In this paper, we show a comparison of different definitions of the topological charge on the lattice. We concentrate on one small-volume ensemble with 2 flavours of dynamical, maximally twisted mass…
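One common lattice definition, in the simple case of compact U(1) gauge theory in two dimensions, is the geometric (field-theoretic) charge built from plaquette phases, which evaluates to an exact integer on a periodic lattice. A minimal NumPy sketch (illustrative; not one of the QCD definitions compared in the paper):

```python
import numpy as np

def u1_topological_charge(theta):
    # theta has shape (L, L, 2): link angles theta_mu(x) for a compact
    # U(1) gauge field on a 2D periodic lattice.
    # Geometric definition: map each plaquette angle to its principal
    # value in (-pi, pi], sum over the lattice, divide by 2*pi.
    plaq = (theta[:, :, 0]
            + np.roll(theta, -1, axis=0)[:, :, 1]
            - np.roll(theta, -1, axis=1)[:, :, 0]
            - theta[:, :, 1])
    phase = np.angle(np.exp(1j * plaq))   # principal value in (-pi, pi]
    return np.sum(phase) / (2 * np.pi)

rng = np.random.default_rng(2)
theta = rng.uniform(-np.pi, np.pi, size=(8, 8, 2))
Q = u1_topological_charge(theta)
assert np.isclose(Q, round(Q))   # geometric charge is an exact integer
```

The integer property follows because the raw plaquette angles telescope to zero when summed over a periodic lattice, so reducing each to its principal value shifts the sum by an integer multiple of 2π.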
Adversarial Attacks and Defenses in Images, Graphs and Text: A Review
A systematic and comprehensive overview of the main threats of adversarial attacks and the success of corresponding countermeasures is given for the three most popular data types: images, graphs, and text.
Recent Developments in Gauge Theories
Almost all theories of fundamental interactions are nowadays based on the gauge concept. Starting with the historical example of quantum electrodynamics, we have been led to the successful unified…
Quantum Chromodynamics on a Lattice
The phenomenological description of hadrons in terms of quarks continues to be successful; the most recent advance was the description of the new particles as built from charmed quarks. Meanwhile…