Sampling using SU(N) gauge equivariant flows

Denis Boyda, Gurtej Kanwar, Sébastien Racanière, Danilo Jimenez Rezende, M. S. Albergo, Kyle Cranmer, Daniel C. Hackett, and Phiala Shanahan
We develop a flow-based sampling algorithm for $SU(N)$ lattice gauge theories that is gauge-invariant by construction. Our key contribution is constructing a class of flows on an $SU(N)$ variable (or on a $U(N)$ variable by a simple alternative) that respect matrix conjugation symmetry. We apply this technique to sample distributions of single $SU(N)$ variables and to construct flow-based samplers for $SU(2)$ and $SU(3)$ lattice gauge theory in two dimensions. 
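The matrix conjugation symmetry mentioned in the abstract can be illustrated with a small numerical check. This is not the paper's actual flow construction; it is a toy example (names and the choice of map are illustrative) showing what conjugation equivariance means: for any map built from matrix power series, such as h(X) = X², we have h(U X U†) = U h(X) U† for all unitary U.

```python
import numpy as np

np.random.seed(0)

def h(X):
    """A toy conjugation-equivariant map on matrices: h(X) = X @ X."""
    return X @ X

def random_su2():
    """Draw a Haar-random SU(2) matrix via QR decomposition of a complex Gaussian matrix."""
    z = (np.random.randn(2, 2) + 1j * np.random.randn(2, 2)) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    q = q @ np.diag(d / np.abs(d))        # fix phase ambiguity -> Haar on U(2)
    return q / np.linalg.det(q) ** 0.5    # rescale so det = 1, i.e. SU(2)

X, U = random_su2(), random_su2()
lhs = h(U @ X @ U.conj().T)               # transform first, then apply the map
rhs = U @ h(X) @ U.conj().T               # apply the map first, then transform
assert np.allclose(lhs, rhs)              # conjugation equivariance holds
```

An invertible flow with this property pushes forward conjugation-invariant densities to conjugation-invariant densities, which is why such maps are useful for gauge theory sampling.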
Gauge covariant neural network for 4 dimensional non-abelian gauge theory
We develop a gauge covariant neural network for four-dimensional non-abelian gauge theory, which realizes a map between rank-2 tensor valued vector fields. We find that the conventional smearing…
Self-learning Monte-Carlo for non-abelian gauge theory with dynamical fermions
In this paper, we develop the self-learning Monte-Carlo (SLMC) algorithm for non-abelian gauge theory with dynamical fermions in four dimensions to resolve the autocorrelation problem in lattice QCD.
Normalizing flows and the real-time sign problem
Normalizing flows have recently been applied to the problem of accelerating Markov chains in lattice field theory. We propose a generalization of normalizing flows that allows them to be applied to…
Temperature-steerable flows
Boltzmann generators approach the sampling problem in many-body physics by combining a normalizing flow and a statistical reweighting method to generate samples of a physical system's equilibrium…
Gauge Invariant Autoregressive Neural Networks for Quantum Lattice Models
A parallel version of the Celada–Seiden cellular automaton with real-time constraints is proposed, automating a process that is very labor-intensive, time-consuming, and expensive, and therefore difficult to carry out in the laboratory.
Preserving Properties of Neural Networks by Perturbative Updates
Deep learning applications in physics usually require neural network architectures that obey certain symmetries and equivariances. Retaining such mathematical properties during training with…
Training Neural Networks with Property-Preserving Parameter Perturbations
This work presents a novel, general approach of preserving matrix properties by using parameterized perturbations in lieu of directly optimizing the network parameters, and shows how such invertible blocks improve the mixing of coupling layers and thus the mode separation of the resulting normalizing flows.
Machine learning for sampling in lattice quantum field theory
Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Phiala E. Shanahan, and Julian M. Urban. Center for Cosmology and Particle Physics, New York University, New York…
Machine Learning and Neural Networks for Field Theory
The ability to efficiently sample from high-dimensional distributions remains a widely pursued goal across scientific disciplines, with some notable examples…
Nuclear matrix elements from lattice QCD for electroweak and beyond-Standard-Model processes
Over the last decade, numerical solutions of Quantum Chromodynamics (QCD) using the technique of lattice QCD have developed to a point where they are beginning to connect fundamental aspects of…


The Weyl Integration Formula
Let G be a compact, connected Lie group, and let T be a maximal torus. Theorem 16.5 implies that every conjugacy class meets T. Thus, we should be able to compute the Haar integral of a class…
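The formula alluded to in this snippet can be stated concretely. This is the standard Weyl integration formula (notation here follows common textbook conventions, not necessarily the exact form in the cited source): for a class function $f$ on a compact connected Lie group $G$ with maximal torus $T$ and Weyl group $W$,

```latex
\int_G f(g)\, dg \;=\; \frac{1}{|W|} \int_T f(t)\, \lvert \Delta(t) \rvert^2 \, dt ,
\qquad
\Delta(t) \;=\; \prod_{\alpha \in \Phi^+} \bigl( 1 - t^{-\alpha} \bigr),
```

where $\Phi^+$ denotes the positive roots and $dg$, $dt$ are the normalized Haar measures. For example, on $SU(2)$ with $t = \mathrm{diag}(e^{i\theta}, e^{-i\theta})$ this reduces to

```latex
\int_{SU(2)} f(g)\, dg \;=\; \frac{2}{\pi} \int_0^{\pi} f\!\left(\mathrm{diag}(e^{i\theta}, e^{-i\theta})\right) \sin^2\theta \; d\theta .
```

This is the tool that lets the flow-based samplers above reduce distributions of conjugation-invariant $SU(N)$ variables to distributions on the maximal torus.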
Trivializations for Gradient-Based Optimization on Manifolds
This work proves a formula for the gradient of the exponential of matrices, which can be of practical interest on its own, and shows how dynamic trivializations improve the performance of existing methods on standard tasks designed to test long-term memory within neural networks.
Two Dimensional Lattice Gauge Theory with and Without Fermion Content
Dissertation: Two Dimensional Lattice Gauge Theory With and Without Fermion Content, by Dibakar Sigdel, Florida International University, Miami, Florida, 2017; Rajamani Narayanan, Major Professor…
B-Spline CNNs on Lie Groups
A modular framework is proposed for the design and implementation of G-CNNs for arbitrary Lie groups that enables localized, atrous, and deformable convolutions in G-CNNs by means of respectively localized, sparse, and non-uniform B-spline expansions.
Cheap Orthogonal Constraints in Neural Networks: A Simple Parametrization of the Orthogonal and Unitary Group
A novel approach to perform first-order optimization with orthogonal and unitary constraints based on a parametrization stemming from Lie group theory through the exponential map is introduced, showing faster, more accurate, and more stable convergence in several tasks designed to test RNNs.
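The parametrization described in this entry can be sketched in a few lines. This is a minimal illustration, not the cited paper's implementation (function names are illustrative): an unconstrained matrix A is mapped to the skew-symmetric matrix A − Aᵀ, an element of the Lie algebra so(n), whose matrix exponential is always a rotation. First-order optimizers can then update A freely while the resulting matrix stays exactly orthogonal.

```python
import numpy as np
from scipy.linalg import expm

np.random.seed(0)

def orthogonal_from_free(A):
    """Map an arbitrary square matrix to a special orthogonal one via the Lie exponential."""
    skew = A - A.T        # skew-symmetric: an element of the Lie algebra so(n)
    return expm(skew)     # matrix exponential lands in SO(n)

A = np.random.randn(4, 4)               # unconstrained parameters
Q = orthogonal_from_free(A)
assert np.allclose(Q @ Q.T, np.eye(4))  # orthogonality holds by construction
assert np.isclose(np.linalg.det(Q), 1)  # determinant +1: a proper rotation
```

The same idea underlies the $SU(N)$ case: exponentiating traceless anti-Hermitian matrices yields special unitary matrices, so constraints never need to be re-imposed during training.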
Lattice methods for quantum chromodynamics
Contents: Continuum QCD and Its Phenomenology; Path Integration; Renormalization and the Renormalization Group; Yang-Mills Theory on the Lattice; Fermions on the Lattice; Numerical Methods for Bosons; Numerical…
Auto-Encoding Variational Bayes
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
On Estimation of Thermodynamic Observables in Lattice Field Theories with Deep Generative Models
It is shown that generative models can be used to estimate the absolute value of the free energy, in contrast to existing MCMC-based methods, which can only estimate free-energy differences.
  • 2020
A gauge redundancy-free formulation of compact QED with dynamical matter for quantum and classical computations
We introduce a way to express compact quantum electrodynamics with dynamical matter on two- and three-dimensional spatial lattices in a gauge redundancy-free manner while preserving translational…