Corpus ID: 231648295

Introduction to Normalizing Flows for Lattice Field Theory

@article{Albergo2021IntroductionTN,
  title={Introduction to Normalizing Flows for Lattice Field Theory},
  author={Michael S. Albergo and Denis Boyda and Daniel C. Hackett and Gurtej Kanwar and Kyle Cranmer and S{\'e}bastien Racani{\`e}re and Danilo Jimenez Rezende and Phiala E. Shanahan},
  year={2021}
}
Michael S. Albergo, Denis Boyda, Daniel C. Hackett, Gurtej Kanwar, Kyle Cranmer, Sébastien Racanière, Danilo Jimenez Rezende, and Phiala E. Shanahan
Center for Cosmology and Particle Physics, New York University, New York, NY 10003, USA; Argonne Leadership Computing Facility, Argonne National Laboratory, Lemont, IL 60439, USA; Center for Theoretical Physics, Massachusetts Institute of Technology, Cambridge, MA 02139, USA; The NSF AI Institute for Artificial Intelligence and…


Scaling Up Machine Learning For Quantum Field Theory with Equivariant Continuous Flows
A continuous normalizing flow is proposed for sampling from the high-dimensional probability distributions of quantum field theories; the flow is based on a shallow design, incorporates the symmetries of the problem, and systematically outperforms a RealNVP baseline in sampling efficiency.
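To make the "RealNVP baseline" concrete, here is a minimal NumPy sketch of one RealNVP affine coupling layer: half the variables are frozen and condition a scale/shift transform of the other half, which keeps the map exactly invertible with a cheap log-det Jacobian. The linear "networks" `s_w` and `t_w` are illustrative stand-ins, not code from any of the papers above.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                                           # total dimension; split in half
s_w = 0.1 * rng.normal(size=(d // 2, d // 2))   # toy "scale" network weights
t_w = 0.1 * rng.normal(size=(d // 2, d // 2))   # toy "shift" network weights

def forward(x):
    """x -> y, returning y and the log-determinant of the Jacobian."""
    x1, x2 = x[: d // 2], x[d // 2 :]
    s, t = x1 @ s_w, x1 @ t_w        # conditioner sees only the frozen half
    y2 = x2 * np.exp(s) + t          # elementwise affine transform
    return np.concatenate([x1, y2]), np.sum(s)

def inverse(y):
    y1, y2 = y[: d // 2], y[d // 2 :]
    s, t = y1 @ s_w, y1 @ t_w        # same conditioner: inversion is exact
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2])

x = rng.normal(size=d)
y, logdet = forward(x)
assert np.allclose(inverse(y), x)    # coupling layers invert by construction
```

Stacking such layers while alternating which half is frozen yields a deep, fully invertible model whose density is tractable via the accumulated log-det terms.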
Efficient Modelling of Trivializing Maps for Lattice $\phi^4$ Theory Using Normalizing Flows: A First Look at Scalability
General-purpose Markov chain Monte Carlo sampling algorithms suffer from a dramatic reduction in efficiency as the system being studied is driven towards a critical point through, for example, taking…
Gauge covariant neural network for 4 dimensional non-abelian gauge theory
We develop a gauge covariant neural network for four-dimensional non-abelian gauge theory, which realizes a map between rank-2 tensor valued vector fields. We find that the conventional smearing…
Normalizing flows for random fields in cosmology
Normalizing flows are a powerful tool for creating flexible probability distributions, with a wide range of potential applications in cosmology. Here we study normalizing flows which represent…
Deep generative modeling for probabilistic forecasting in power systems
Through comprehensive empirical evaluations using the open data of the Global Energy Forecasting Competition 2014, the methodology is demonstrated to be competitive with other state-of-the-art deep learning generative models: generative adversarial networks and variational autoencoders.
A Probabilistic Forecast-Driven Strategy for a Risk-Aware Participation in the Capacity Firming Market
This paper addresses the energy management of a grid-connected renewable generation plant coupled with a battery energy storage device in the capacity firming market, a market designed to promote renewable power generation in small non-interconnected grids, using a probabilistic forecast-driven strategy.
A deep generative model for probabilistic energy forecasting in power systems: normalizing flows
This paper introduces power systems forecasting practitioners to normalizing flows, a recent deep learning technique for producing accurate scenario-based probabilistic forecasts, which are crucial for facing the new challenges in power systems applications.


Gauge equivariant neural networks for quantum lattice gauge theories
Gauge equivariant neural-network quantum states are introduced which exactly satisfy the local Hilbert space constraints necessary for the description of quantum lattice gauge theory with a Z_d gauge group on different geometries.
Introduction to Quantum Fields on a Lattice
Contents: Preface; 1. Introduction; 2. Path integral and lattice regularisation; 3. O(n) models; 4. Gauge field on the lattice; 5. U(1) and SU(n) gauge theory; 6. Fermions on the lattice; 7. Low mass hadrons in QCD; …
Temperature-steerable flows
Boltzmann generators approach the sampling problem in many-body physics by combining a normalizing flow and a statistical reweighting method to generate samples of a physical system's equilibrium…
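The reweighting step mentioned above can be sketched in a few lines: draw samples from a tractable model density q, then attach importance weights proportional to exp(-S(x))/q(x) so that expectation values under the target Boltzmann distribution are estimated without bias. The quartic action and the Gaussian "model" below are toy stand-ins, not code from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def action(x):
    """Toy one-dimensional quartic action S(x)."""
    return 0.5 * x**2 + 0.1 * x**4

# Sample from a Gaussian model density q (stand-in for a trained flow).
mu, sigma = 0.0, 1.2
x = rng.normal(mu, sigma, size=100_000)
log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

# Unnormalized log importance weights: log p - log q with p ∝ exp(-S).
log_w = -action(x) - log_q
w = np.exp(log_w - log_w.max())    # subtract the max for numerical stability
w /= w.sum()

# Reweighted (self-normalized) estimate of <x^2> under the target distribution.
x2_target = np.sum(w * x**2)
```

The same self-normalized estimator carries over when q is a normalizing flow: log q is then the base density plus the flow's log-det Jacobian terms.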
Quantum Chromodynamics on a Lattice
The phenomenological description of hadrons in terms of quarks continues to be successful; the most recent advance was the description of the new particles as built from charmed quarks. Meanwhile…
Neural Ordinary Differential Equations on Manifolds
It is shown how vector fields provide a general framework for parameterizing a flexible class of invertible mappings on these spaces, and it is illustrated how gradient-based learning can be performed.
Lattice gauge equivariant convolutional neural networks
It is demonstrated that L-CNNs can learn and generalize gauge invariant quantities that traditional convolutional neural networks are incapable of finding.
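As a concrete illustration of the kind of gauge invariant quantity involved, the following NumPy snippet checks that the U(1) plaquette on a small 2D periodic lattice is unchanged under a random gauge transformation. This is a generic textbook construction, not code from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 4
# U(1) link angles theta_mu(x), shape (mu, x, y), mu in {0, 1}.
theta = rng.uniform(0, 2 * np.pi, size=(2, L, L))

def plaquette_sum(theta):
    """Sum of Re P(x) over all sites, with periodic boundaries."""
    t0, t1 = theta
    # Angle around the elementary square based at each site:
    # theta_0(x) + theta_1(x+0) - theta_0(x+1) - theta_1(x)
    ang = t0 + np.roll(t1, -1, axis=0) - np.roll(t0, -1, axis=1) - t1
    return np.cos(ang).sum()

# Random gauge transformation: theta_mu(x) -> alpha(x) + theta_mu(x) - alpha(x+mu).
alpha = rng.uniform(0, 2 * np.pi, size=(L, L))
theta_g = np.stack([
    alpha + theta[0] - np.roll(alpha, -1, axis=0),
    alpha + theta[1] - np.roll(alpha, -1, axis=1),
])
assert np.isclose(plaquette_sum(theta), plaquette_sum(theta_g))
```

All the alpha terms cancel around the closed loop, which is exactly the invariance that gauge equivariant architectures are built to respect by construction.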
Gauge Equivariant Convolutional Networks and the Icosahedral CNN
Gauge equivariant convolution using a single conv2d call is demonstrated, making it a highly scalable and practical alternative to Spherical CNNs, and yielding substantial improvements over previous methods on the tasks of segmenting omnidirectional images and global climate patterns.
Simulation of $\phi^4$ theory in the strong coupling expansion beyond the Ising limit
This thesis deals with the simulation of $\phi^4$ theory using the worm algorithm, a simulation method which has proven very efficient for the study of critical systems.…
How (not) to Train your Generative Model: Scheduled Sampling, Likelihood, Adversary?
This paper presents a critique of scheduled sampling, a state-of-the-art training method that contributed to the winning entry in the MSCOCO image captioning benchmark in 2015, and gives the first theoretical analysis that explains why adversarial training tends to produce samples with higher perceived quality.