Learning Lattice Quantum Field Theories with Equivariant Continuous Flows

@article{Gerdes2022LearningLQ,
  title={Learning Lattice Quantum Field Theories with Equivariant Continuous Flows},
  author={Mathis Gerdes and Pim de Haan and Corrado Rainone and Roberto Bondesan and Miranda C N Cheng},
  journal={ArXiv},
  year={2022},
  volume={abs/2207.00283}
}
We propose a novel machine learning method for sampling from the high-dimensional probability distributions of Lattice Quantum Field Theories. Instead of the deep architectures used so far for this task, our proposal is based on a single neural ODE layer and incorporates the full symmetries of the problem. We test our model on the φ⁴ theory, showing that it systematically outperforms previously proposed flow-based methods in sampling efficiency, and the improvement is especially pronounced for…
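For orientation, here is a minimal numerical sketch of the continuous-flow idea, assuming the standard hopping-parameter form of the φ⁴ lattice action; the velocity field v is a hand-picked Z₂-odd toy stand-in for the paper's learned equivariant network, and all coupling values are illustrative rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi4_action(phi, kappa=0.3, lam=0.02):
    # Hopping-parameter form: S = sum_x [ -2*kappa*sum_mu phi_x phi_{x+mu}
    #                                     + (1 - 2*lam)*phi_x^2 + lam*phi_x^4 ]
    hop = sum((phi * np.roll(phi, -1, axis=d)).sum() for d in range(phi.ndim))
    return -2 * kappa * hop + ((1 - 2 * lam) * phi**2 + lam * phi**4).sum()

def flow_sample(shape=(8, 8), n_steps=32, theta=0.1):
    """Integrate dphi/dt = v(phi) with Euler steps, tracking the model
    log-density via d(log q)/dt = -div v (instantaneous change of variables).
    Here v(phi) = -theta*phi**3 is a toy Z2-odd field with closed-form
    divergence; the paper instead learns v with an equivariant network."""
    phi = rng.normal(size=shape)                       # sample the Gaussian base
    logq = -0.5 * (phi**2).sum() - 0.5 * phi.size * np.log(2 * np.pi)
    dt = 1.0 / n_steps
    for _ in range(n_steps):
        logq -= dt * (-3 * theta * phi**2).sum()       # -div v, exact here
        phi = phi + dt * (-theta * phi**3)             # Euler update
    return phi, logq                                   # usable for reweighting

phi, logq = flow_sample()
print(phi4_action(phi), logq)
```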

Citations

Geometrical aspects of lattice gauge equivariant convolutional neural networks

It is demonstrated how L-CNNs can be equipped with global group equivariance, which makes the formulation equivariant not only under translations but also under global lattice symmetries such as rotations and reflections.

Learning Deformation Trajectories of Boltzmann Densities

A training objective for continuous normalizing flows is introduced that can be used in the absence of samples but in the presence of an energy function; it is evaluated via the reverse KL-divergence on Gaussian mixtures and on the φ⁴ lattice field theory on a circle.
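Concretely, the sample-free objective referred to here is the reverse KL divergence, which requires only the action S and not samples from the target (a standard identity, not specific to this paper):

```latex
\mathrm{KL}(q_\theta \,\|\, p)
  = \mathbb{E}_{x \sim q_\theta}\!\left[\log q_\theta(x) - \log p(x)\right]
  = \mathbb{E}_{x \sim q_\theta}\!\left[\log q_\theta(x) + S(x)\right] + \log Z .
```

Since log Z does not depend on θ, it drops out of the gradient, so the objective can be minimized using samples drawn from the model alone.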

Aspects of scaling and scalability for flow-based sampling of lattice QCD

Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing. However, …

Deformation Theory of Boltzmann Distributions

Consider a one-parameter family of Boltzmann distributions p_t(x) = e^{−S_t(x)}/Z_t. In this paper we study the problem of sampling from p_{t₀} by first sampling from p_{t₁} and then applying a…
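A minimal concrete instance of such a family is the linear interpolation of actions (an illustrative choice, not necessarily the paper's):

```latex
S_t = (1-t)\,S_0 + t\,S_1, \qquad
\frac{p_{t_0}(x)}{p_{t_1}(x)} \;\propto\; e^{\,S_{t_1}(x) - S_{t_0}(x)} ,
```

so samples from p_{t₁} can in principle be reweighted towards p_{t₀}; the weights degenerate quickly as the actions separate, which motivates learning an explicit transport map instead.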

Deformations of Boltzmann Distributions

An equation relating Ψ and the corresponding family of unnormalized log-likelihoods S_t is derived, and it is demonstrated that normalizing flows perform better at learning the Boltzmann distribution p_τ than at learning p_0.
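A relation of this kind can be recovered from the continuity equation for a flow field Ψ that transports p_t (a sketch of the standard derivation; sign conventions may differ from the paper's):

```latex
\partial_t p_t + \nabla\!\cdot\!\left(p_t \Psi\right) = 0, \quad
p_t = \tfrac{1}{Z_t} e^{-S_t}
\;\;\Longrightarrow\;\;
\partial_t S_t = \nabla\!\cdot\Psi \;-\; \nabla S_t \cdot \Psi \;-\; \partial_t \log Z_t ,
```

where the last term is independent of x and fixes the normalization.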

Stochastic normalizing flows for lattice field theory

Stochastic normalizing flows are a class of deep generative models that combine normalizing flows with Monte Carlo updates and can be used in lattice field theory to sample from Boltzmann distributions.
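The Monte Carlo correction step such hybrids rely on can be sketched as an independence-Metropolis chain driven by flow proposals (a generic sketch; `propose` and `action` are placeholder names, not this paper's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def independence_metropolis(n, propose, action):
    """`propose()` returns a candidate configuration x and its model
    log-density log q(x); acceptance uses the importance weight
    w(x) = exp(-S(x)) / q(x), which makes the chain exact for the target."""
    x, logq = propose()
    logw = -action(x) - logq
    chain, accepted = [x], 0
    for _ in range(n - 1):
        y, logq_y = propose()
        logw_y = -action(y) - logq_y
        if np.log(rng.uniform()) < logw_y - logw:   # accept with min(1, w_y/w_x)
            x, logw = y, logw_y
            accepted += 1
        chain.append(x)
    return chain, accepted / (n - 1)

# toy usage: standard-normal proposals targeting exp(-x^4)
def propose():
    z = rng.normal()
    return z, -0.5 * z**2 - 0.5 * np.log(2 * np.pi)

chain, rate = independence_metropolis(1000, propose, action=lambda x: x**4)
print(rate)
```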

References

SHOWING 1-10 OF 35 REFERENCES

Efficient modeling of trivializing maps for lattice ϕ⁴ theory using normalizing flows: A first look at scalability

The central idea is to use machine learning techniques to build (approximate) trivializing maps, i.e. field transformations that map the theory of interest into a ‘simpler’ theory in which the degrees of freedom decouple.

Phys. Rev. D 104, 094507 (2021)
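The defining property of such a map F can be stated through the change of variables for the action (a standard relation; the notation here is mine, not the paper's):

```latex
\phi = F(\chi):\qquad
\tilde{S}(\chi) = S\!\left(F(\chi)\right)
  - \log\left|\det \frac{\partial F(\chi)}{\partial \chi}\right| ,
```

and F is trivializing when the effective action S̃ is (approximately) that of a free or decoupled theory, so that χ is easy to sample directly.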

Neural Ordinary Differential Equations

This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
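The key construction is the adjoint sensitivity method: gradients are obtained by solving a second ODE backwards in time rather than by differentiating through the solver's internals (equations as in Chen et al., 2018):

```latex
\frac{dz}{dt} = f(z, t, \theta), \qquad
a(t) \equiv \frac{\partial L}{\partial z(t)}, \qquad
\frac{da}{dt} = -\,a(t)^{\top} \frac{\partial f}{\partial z}, \qquad
\frac{dL}{d\theta} = -\int_{t_1}^{t_0} a(t)^{\top} \frac{\partial f}{\partial \theta}\, dt .
```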

Gauge covariant neural network for 4 dimensional non-abelian gauge theory

It is found that the smearing procedure can be regarded as an extended version of residual neural networks with fixed parameters, and a self-learning hybrid Monte Carlo algorithm for two-color QCD is developed whose results are consistent with those of standard hybrid Monte Carlo.
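The residual-network analogy can be made explicit with stout smearing (Morningstar–Peardon), whose link update has the "identity plus perturbation" structure of a residual block (a sketch; conventions vary):

```latex
U_\mu(x) \;\longmapsto\; e^{\,Q_\mu(x)}\, U_\mu(x)
  \;\approx\; U_\mu(x) + Q_\mu(x)\, U_\mu(x) + \dots ,
```

with Q_μ(x) anti-Hermitian and traceless, built from neighbouring staples: a residual-style update with fixed rather than learned parameters.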

Introduction to Normalizing Flows for Lattice Field Theory

These notes provide a pedagogical introduction to machine-learned normalizing flows for sampling in lattice field theory, building up from statistical foundations to applications in scalar field theory and gauge theory, with accompanying code examples.


Self-learning Monte Carlo for non-Abelian gauge theory with dynamical fermions

In this paper, we develop the self-learning Monte Carlo (SLMC) algorithm for non-abelian gauge theory with dynamical fermions in four dimensions to resolve the autocorrelation problem in lattice QCD.
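In SLMC, proposals are generated cheaply with a learned effective action S_eff and then corrected exactly against the true action S; the standard acceptance probability is:

```latex
A(x \to x') = \min\!\left(1,\;
  e^{-S(x') + S(x)}\; e^{\,S_{\mathrm{eff}}(x') - S_{\mathrm{eff}}(x)} \right),
```

which restores detailed balance with respect to e^{−S} regardless of how accurate S_eff is.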

Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains

An approach for selecting problem-specific Fourier features is suggested that greatly improves the performance of MLPs on the low-dimensional regression tasks relevant to the computer vision and graphics communities.
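The feature map itself is simple: input coordinates are projected onto random frequencies before being fed to the MLP (a minimal sketch of the Tancik et al. mapping; the scale 10.0 is an arbitrary example value):

```python
import numpy as np

rng = np.random.default_rng(0)

def fourier_features(x, B):
    """gamma(x) = [cos(2*pi*Bx), sin(2*pi*Bx)]; rows of B are random
    frequencies, e.g. drawn from N(0, sigma^2)."""
    proj = 2 * np.pi * x @ B.T
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

# usage: lift 2-D coordinates to 128 features before an MLP
B = rng.normal(scale=10.0, size=(64, 2))
coords = rng.uniform(size=(5, 2))
print(fourier_features(coords, B).shape)   # (5, 128)
```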

Equivariant Flows: exact likelihood generative learning for symmetric densities

This work provides a sufficient theoretical criterion showing that the distribution generated by an equivariant normalizing flow is invariant with respect to the underlying symmetries by design, and proposes symmetry-preserving building blocks for flows targeting the symmetries typically found in physical and chemical many-body particle systems.
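The criterion can be sketched as follows: for a group element g acting linearly with |det g| = 1, a G-invariant base density q₀, and a flow f that is G-equivariant (so f⁻¹(g·x) = g·f⁻¹(x) and the Jacobian determinant is g-independent), the pushforward density is G-invariant:

```latex
q(g \cdot x)
  = q_0\!\left(f^{-1}(g \cdot x)\right)\left|\det \partial f^{-1}(g \cdot x)\right|
  = q_0\!\left(g \cdot f^{-1}(x)\right)\left|\det \partial f^{-1}(x)\right|
  = q(x).
```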
