# Scaling Up Machine Learning For Quantum Field Theory with Equivariant Continuous Flows

```bibtex
@article{Haan2021ScalingUM,
  title   = {Scaling Up Machine Learning For Quantum Field Theory with Equivariant Continuous Flows},
  author  = {Pim de Haan and Corrado Rainone and Miranda C. N. Cheng and Roberto Bondesan},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2110.02673}
}
```

We propose a continuous normalizing flow for sampling from the high-dimensional probability distributions of Quantum Field Theories in Physics. In contrast to the deep architectures used so far for this task, our proposal is based on a shallow design and incorporates the symmetries of the problem. We test our model on the φ⁴ theory, showing that it systematically outperforms a Real NVP baseline in sampling efficiency, with the difference between the two increasing for larger lattices. On the…
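A minimal sketch (not the paper's code) of the sampling target described in the abstract: the lattice φ⁴ action, and the importance weights that measure how well an approximate model density q matches the Boltzmann distribution e^{−S}/Z. The coupling values `m2` and `lam` and the stand-in `log_q` are illustrative assumptions.

```python
# Hypothetical illustration of the phi^4 sampling setup; couplings m2, lam
# and the Gaussian stand-in for the flow's log-density are NOT from the paper.
import numpy as np

def phi4_action(phi, m2=-4.0, lam=8.0):
    """Euclidean phi^4 action on a 2D periodic lattice; phi has shape (L, L)."""
    kinetic = 0.0
    for axis in (0, 1):
        kinetic += np.sum((phi - np.roll(phi, 1, axis=axis)) ** 2)
    potential = np.sum(m2 * phi**2 + lam * phi**4)
    return kinetic + potential

def importance_log_weights(phi_batch, log_q):
    """log w = -S(phi) - log q(phi), up to the unknown log Z."""
    log_w = np.array([-phi4_action(p) for p in phi_batch]) - log_q
    return log_w - log_w.max()  # stabilize before exponentiating

rng = np.random.default_rng(0)
batch = rng.normal(size=(4, 8, 8))
log_q = -0.5 * np.sum(batch**2, axis=(1, 2))  # stand-in for a model's log-density
w = np.exp(importance_log_weights(batch, log_q))
ess = w.sum() ** 2 / (w**2).sum()  # effective sample size, a common efficiency metric
```

Note that the action is invariant under the global Z₂ flip φ → −φ, which is the kind of symmetry an equivariant flow is built to respect.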

## 11 Citations

### Learning Trivializing Gradient Flows for Lattice Gauge Theories

- Computer Science
- 2022

A unifying approach that starts from the perturbative construction of trivializing maps by Lüscher and then improves on it by learning, providing a plausible path for scaling machine-learning approaches toward realistic theories.

### Stochastic normalizing flows as non-equilibrium transformations

- Computer Science
- Journal of High Energy Physics
- 2022

This work shows that the theoretical framework of stochastic normalizing flows, in which neural-network layers are combined with Monte Carlo updates, is the same that underlies out-of-equilibrium simulations based on Jarzynski’s equality, which have been recently deployed to compute free-energy differences in lattice gauge theories.

### Deformation Theory of Boltzmann Distributions

- Mathematics
- ArXiv
- 2022

Consider a one-parameter family of Boltzmann distributions $p_t(x) = \frac{1}{Z_t} e^{-S_t(x)}$. In this paper we study the problem of sampling from $p_{t_0}$ by first sampling from $p_{t_1}$ and then applying a…
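A toy 1D instance of the setup in this abstract, under illustrative assumptions: take $S_t(x) = x^2 / (2\sigma(t)^2)$, so each $p_t$ is a Gaussian whose width follows a hypothetical schedule $\sigma(t)$, and the exact transport map between two members of the family is a simple rescaling.

```python
# Illustrative example only: a Gaussian family where the transport map
# between p_{t1} and p_{t0} is known in closed form (a rescaling).
import numpy as np

def sigma(t):
    return 1.0 + t  # hypothetical width schedule

t0, t1 = 0.0, 1.0
rng = np.random.default_rng(1)
x1 = rng.normal(scale=sigma(t1), size=100_000)  # samples from p_{t1}
x0 = (sigma(t0) / sigma(t1)) * x1               # exact transport to p_{t0}
```

For non-Gaussian families no such closed form exists, which is what motivates learning the deformation.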

### Applications of Machine Learning to Lattice Quantum Field Theory

- Physics
- ArXiv
- 2022

There is great potential to apply machine learning in the area of numerical lattice quantum field theory

### Machine Learning of Thermodynamic Observables in the Presence of Mode Collapse

- Physics
- Proceedings of The 38th International Symposium on Lattice Field Theory — PoS(LATTICE2021)
- 2022

Kim A. Nicoli, Christopher J. Anders, Lena Funcke, Tobias Hartung, Karl Jansen, Pan Kessel, Shinichi Nakajima and Paolo Stornati, Technische Universität Berlin, Machine Learning Group, Marchstrasse…

### Path-Gradient Estimators for Continuous Normalizing Flows

- Computer Science
- ICML
- 2022

This work proposes a path-gradient estimator for the considerably more expressive variational family of continuous normalizing flows and outlines an efficient algorithm to calculate this estimator and establishes its superior performance empirically.

### Neural Simulated Annealing

- Business, Computer Science
- ArXiv
- 2022

This work views SA from a reinforcement learning perspective and frames the proposal distribution as a policy that can be optimised for higher solution quality given a fixed computational budget, and demonstrates that Neural SA with such a learnt proposal distribution, parametrised by small equivariant neural networks, outperforms SA baselines on a number of problems.
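The framing in this abstract can be sketched as ordinary simulated annealing with the proposal step isolated as a pluggable "policy". Everything below is an illustrative toy (a fixed random bit-flip proposal and a bit-count objective), not the paper's method; the proposal is exactly the component one would replace with a learned equivariant network.

```python
# Toy simulated annealing with a swappable proposal "policy".
# The bit-flip proposal and the schedule are illustrative assumptions.
import numpy as np

def simulated_annealing(energy, propose, x0, betas, rng):
    x, e = x0, energy(x0)
    for beta in betas:                       # annealing schedule (rising beta)
        x_new = propose(x, rng)              # the "policy" step
        e_new = energy(x_new)
        if np.log(rng.uniform()) < -beta * (e_new - e):  # Metropolis rule
            x, e = x_new, e_new
    return x, e

energy = lambda x: int(x.sum())                                 # minimize number of 1-bits
propose = lambda x, r: np.logical_xor(x, r.random(x.size) < 0.1)  # flip each bit w.p. 0.1
rng = np.random.default_rng(0)
x0 = np.ones(32, dtype=bool)
betas = np.linspace(0.1, 5.0, 2000)
x_best, e_best = simulated_annealing(energy, propose, x0, betas, rng)
```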

### Amortized Bayesian Inference of GISAXS Data with Normalizing Flows

- Computer Science
- ArXiv
- 2022

This work proposes a simulation-based framework that combines variational auto-encoders and normalizing flows to estimate the posterior distribution of object parameters given GISAXS data, and demonstrates that this method reduces the inference cost by orders of magnitude while producing results consistent with ABC.

### Learning Deformation Trajectories of Boltzmann Densities

- Computer Science
- ArXiv
- 2023

A training objective for continuous normalizing flows that can be used in the absence of samples but in the presence of an energy function is introduced, and the reverse KL-divergence is compared on Gaussian mixtures and on the φ⁴ lattice field theory on a circle.
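A minimal sketch of the sample-free (reverse-KL) idea this abstract refers to: when only an energy $S$ is available, a flow $f$ with base density $q_0$ can be trained to minimize $\mathbb{E}_{z \sim q_0}[\,S(f(z)) - \log|\det \partial f/\partial z|\,]$, which equals $\mathrm{KL}(q \,\|\, p)$ up to the constant $\log Z$. The one-parameter flow $f(z) = a z$ and target $S(x) = x^2/2$ below are illustrative assumptions; the optimum is $a = 1$.

```python
# Toy reverse-KL training of a one-parameter "flow" f(z) = a*z against
# S(x) = x^2/2, using the analytic gradient of the objective.
import numpy as np

rng = np.random.default_rng(0)
a, lr = 3.0, 0.05
for _ in range(500):
    z = rng.normal(size=4096)
    # d/da E[ S(a*z) - log|a| ] = a * E[z^2] - 1/a
    grad = a * np.mean(z**2) - 1.0 / a
    a -= lr * grad
```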

### Deformations of Boltzmann Distributions

- Computer Science
- 2022

An equation relating Ψ and the corresponding family of unnormalized log-likelihoods $S_t$ is derived, and it is demonstrated that normalizing flows perform better at learning the Boltzmann distribution $p_\tau$ than at learning $p_0$.

## References

Showing 1–10 of 30 references

### On Estimation of Thermodynamic Observables in Lattice Field Theories with Deep Generative Models

- Computer Science
- Physical Review Letters
- 2021

It is shown that generative models can be used to estimate the absolute value of the free energy, which is in contrast to existing MCMC-based methods, which are limited to only estimate free energy differences.

### Equivariant flow-based sampling for lattice gauge theory

- Physics
- Physical Review Letters
- 2020

We define a class of machine-learned flow-based sampling algorithms for lattice gauge theories that are gauge invariant by construction. We demonstrate the application of this framework to U(1) gauge…

### Provably efficient machine learning for quantum many-body problems

- Computer Science
- Science
- 2022

It is proved that classical ML algorithms can efficiently predict ground-state properties of gapped Hamiltonians after learning from other Hamiltonians in the same quantum phase of matter, under a widely accepted conjecture.

### Efficient modeling of trivializing maps for lattice ϕ4 theory using normalizing flows: A first look at scalability

- Computer Science
- Physical Review D
- 2021

The central idea is to use machine learning techniques to build (approximate) trivializing maps, i.e. field transformations that map the theory of interest into a ‘simpler’ theory in which the degrees of freedom decouple.

### Equivariant Flows: exact likelihood generative learning for symmetric densities

- Physics
- ICML
- 2020

This work provides a theoretical sufficient criterion showing that the distribution generated by equivariant normalizing flows is invariant with respect to these symmetries by design, and proposes building blocks for flows which preserve symmetry which are usually found in physical/chemical many-body particle systems.

### Gauge covariant neural network for 4 dimensional non-abelian gauge theory

- Computer Science
- 2021

A gauge covariant neural network for four dimensional non-abelian gauge theory, which realizes a map between rank-2 tensor valued vector fields, and the smeared force in hybrid Monte Carlo (HMC) is naturally derived with the backpropagation.

### Flow-based generative models for Markov chain Monte Carlo in lattice field theory

- Computer Science
- Physical Review D
- 2019

A Markov chain update scheme using a machine-learned flow-based generative model is proposed for Monte Carlo sampling in lattice field theories and is compared with HMC and local Metropolis sampling for ϕ4 theory in two dimensions.
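The update scheme summarized above can be sketched as an independence-Metropolis step: proposals are drawn i.i.d. from a trained generative model $q$ and accepted with probability $\min\!\big(1, \frac{p(x')\,q(x)}{p(x)\,q(x')}\big)$, which keeps the chain exact even when $q$ only approximates the target $p$. The 1D Gaussian densities below are stand-ins for a flow and its lattice target, not the paper's setup.

```python
# Independence-Metropolis sampler with a model proposal (stand-in densities).
import numpy as np

def metropolis_independence(log_p, sample_q, log_q, n_steps, rng):
    x, lp, lq = None, -np.inf, 0.0   # first proposal is always accepted
    chain, n_acc = [], 0
    for _ in range(n_steps):
        x_new = sample_q(rng)
        lp_new, lq_new = log_p(x_new), log_q(x_new)
        # log acceptance ratio: [log p(x') - log p(x)] - [log q(x') - log q(x)]
        if np.log(rng.uniform()) < (lp_new - lp) - (lq_new - lq):
            x, lp, lq = x_new, lp_new, lq_new
            n_acc += 1
        chain.append(x)
    return chain, n_acc / n_steps

rng = np.random.default_rng(0)
log_p = lambda x: -0.5 * x**2                          # target: standard normal (unnormalized)
sample_q = lambda r: r.normal(scale=1.2)               # stand-in for a trained model
log_q = lambda x: -0.5 * (x / 1.2) ** 2 - np.log(1.2)  # its log-density (unnormalized const dropped consistently)
chain, acc_rate = metropolis_independence(log_p, sample_q, log_q, 20_000, rng)
```

The acceptance rate is the usual diagnostic here: the closer $q$ is to $p$, the closer it gets to 1.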

### Flow-based sampling for fermionic lattice field theories

- Physics
- Physical Review D
- 2021

This work extends flow-based sampling algorithms to lattice field theories with dynamical fermions, incorporating the fermionic contributions to the probability density into the generative model.

### Reducing autocorrelation times in lattice simulations with generative adversarial networks

- Computer Science
- Mach. Learn. Sci. Technol.
- 2020

This work works with a generative adversarial network (GAN) and proposes to address difficulties regarding its statistical exactness through the implementation of an overrelaxation step, by searching the latent space of the trained generator network.

### Asymptotically unbiased estimation of physical observables with neural samplers.

- Computer Science, Mathematics
- Physical Review E
- 2020

This framework presents asymptotically unbiased estimators for generic observables, including those that explicitly depend on the partition function such as free energy or entropy, derives corresponding variance estimators, and demonstrates their practical applicability with numerical experiments for the two-dimensional Ising model.
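A sketch of the estimator family this abstract describes, on an illustrative toy target: given i.i.d. samples $x_i$ from a neural sampler with known density $q$, the importance weights $w_i = e^{-S(x_i)}/q(x_i)$ give an unbiased estimate of the partition function $Z$, and hence of the absolute free energy $F = -\log Z$. Here $S(x) = x^2/2$ (so $Z = \sqrt{2\pi}$) and a wide Gaussian stands in for the sampler; both are assumptions for illustration.

```python
# Toy absolute-free-energy estimate via importance weights from a
# tractable-density sampler (Gaussian stand-in, NOT a neural network).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(scale=1.3, size=200_000)                    # samples from the "sampler" q
log_q = -0.5 * (x / 1.3) ** 2 - np.log(1.3 * np.sqrt(2 * np.pi))
log_w = -0.5 * x**2 - log_q                                # log of e^{-S} / q
log_Z = np.logaddexp.reduce(log_w) - np.log(x.size)        # log-mean-exp of log_w
F = -log_Z                                                 # estimate of F = -log Z
```

Choosing $q$ wider than the target keeps the weights bounded, which is what makes the estimator well behaved here.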