Neural Network Field Transformation and Its Application in HMC

@article{Jin2022NeuralNF,
  title={Neural Network Field Transformation and Its Application in HMC},
  author={Xiao-Yong Jin},
  journal={Proceedings of The 38th International Symposium on Lattice Field Theory — PoS(LATTICE2021)},
  year={2022}
}
  • Xiao-Yong Jin
  • Published 5 January 2022
  • Computer Science, Physics
  • Proceedings of The 38th International Symposium on Lattice Field Theory — PoS(LATTICE2021)
We propose a generic construction of Lie-group-agnostic and gauge-covariant neural networks, and introduce constraints that make the neural networks continuously differentiable and invertible. We combine such neural networks to build gauge field transformations suitable for Hybrid Monte Carlo (HMC). We use HMC to sample lattice gauge configurations in the space transformed by the neural-network-parameterized gauge field transformations. Tested with 2D U(1) pure gauge systems at a range of…
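To make the transformed-space sampling concrete, here is a minimal toy sketch (assumed for illustration, not the paper's code) of the Jacobian bookkeeping behind field-transformation HMC on 2D U(1) link angles; a hypothetical per-link scalar map F stands in for the paper's gauge covariant, invertible neural network, and the lattice size, coupling, and function names are illustrative.

    # Toy sketch of field-transformation HMC bookkeeping (not the paper's code).
    import numpy as np

    L = 8                                            # toy lattice extent
    rng = np.random.default_rng(0)

    def plaquette_action(theta, beta=2.0):
        """Wilson action S = beta * sum_x (1 - cos(theta_P(x))) for 2D U(1)."""
        theta_p = (theta[0]
                   + np.roll(theta[1], -1, axis=0)   # U_y(x + x_hat)
                   - np.roll(theta[0], -1, axis=1)   # U_x(x + y_hat)^dagger
                   - theta[1])                       # U_y(x)^dagger
        return beta * np.sum(1.0 - np.cos(theta_p))

    def F(chi, eps=0.3):
        """Toy invertible, continuously differentiable map: F'(chi) = 1 + eps*cos(chi) > 0."""
        return chi + eps * np.sin(chi)

    def log_det_jacobian(chi, eps=0.3):
        """log|det dF/dchi| for this diagonal (per-link) toy map."""
        return np.sum(np.log(1.0 + eps * np.cos(chi)))

    def effective_action(chi):
        # HMC samples chi with S_eff(chi) = S(F(chi)) - log|det dF/dchi|,
        # so that theta = F(chi) is distributed according to exp(-S(theta)).
        return plaquette_action(F(chi)) - log_det_jacobian(chi)

    chi = rng.uniform(-np.pi, np.pi, size=(2, L, L)) # transformed-space link angles
    print("S_eff(chi) =", effective_action(chi))

In the paper's setting the map would additionally be gauge covariant and its log-Jacobian would be computed through the network, but the acceptance test would still use the same effective action $S(F(\chi)) - \ln|\det \partial F/\partial\chi|$.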

Applications of Machine Learning to Lattice Quantum Field Theory
Denis Boyda, Salvatore Calì, Sam Foreman, Lena Funcke, Daniel C. Hackett, Yin Lin, Gert Aarts, Andrei Alexandru, Xiao-Yong Jin, Biagio Lucini, and Phiala E. Shanahan

References

Showing 1–10 of 13 references
Comparison of topology changing update algorithms
TLDR: This work compares the viability of multiple less commonly used algorithms (metadynamics, instanton updates, and multiscale thermalization) with respect to proper sampling of all topological sectors in the Schwinger model, with the prospect of applying these methods to 4-dimensional SU(3) simulations.
Gauge covariant neural network for 4 dimensional non-abelian gauge theory
TLDR: A gauge covariant neural network for four-dimensional non-abelian gauge theory, which realizes a map between rank-2 tensor-valued vector fields; the smeared force for hybrid Monte Carlo (HMC) is naturally derived via backpropagation.
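For orientation (the standard covariance condition, not a quotation from that paper): such a network maps the links $U_\mu(x)$ to smeared links $W_\mu(x)$ that transform under a gauge transformation $\Omega(x)$ exactly as the original links do,
$$ W_\mu(x) \;\to\; \Omega(x)\, W_\mu(x)\, \Omega^\dagger(x+\hat\mu), $$
so any closed Wilson loop built from $W$ remains gauge invariant.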
Sampling using SU(N) gauge equivariant flows
TLDR: A flow-based sampling algorithm for lattice gauge theories that is gauge-invariant by construction is developed, together with a class of flows on an SU(N) variable that respect matrix conjugation symmetry.
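As a point of reference (a standard statement of the symmetry, assumed here rather than quoted): a map $h$ on a single SU(N) variable respects matrix conjugation symmetry if
$$ h(X U X^{-1}) = X\, h(U)\, X^{-1} \quad \text{for all } X \in SU(N), $$
which in practice lets such kernels act on the eigenvalues of $U$.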
Gaussian Error Linear Units (GELUs)
TLDR: An empirical evaluation of the GELU nonlinearity against the ReLU and ELU activations is performed and performance improvements are found across all considered computer vision, natural language processing, and speech tasks.
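For completeness, the standard definition (not specific to this page): $\mathrm{GELU}(x) = x\,\Phi(x)$ with $\Phi$ the standard normal CDF, often used in the tanh approximation
$$ \mathrm{GELU}(x) \approx 0.5\,x\left(1 + \tanh\!\left[\sqrt{2/\pi}\,\bigl(x + 0.044715\,x^{3}\bigr)\right]\right). $$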
Hybrid Monte Carlo
Topological susceptibility of two-dimensional U(N) gauge theories
In this paper we study the topological susceptibility of two-dimensional $U(N)$ gauge theories. We provide explicit expressions for the partition function and the topological susceptibility at finite…
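For context, the standard lattice definitions assumed in such studies (not quoted from that paper): the geometric charge $Q = \frac{1}{2\pi}\sum_x \operatorname{Im}\ln P(x)$, with $P(x)$ the plaquette at site $x$, and the topological susceptibility $\chi_t = \langle Q^2 \rangle / V$; the slow decorrelation of $Q$ under HMC is the sampling problem that field transformations aim to alleviate.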
Adam: A Method for Stochastic Optimization
TLDR: This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
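For reference, the update rule (standard form, written here from memory): with gradient $g_t$, decay rates $\beta_1, \beta_2$, step size $\alpha$, and a small $\epsilon$,
$$ m_t = \beta_1 m_{t-1} + (1-\beta_1)\,g_t, \qquad v_t = \beta_2 v_{t-1} + (1-\beta_2)\,g_t^2, $$
$$ \hat m_t = \frac{m_t}{1-\beta_1^t}, \qquad \hat v_t = \frac{v_t}{1-\beta_2^t}, \qquad \theta_t = \theta_{t-1} - \alpha\,\frac{\hat m_t}{\sqrt{\hat v_t} + \epsilon}. $$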
Trivializing maps, the Wilson flow and the HMC algorithm
In lattice gauge theory, there exist field transformations that map the theory to the trivial one, where the basic field variables are completely decoupled from one another. Such maps can be…
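To spell out the connection assumed throughout this page: for an invertible change of variables $U = \mathcal{F}(V)$, HMC in the new variables samples the effective action
$$ \tilde S(V) = S(\mathcal{F}(V)) - \ln\det \mathcal{F}_*(V), $$
where $\mathcal{F}_*$ is the Jacobian; an exact trivializing map makes $\tilde S$ field independent, and the neural-network transformation of the present paper can be viewed as a learned, approximate $\mathcal{F}$.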
…