Corpus ID: 236956831

jVMC: Versatile and performant variational Monte Carlo leveraging automated differentiation and GPU acceleration

@inproceedings{Schmitt2021jVMCVA,
  title={jVMC: Versatile and performant variational Monte Carlo leveraging automated differentiation and GPU acceleration},
  author={Markus Schmitt and Moritz Reh},
  year={2021}
}
The introduction of Neural Quantum States (NQS) has recently given a new twist to variational Monte Carlo (VMC). The ability to systematically reduce the bias of the wave function ansatz renders the approach widely applicable. However, performant implementations are crucial to reach the numerical state of the art. Here, we present a Python codebase that supports arbitrary NQS architectures and model Hamiltonians. Additionally leveraging automatic differentiation, just-in-time compilation to… 
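As a hedged illustration of the pattern the abstract describes (Monte Carlo samples plus parameter gradients of the log-amplitude obtained by automatic differentiation and just-in-time compilation), here is a minimal, generic JAX sketch. It does not use the jVMC API; `log_psi`, `energy_gradient`, and the placeholder local energies are illustrative assumptions only.

```python
import jax
import jax.numpy as jnp

def log_psi(params, s):
    """Toy real-valued log-amplitude: log psi(s) = sum_j log cosh(W_j . s + b_j)."""
    W, b = params["W"], params["b"]
    return jnp.sum(jnp.log(jnp.cosh(W @ s + b)))

@jax.jit
def energy_gradient(params, samples, e_loc):
    """Stochastic VMC gradient 2 <O_k (E_loc - <E_loc>)> from samples and local energies."""
    # Per-sample log-derivatives O_k(s_n) = d log psi / d theta_k, via automatic differentiation.
    O = jax.vmap(jax.grad(log_psi), in_axes=(None, 0))(params, samples)
    dE = e_loc - jnp.mean(e_loc)
    return jax.tree_util.tree_map(
        lambda o: 2.0 * jnp.mean(dE.reshape((-1,) + (1,) * (o.ndim - 1)) * o, axis=0),
        O,
    )

# Illustrative usage with random spin configurations and placeholder local energies.
key = jax.random.PRNGKey(0)
n_sites, n_hidden, n_samples = 8, 16, 512
params = {"W": 0.1 * jax.random.normal(key, (n_hidden, n_sites)), "b": jnp.zeros(n_hidden)}
samples = 2.0 * jax.random.bernoulli(key, 0.5, (n_samples, n_sites)).astype(jnp.float32) - 1.0
e_loc = jax.random.normal(key, (n_samples,))  # stand-in for Hamiltonian-dependent local energies
grads = energy_gradient(params, samples, e_loc)
```

In a real calculation the local energies would be computed from the model Hamiltonian and the samples drawn from |psi|^2; both are replaced by random placeholders here.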

Citations

NetKet 3: Machine Learning Toolbox for Many-Body Quantum Systems
TLDR
The most significant new feature is the possibility to define arbitrary neural network ansätze in pure Python code using the concise notation of machine-learning frameworks, which allows for just-in-time compilation as well as the implicit generation of gradients thanks to automatic differentiation.
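To illustrate what "arbitrary ansätze in pure Python code using the concise notation of machine-learning frameworks" can look like, here is a generic flax.linen sketch of a scalar log-amplitude. It is not a claim about NetKet's (or jVMC's) exact interface; the class and variable names are illustrative assumptions.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

class FFNLogAmplitude(nn.Module):
    """Feed-forward scalar log-amplitude log psi(s) of a spin configuration s."""
    width: int = 16

    @nn.compact
    def __call__(self, s):
        x = jnp.tanh(nn.Dense(self.width)(s))
        return jnp.sum(nn.Dense(1)(x))  # reduce to a scalar log-amplitude

model = FFNLogAmplitude()
s = jnp.ones((8,))
variables = model.init(jax.random.PRNGKey(0), s)
# Gradients with respect to all network parameters come "for free" via autodiff:
grad_log_psi = jax.grad(lambda v: model.apply(v, s))(variables)
```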
Time-Dependent Variational Principle for Open Quantum Systems with Artificial Neural Networks.
TLDR
A variational approach to simulating the dynamics of open quantum many-body systems that combines deep autoregressive neural networks with a time-dependent variational principle.
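For orientation, the generic closed-system form of the time-dependent variational principle is shown below; the cited work adapts this idea to dissipative dynamics via a probabilistic (POVM-based) reformulation, which is not reproduced here.

```latex
% Generic closed-system TDVP equation of motion (illustration only):
% the variational parameters theta evolve by solving a linear system at each time step.
\sum_{k'} S_{k,k'}\,\dot\theta_{k'} = -\,i\,F_k,
\qquad
S_{k,k'} = \langle O_k^{*} O_{k'} \rangle - \langle O_k^{*} \rangle \langle O_{k'} \rangle,
\qquad
F_k = \langle O_k^{*} E_{\mathrm{loc}} \rangle - \langle O_k^{*} \rangle \langle E_{\mathrm{loc}} \rangle
```

Here $O_k(s) = \partial_{\theta_k} \log\psi_\theta(s)$, $E_{\mathrm{loc}}(s) = \langle s|H|\psi_\theta\rangle / \langle s|\psi_\theta\rangle$, and all expectation values are taken over $|\psi_\theta(s)|^2$.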

References

Showing 1–10 of 72 references
Scalable variational Monte Carlo with graph neural ansatz
Deep neural networks have been shown to be a potentially powerful ansatz in variational Monte Carlo for solving quantum many-body problems. We propose two improvements in this direction. The first is…
Deep autoregressive models for the efficient variational simulation of many-body quantum systems
TLDR
This work proposes a specialized neural-network architecture that supports efficient and exact sampling, completely circumventing the need for Markov-chain sampling, and demonstrates the ability to obtain accurate results on larger system sizes than those currently accessible to neural-network quantum states.
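A minimal sketch of the exact, Markov-chain-free sampling that autoregressive ansätze permit: the joint distribution factorizes as p(s) = prod_i p(s_i | s_1, ..., s_{i-1}), so configurations are drawn site by site from the conditionals. The toy conditional below (logistic in the previously sampled spins) is an illustrative stand-in for a masked network output, not the architecture of the cited work.

```python
import jax
import jax.numpy as jnp

def sample_autoregressive(params, key, n_sites):
    """Draw one spin configuration s in {-1,+1}^n by ancestral sampling."""
    w, b = params  # w: (n, n) weights (only the strict lower triangle is used), b: (n,)
    s = jnp.zeros(n_sites)
    for i in range(n_sites):
        key, sub = jax.random.split(key)
        # Conditional probability of spin up, given the already-sampled sites s_<i.
        p_up = jax.nn.sigmoid(jnp.dot(w[i, :i], s[:i]) + b[i])
        s = s.at[i].set(jnp.where(jax.random.uniform(sub) < p_up, 1.0, -1.0))
    return s

n = 6
params = (0.3 * jax.random.normal(jax.random.PRNGKey(0), (n, n)), jnp.zeros(n))
sample = sample_autoregressive(params, jax.random.PRNGKey(1), n)
```

Because p(s) is normalized by construction, the samples are exact and uncorrelated; no Markov chain, burn-in, or autocorrelation analysis is needed.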
Deep-neural-network solution of the electronic Schrödinger equation
TLDR
High-accuracy quantum chemistry methods struggle with a combinatorial explosion of Slater determinants in larger molecular systems, but now a method has been developed that learns electronic wavefunctions with deep neural networks and reaches high accuracy with only a few determinants.
Generalization properties of neural network approximations to frustrated magnet ground states
TLDR
The authors show that the limited generalization capacity of neural-network representations of quantum states is responsible for convergence problems in frustrated systems.
Ab-Initio Solution of the Many-Electron Schrödinger Equation with Deep Neural Networks
TLDR
Deep neural networks can improve the accuracy of variational quantum Monte Carlo to the point where it outperforms other ab-initio quantum chemistry methods, opening the possibility of accurate direct optimisation of wavefunctions for previously intractable molecules and solids.
Neural-network quantum state tomography
TLDR
It is demonstrated that machine learning allows one to reconstruct traditionally challenging many-body quantities—such as the entanglement entropy—from simple, experimentally accessible measurements, and can benefit existing and future generations of devices.
Reconstructing quantum states with generative models
TLDR
The key insight is to reduce state tomography to an unsupervised learning problem of the statistics of an informationally complete quantum measurement, which constitutes a modern machine learning approach to the validation of complex quantum devices.
Real time evolution with neural-network quantum states
TLDR
Application to the transverse-field Ising model on one- and two-dimensional lattices exhibits an accuracy comparable to the stochastic configuration method proposed in [Carleo and Troyer, Science 355, 602-606 (2017)], but does not require computing the (pseudo-)inverse of a matrix.
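One common way to avoid forming an explicit (pseudo-)inverse is to solve the TDVP linear system S theta_dot = F iteratively, using only matrix-vector products. The sketch below is a generic illustration of that idea, not the specific scheme of the cited work; S and F are random, real-valued stand-ins for Monte Carlo estimates.

```python
import jax
import jax.numpy as jnp
from jax.scipy.sparse.linalg import cg

key = jax.random.PRNGKey(0)
n_params = 50
A = jax.random.normal(key, (n_params, n_params))
S = A @ A.T / n_params + 1e-3 * jnp.eye(n_params)  # symmetric positive-definite stand-in
F = jax.random.normal(jax.random.PRNGKey(1), (n_params,))

# Conjugate-gradient solve: only matrix-vector products with S are needed,
# so S is never inverted (and need not even be stored explicitly if passed as a function).
theta_dot, _ = cg(lambda v: S @ v, F, tol=1e-6)
```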
Role of stochastic noise and generalization error in the time propagation of neural-network quantum states
TLDR
This article shows that stable and accurate time propagation can be achieved in regimes of sufficiently regularized variational dynamics, and proposes a validation-set-based diagnostic tool to help determine optimal regularization hyperparameters for t-VMC-based propagation schemes.
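As an illustrative sketch (not necessarily the exact scheme of the cited work) of one widely used regularization of t-VMC dynamics, eigendirections of the quantum geometric tensor S with eigenvalues below a relative cutoff can be discarded before inverting; the function name and cutoff value are assumptions.

```python
import jax.numpy as jnp

def regularized_solve(S, F, rel_cutoff=1e-8):
    """Solve S x = F, keeping only eigendirections with eigenvalue > rel_cutoff * max."""
    evals, evecs = jnp.linalg.eigh(S)  # S is Hermitian and positive semidefinite
    keep = evals > rel_cutoff * jnp.max(evals)
    # Invert only the retained eigenvalues; set the rest to zero without dividing by zero.
    inv_evals = jnp.where(keep, 1.0 / jnp.where(keep, evals, 1.0), 0.0)
    return evecs @ (inv_evals * (evecs.conj().T @ F))
```

Choosing the cutoff is exactly the kind of hyperparameter decision the cited validation-set diagnostic is meant to inform.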
Helping restricted Boltzmann machines with quantum-state representation by restoring symmetry
  • Y. Nomura
  • Physics, Computer Science
Journal of Physics: Condensed Matter
  • 2021
TLDR
This work constructs a variational wave function with one of the simplest neural networks, the restricted Boltzmann machine (RBM), and applies it to a fundamental but unsolved quantum spin Hamiltonian, the two-dimensional J1–J2 Heisenberg model on the square lattice.
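A minimal sketch of the symmetry-restoration idea: the amplitude of a plain RBM is projected onto the symmetric (zero-momentum) sector by averaging over lattice translations, psi_sym(s) = (1/|G|) sum_T psi_RBM(T s). One-dimensional translations are used here for brevity, whereas the cited work treats the 2D square lattice; all names and parameter shapes are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def log_psi_rbm(params, s):
    """RBM log-amplitude: log psi(s) = a . s + sum_j log cosh(W_j . s + b_j)."""
    a, b, W = params
    return jnp.dot(a, s) + jnp.sum(jnp.log(jnp.cosh(W @ s + b)))

def psi_symmetrized(params, s):
    """Average the RBM amplitude over all translations of the configuration."""
    n = s.shape[0]
    translated = jnp.stack([jnp.roll(s, shift) for shift in range(n)])
    return jnp.mean(jnp.exp(jax.vmap(log_psi_rbm, in_axes=(None, 0))(params, translated)))

n_sites, n_hidden = 8, 16
params = (jnp.zeros(n_sites), jnp.zeros(n_hidden),
          0.1 * jax.random.normal(jax.random.PRNGKey(0), (n_hidden, n_sites)))
s = jnp.array([1., -1., 1., -1., 1., -1., 1., -1.])
amp = psi_symmetrized(params, s)
```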
...