Corpus ID: 236956831

jVMC: Versatile and performant variational Monte Carlo leveraging automated differentiation and GPU acceleration

@inproceedings{Schmitt2021jVMCVA,
  title={jVMC: Versatile and performant variational Monte Carlo leveraging automated differentiation and GPU acceleration},
  author={Markus Schmitt and Moritz Reh},
  year={2021}
}
The introduction of Neural Quantum States (NQS) has recently given a new twist to variational Monte Carlo (VMC). The ability to systematically reduce the bias of the wave function ansatz renders the approach widely applicable. However, performant implementations are crucial to reach the numerical state of the art. Here, we present a Python codebase that supports arbitrary NQS architectures and model Hamiltonians. Additionally leveraging automatic differentiation, just-in-time compilation to… 
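The combination the abstract describes, a log-amplitude ansatz whose gradients come from automatic differentiation and whose hot loops are just-in-time compiled, can be sketched in plain JAX. The following is not jVMC's actual interface; the RBM-style `log_psi`, the `local_energy` helper for a transverse-field Ising chain, and the gradient estimator are illustrative assumptions showing how these pieces typically fit together in a VMC energy-gradient estimate.

```python
# Illustrative sketch (not the jVMC API): basic VMC ingredients in plain JAX.
# An RBM-style log-amplitude is differentiated automatically and evaluated
# through JIT-compiled, batched calls; the Hamiltonian is a transverse-field
# Ising chain, chosen only as a concrete example.
import jax
import jax.numpy as jnp

def log_psi(params, s):
    # Real log-amplitude of an RBM-like ansatz for a spin configuration s in {-1, +1}^N.
    theta = params["W"] @ s + params["b"]
    return jnp.sum(jnp.log(jnp.cosh(theta)))

# Batched, JIT-compiled log-amplitudes and their parameter gradients (autodiff).
batched_log_psi = jax.jit(jax.vmap(log_psi, in_axes=(None, 0)))
batched_grad_log_psi = jax.jit(jax.vmap(jax.grad(log_psi), in_axes=(None, 0)))

def local_energy(params, s, h=1.0):
    # E_loc(s) = <s|H|psi> / <s|psi> for H = -sum_i sz_i sz_{i+1} - h sum_i sx_i.
    diag = -jnp.sum(s * jnp.roll(s, 1))
    flipped = s[None, :] * (1.0 - 2.0 * jnp.eye(s.shape[0]))  # all single spin flips
    ratios = jnp.exp(jax.vmap(log_psi, in_axes=(None, 0))(params, flipped) - log_psi(params, s))
    return diag - h * jnp.sum(ratios)

batched_local_energy = jax.jit(jax.vmap(local_energy, in_axes=(None, 0)))

# Toy usage: random parameters and placeholder samples; in a real VMC run the
# samples would be drawn from |psi(s)|^2 by Markov-chain or autoregressive sampling.
N, M, n_samples = 8, 16, 512
k1, k2 = jax.random.split(jax.random.PRNGKey(0))
params = {"W": 0.01 * jax.random.normal(k1, (M, N)), "b": jnp.zeros(M)}
samples = jax.random.choice(k2, jnp.array([-1.0, 1.0]), shape=(n_samples, N))

e_loc = batched_local_energy(params, samples)
o = batched_grad_log_psi(params, samples)
# Stochastic energy gradient, 2 ( <E_loc O_k> - <E_loc><O_k> ) for real parameters.
grad_E = jax.tree_util.tree_map(
    lambda ok: 2.0 * (jnp.tensordot(e_loc, ok, axes=(0, 0)) / n_samples
                      - jnp.mean(e_loc) * jnp.mean(ok, axis=0)),
    o,
)
print("energy per site:", jnp.mean(e_loc) / N)
```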

Citations

NetKet 3: Machine Learning Toolbox for Many-Body Quantum Systems
The most significant new feature is the possibility to define arbitrary neural-network ansätze in pure Python code using the concise notation of machine-learning frameworks, which allows for just-in-time compilation as well as the implicit generation of gradients thanks to automatic differentiation.
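As an illustration of what defining an ansatz "in the concise notation of machine-learning frameworks" can look like, here is a minimal sketch written with flax.linen. This is not NetKet 3's actual interface; the module, its layer size, and the helper names are assumptions made for this example.

```python
# Illustrative only: a small log-amplitude ansatz written as a Flax module,
# in the style that ML-framework-based NQS toolboxes allow. Not NetKet 3's API.
import jax
import jax.numpy as jnp
import flax.linen as nn

class FeedForwardAnsatz(nn.Module):
    hidden: int = 16

    @nn.compact
    def __call__(self, s):
        # s: spin configuration in {-1, +1}^N; returns a real log-amplitude.
        x = nn.Dense(features=self.hidden)(s)
        return jnp.sum(jnp.log(jnp.cosh(x)))

model = FeedForwardAnsatz()
params = model.init(jax.random.PRNGKey(0), jnp.ones(8))

# JIT-compiled, batched evaluation; gradients are generated implicitly by autodiff,
# without any hand-written derivatives.
log_amps = jax.jit(jax.vmap(model.apply, in_axes=(None, 0)))
grad_fn = jax.jit(jax.vmap(jax.grad(lambda p, s: model.apply(p, s)), in_axes=(None, 0)))
```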
Time-Dependent Variational Principle for Open Quantum Systems with Artificial Neural Networks.
A variational approach to simulating the dynamics of open quantum many-body systems using deep autoregressive neural networks by employing a time-dependent variational principle.
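The numerical core of such a time-dependent variational principle is a linear system assembled from Monte Carlo estimates. The sketch below shows a generic, regularized TDVP parameter update under the assumption that per-sample log-derivatives `o[n, k]` and local estimates `e_loc[n]` have already been computed; it is a schematic illustration, not the scheme or code of the cited paper, which treats dissipative dynamics with autoregressive networks.

```python
# Generic TDVP update step (schematic). Assumes per-sample log-derivatives
# o[n, k] and local estimates e_loc[n] are given for n = 1..N_samples.
# Sign and phase conventions differ between imaginary-time, real-time, and
# dissipative evolution; the imaginary-time-like convention S @ theta_dot = -F
# is used here purely for illustration.
import jax.numpy as jnp

def tdvp_step(params_flat, o, e_loc, dt, reg=1e-6):
    """One explicit-Euler TDVP step: solve S @ theta_dot = -F and update parameters."""
    o_centered = o - jnp.mean(o, axis=0)
    e_centered = e_loc - jnp.mean(e_loc)
    # Monte Carlo estimates of the quantum geometric tensor S and the force vector F.
    S = (o_centered.conj().T @ o_centered) / o.shape[0]
    F = (o_centered.conj().T @ e_centered) / o.shape[0]
    # Regularize the (possibly ill-conditioned) linear system before solving.
    S_reg = S + reg * jnp.eye(S.shape[0])
    theta_dot = jnp.linalg.solve(S_reg, -F)
    return params_flat + dt * jnp.real(theta_dot)
```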
