Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting

Stephan Thaler and Julija Zavadlav, Nature Communications

In molecular dynamics (MD), neural network (NN) potentials trained bottom-up on quantum mechanical data have recently seen tremendous success. Top-down approaches that learn NN potentials directly from experimental data have received less attention, typically facing numerical and computational challenges when backpropagating through MD simulations. We present the Differentiable Trajectory Reweighting (DiffTRe) method, which bypasses differentiation through the MD simulation for time-independent observables.
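The core reweighting idea can be sketched in a few lines: observables under a candidate potential are estimated from frames sampled under a reference potential via Boltzmann reweighting, so no gradients need to flow through the simulation itself. This is a minimal illustrative sketch, not the paper's actual API; all names and arrays are hypothetical.

```python
import numpy as np

def reweighted_average(observables, u_ref, u_theta, beta=1.0):
    """Estimate <O> under potential u_theta from frames sampled under u_ref.

    observables: per-frame observable values O(x_i)
    u_ref, u_theta: per-frame potential energies under the reference and the
        candidate potential (illustrative arrays, not real MD output)
    """
    # Log-weights of each frame; subtract the max for numerical stability.
    log_w = -beta * (u_theta - u_ref)
    log_w -= log_w.max()
    w = np.exp(log_w)
    w /= w.sum()
    return float(np.sum(w * observables))

# Toy usage: identical potentials reduce to the plain trajectory average.
obs = np.array([1.0, 2.0, 3.0])
u = np.array([0.5, 0.1, 0.3])
avg = reweighted_average(obs, u, u)
```

Because the weights depend on the candidate energies, differentiating this estimator with respect to potential parameters touches only single-frame energy evaluations, not the trajectory generation.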

Deep Coarse-grained Potentials via Relative Entropy Minimization

It is demonstrated for benchmark problems of liquid water and alanine dipeptide that RE training is more data-efficient because it accesses the CG distribution during training, resulting in improved free energy surfaces and reduced sensitivity to prior potentials.

Learning Pair Potentials using Differentiable Simulations

It is shown that DiffSim can probe a wider functional space of pair potentials than traditional methods like Iterative Boltzmann Inversion, and can be used to simultaneously learn potentials for simulations at different compositions and temperatures to improve the transferability of the learned potentials.

Forces are not Enough: Benchmark and Critical Evaluation for Machine Learning Force Fields with Molecular Simulations

A novel benchmark suite for ML MD simulation is introduced, identifying stability as a key metric for ML models to improve and illustrating, in particular, how the commonly benchmarked force accuracy is not well aligned with relevant simulation metrics.

Coarse-grained molecular dynamics study based on TorchMD

The workflow in this work provides another option for studying protein folding and related processes with a deep learning CG model, and shows that the main phenomena of protein folding in the TorchMD CG model match the all-atom simulations, but on a shorter simulation time scale.

Machine Learning and Optoelectronic Materials Discovery: A Growing Synergy.

Novel optoelectronic materials have the potential to revolutionize the ongoing green transition, both by providing more efficient photovoltaic (PV) devices and by lowering the energy consumption of devices.

Machine Learning of Coarse-Grained Molecular Dynamics Force Fields

CGnets, a deep learning approach that learns coarse-grained free energy functions and can be trained by a force-matching scheme, is introduced; it is shown that CGnets can capture all-atom explicit-solvent free energy surfaces with models using only a few coarse-grained beads and no solvent, while classical coarse-graining methods fail to capture crucial features of the free energy surface.
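The force-matching objective behind this kind of training is simple to state: minimize the mean squared deviation between the CG model's forces and the (mapped) all-atom reference forces. The sketch below is a hypothetical toy illustration with a harmonic potential, not the CGnet implementation.

```python
import numpy as np

def force_matching_loss(predicted_forces, reference_forces):
    """Mean squared deviation between model forces and reference forces.

    Both arrays have shape (n_beads, 3); the loss averages the squared
    per-bead force residual over all beads.
    """
    diff = predicted_forces - reference_forces
    return float(np.mean(np.sum(diff**2, axis=-1)))

# Toy example: for a harmonic potential U(x) = 0.5*k*|x|^2 the force is
# F = -k*x, so a mismatched spring constant gives a nonzero loss.
rng = np.random.default_rng(0)
x = rng.normal(size=(10, 3))          # 10 "beads" in 3D (illustrative)
k_model, k_ref = 1.2, 1.0
loss = force_matching_loss(-k_model * x, -k_ref * x)
```

In practice the predicted forces come from the negative gradient of a learned free energy network, obtained by automatic differentiation rather than a closed form.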

Approaching coupled cluster accuracy with a general-purpose neural network potential through transfer learning

A general-purpose neural network potential is trained that approaches CCSD(T)/CBS accuracy on benchmarks for reaction thermochemistry, isomerization, and drug-like molecular torsions.

Differentiable sampling of molecular geometries with uncertainty-based adversarial attacks

Automatic differentiation is exploited to drive atomistic systems towards high-likelihood, high-uncertainty configurations without the need for molecular dynamics simulations by performing adversarial attacks on an uncertainty metric.
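The adversarial idea can be sketched as gradient ascent on an ensemble-disagreement score: nudge a configuration toward where an ensemble of models disagrees most. This toy sketch uses a finite-difference gradient for self-containment (the paper uses automatic differentiation); all names and models are illustrative.

```python
import numpy as np

def ensemble_variance(x, models):
    """Total per-component prediction variance across an ensemble of models."""
    preds = np.array([m(x) for m in models])
    return float(preds.var(axis=0).sum())

def adversarial_step(x, models, eps=1e-4, lr=0.1):
    """Move x uphill on the ensemble variance (uncertainty) surface.

    Uses a central finite-difference gradient for illustration; a real
    implementation would backpropagate through the models instead.
    """
    grad = np.zeros_like(x)
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx.flat[i] = eps
        grad.flat[i] = (ensemble_variance(x + dx, models)
                        - ensemble_variance(x - dx, models)) / (2 * eps)
    return x + lr * grad

# Toy ensemble: two linear "models" that disagree more as |x| grows.
models = [lambda x: x, lambda x: 2.0 * x]
x0 = np.array([1.0, -2.0])
x1 = adversarial_step(x0, models)
```

Configurations found this way are high-uncertainty candidates for active learning, without running an MD trajectory to reach them.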

TorchMD: A Deep Learning Framework for Molecular Simulations

TorchMD is presented, a framework for molecular simulations with mixed classical and machine learning potentials; it enables learning and simulating neural network potentials and provides a useful tool set to support molecular simulations with machine learning potentials.

Machine learning coarse grained models for water

A machine-learned coarse-grained water model is developed that elucidates the ice nucleation process much more efficiently than previous models, in a significant departure from conventional force-field fitting.

Quantum-chemical insights from deep tensor neural networks

An efficient deep learning approach is developed that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems, and unifies concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate chemical space predictions.

SchNet: A continuous-filter convolutional neural network for modeling quantum interactions

This work proposes to use continuous-filter convolutional layers to be able to model local correlations without requiring the data to lie on a grid, and obtains a joint model for the total energy and interatomic forces that follows fundamental quantum-chemical principles.
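A continuous-filter convolution can be sketched as follows: instead of a fixed grid kernel, a small filter-generating network maps each interatomic distance (expanded in radial basis functions) to a per-pair filter that modulates neighbor features. This is a minimal illustrative sketch under assumed shapes, not the SchNet reference implementation.

```python
import numpy as np

def rbf_expand(d, centers, gamma=10.0):
    """Expand distances into Gaussian radial basis features."""
    return np.exp(-gamma * (d[..., None] - centers) ** 2)

def cfconv(features, positions, w1, w2, centers):
    """Continuous-filter convolution with per-pair, distance-generated filters.

    features:  (n_atoms, n_feat) atom features
    positions: (n_atoms, 3) Cartesian coordinates
    w1, w2:    weights of a tiny filter-generating network (illustrative)
    """
    n = len(positions)
    # Pairwise distances; only distances enter, so the layer is
    # translation- and rotation-invariant by construction.
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    rbf = rbf_expand(d, centers)              # (n, n, n_rbf)
    filt = np.tanh(rbf @ w1) @ w2             # (n, n, n_feat) pair filters
    out = np.zeros_like(features)
    for i in range(n):
        for j in range(n):
            if i != j:
                out[i] += features[j] * filt[i, j]  # element-wise filtering
    return out

# Toy usage with random weights and four atoms.
rng = np.random.default_rng(0)
n, n_feat, n_rbf, hidden = 4, 8, 16, 8
feats = rng.normal(size=(n, n_feat))
pos = rng.normal(size=(n, 3))
w1 = rng.normal(size=(n_rbf, hidden))
w2 = rng.normal(size=(hidden, n_feat))
centers = np.linspace(0.0, 3.0, n_rbf)
out = cfconv(feats, pos, w1, w2, centers)
```

Because the filter depends continuously on distance, atoms need not sit on any grid, which is what makes the layer suitable for molecular geometries.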

DeePCG: Constructing coarse-grained models via deep neural networks.

It is found that the two-body, three-body, and higher-order oxygen correlation functions produced by the coarse-grained and full atomistic models agree very well with each other, illustrating the effectiveness of the DeePCG model on a rather challenging task.

Machine learning for molecular simulation

Recent ML methods for molecular simulation are reviewed, with particular focus on (deep) neural networks for the prediction of quantum-mechanical energies and forces, on coarse-grained molecular dynamics, on the extraction of free energy surfaces and kinetics, and on generative network approaches to sample molecular equilibrium structures and compute thermodynamics.

Prediction Errors of Molecular Machine Learning Models Lower than Hybrid DFT Error.

Numerical evidence is presented that ML model predictions deviate from DFT less than DFT (B3LYP) deviates from experiment for all properties, suggesting that ML models could be more accurate than hybrid DFT if explicitly electron-correlated quantum (or experimental) data were available.