Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting

@article{Thaler2021LearningNN,
  title={Learning neural network potentials from experimental data via Differentiable Trajectory Reweighting},
  author={Stephan Thaler and Julija Zavadlav},
  journal={Nature Communications},
  year={2021},
  volume={12}
}
In molecular dynamics (MD), neural network (NN) potentials trained bottom-up on quantum mechanical data have seen tremendous success recently. Top-down approaches that learn NN potentials directly from experimental data have received less attention, typically facing numerical and computational challenges when backpropagating through MD simulations. We present the Differentiable Trajectory Reweighting (DiffTRe) method, which bypasses differentiation through the MD simulation for time-independent… 
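The reweighting idea at the heart of DiffTRe can be illustrated with a small sketch: observables under a perturbed potential are estimated from samples of an existing reference trajectory via Boltzmann reweighting, so gradients flow through the weights rather than through the MD simulation itself. The toy potential, function names, and parameters below are illustrative stand-ins, not the authors' implementation.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy potential with one learnable parameter theta:
# U_theta(x) = theta * x**2 (a harmonic well standing in for an NN potential).
def potential(theta, x):
    return theta * x**2

def reweighted_observable(theta, theta_ref, samples, observable, beta=1.0):
    """Estimate <O> under `theta` from samples drawn under `theta_ref`
    via thermodynamic reweighting, without re-running the simulation."""
    dU = jax.vmap(lambda x: potential(theta, x) - potential(theta_ref, x))(samples)
    w = jax.nn.softmax(-beta * dU)   # normalized Boltzmann weights
    return jnp.sum(w * jax.vmap(observable)(samples))

# The gradient w.r.t. theta flows only through the weights, bypassing
# backpropagation through the MD trajectory.
grad_fn = jax.grad(reweighted_observable)
```

When `theta == theta_ref` the weights are uniform and the estimator reduces to a plain sample average, which is a useful sanity check.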

Deep coarse-grained potentials via relative entropy minimization.

This work demonstrates that relative entropy (RE) training is more data-efficient than force matching (FM) because it accesses the CG distribution during training, yielding improved free energy surfaces and reduced sensitivity to prior potentials; it supports training objectives beyond FM as a promising direction for improving the accuracy and reliability of CG NN potentials.

Learning Pair Potentials using Differentiable Simulations

This work proposes a general stochastic method for learning pair interactions from data using differentiable simulations (DiffSim), and demonstrates the approach by recovering simple pair potentials, such as Lennard-Jones systems, from radial distribution functions.
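For reference, the Lennard-Jones form mentioned above is the classic 12-6 pair potential, the kind of ground-truth interaction the DiffSim approach recovers from radial distribution functions. A minimal sketch (function name and default parameters are illustrative):

```python
import jax.numpy as jnp

def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """12-6 Lennard-Jones pair potential:
    U(r) = 4*epsilon*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6**2 - sr6)
```

The potential crosses zero at `r = sigma` and has its minimum of `-epsilon` at `r = 2**(1/6) * sigma`.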

Scalable Bayesian Uncertainty Quantification for Neural Network Potentials: Promise and Pitfalls

It is demonstrated here that scalable Bayesian UQ via stochastic gradient MCMC (SG-MCMC) yields reliable uncertainty estimates for MD observables, that cold posteriors can reduce the required training data size, and that multiple Markov chains are needed for reliable UQ.

Forces are not Enough: Benchmark and Critical Evaluation for Machine Learning Force Fields with Molecular Simulations

A novel benchmark suite for ML MD simulation is introduced, identifying stability as a key metric for ML models to improve and illustrating, in particular, how the commonly benchmarked force accuracy is not well aligned with relevant simulation metrics.

Machine Learning Coarse-Grained Potentials of Protein Thermodynamics

It is shown that a single coarse-grained potential can integrate all twelve proteins and can capture experimental structural features of mutated proteins, indicating that machine learning coarse-grained potentials could provide a feasible approach to simulate and understand protein dynamics.

Coarse-grained molecular dynamics study based on TorchMD

The workflow in this work provides another option for studying protein folding and related processes with a deep learning CG model, and shows that the main features of protein folding with the TorchMD CG model match those of all-atom simulations, but on a shorter simulated time scale.

Machine Learning and Optoelectronic Materials Discovery: A Growing Synergy.

Novel optoelectronic materials have the potential to revolutionize the ongoing green transition, both by providing more efficient photovoltaic (PV) devices and by lowering the energy consumption of devices.

Machine learned coarse-grained protein force-fields: Are we there yet?

References


Machine Learning of Coarse-Grained Molecular Dynamics Force Fields

CGnets, a deep learning approach that learns coarse-grained free energy functions and can be trained by a force-matching scheme, is introduced; CGnets capture all-atom explicit-solvent free energy surfaces with models using only a few coarse-grained beads and no solvent, whereas classical coarse-graining methods fail to capture crucial features of the free energy surface.
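The force-matching scheme mentioned above minimizes the mean-squared difference between the CG model's forces (the negative gradient of the learned free energy) and reference forces mapped from the all-atom simulation. A minimal sketch, using a toy quadratic model in place of the actual CGnet architecture; `params`, `cg_energy`, and `force_matching_loss` are illustrative names, not the authors' API:

```python
import jax
import jax.numpy as jnp

# Toy CG free-energy model: a harmonic well with learnable parameters
# (stiffness k and center x0), standing in for the CGnet network.
def cg_energy(params, coords):
    k, x0 = params
    return 0.5 * k * jnp.sum((coords - x0) ** 2)

def force_matching_loss(params, cg_coords, ref_forces):
    """Mean-squared error between model forces (-dU/dx) and the
    reference forces projected onto the CG beads."""
    model_forces = -jax.vmap(
        jax.grad(cg_energy, argnums=1), in_axes=(None, 0)
    )(params, cg_coords)
    return jnp.mean((model_forces - ref_forces) ** 2)

grad_loss = jax.grad(force_matching_loss)  # gradients for training `params`
```

If the reference forces come exactly from the toy model, the loss vanishes, which makes the estimator easy to sanity-check.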

Approaching coupled cluster accuracy with a general-purpose neural network potential through transfer learning

  • Justin S. Smith, Benjamin T. Nebgen, Adrian E. Roitberg
  • Chemistry, Computer Science
    Nature Communications
  • 2019
A general-purpose neural network potential is trained that approaches quantum-chemical accuracy for reaction thermochemistry, isomerization, and drug-like molecular torsions and is broadly applicable to materials science, biology, and chemistry.

Differentiable sampling of molecular geometries with uncertainty-based adversarial attacks

Automatic differentiation is exploited to drive atomistic systems towards high-likelihood, high-uncertainty configurations without the need for molecular dynamics simulations by performing adversarial attacks on an uncertainty metric.

TorchMD: A Deep Learning Framework for Molecular Simulations

TorchMD, a framework for molecular simulations with mixed classical and machine learning potentials, is presented; it enables learning and simulating neural network potentials and provides a useful tool set to support molecular simulations with machine learning potentials.

Machine learning coarse grained models for water

  • Henry Chan, Mathew J. Cherukara, Subramanian K. R. S. Sankaranarayanan
  • Computer Science
    Nature Communications
  • 2019
A set of machine-learned coarse-grained water models that accurately describe the structure and thermodynamic anomalies of both water and ice at mesoscopic scales is introduced, at a computational cost two orders of magnitude lower than existing atomistic models.

Quantum-chemical insights from deep tensor neural networks

An efficient deep learning approach is developed that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems, and unifies concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate chemical space predictions.

SchNet: A continuous-filter convolutional neural network for modeling quantum interactions

This work proposes to use continuous-filter convolutional layers to be able to model local correlations without requiring the data to lie on a grid, and obtains a joint model for the total energy and interatomic forces that follows fundamental quantum-chemical principles.

DeePCG: Constructing coarse-grained models via deep neural networks.

It is found that the two-body, three-body, and higher-order oxygen correlation functions produced by the coarse-grained and full atomistic models agree very well with each other, illustrating the effectiveness of the DeePCG model on a rather challenging task.

Machine learning for molecular simulation

Recent ML methods for molecular simulation are reviewed, with particular focus on (deep) neural networks for the prediction of quantum-mechanical energies and forces, on coarse-grained molecular dynamics, on the extraction of free energy surfaces and kinetics, and on generative network approaches to sample molecular equilibrium structures and compute thermodynamics.

Prediction Errors of Molecular Machine Learning Models Lower than Hybrid DFT Error.

Numerical evidence is presented that, for all properties, ML model predictions deviate from DFT (B3LYP) less than DFT deviates from experiment, suggesting that ML models could be more accurate than hybrid DFT if explicitly electron-correlated quantum (or experimental) data were available.
...