SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects
@article{Unke2021SpookyNetLF,
  title   = {SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects},
  author  = {Oliver T. Unke and Stefan Chmiela and Michael Gastegger and Kristof T. Sch{\"u}tt and Huziel E. Sauceda and Klaus-Robert M{\"u}ller},
  journal = {Nature Communications},
  year    = {2021},
  volume  = {12}
}
Machine-learned force fields combine the accuracy of ab initio methods with the efficiency of conventional force fields. However, current machine-learned force fields typically ignore electronic degrees of freedom, such as the total charge or spin state, and assume chemical locality, which is problematic when molecules have inconsistent electronic states, or when nonlocal effects play a significant role. This work introduces SpookyNet, a deep neural network for constructing machine-learned…
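The abstract's central point is that a molecular geometry alone underdetermines the energy when the total charge or spin state varies, so these electronic degrees of freedom must enter the model as inputs. The toy feature builder below illustrates that idea only; it is a hypothetical sketch (random element embeddings, global state simply concatenated per atom), not SpookyNet's actual architecture.

```python
import numpy as np

def atomic_features(z, total_charge, spin, d=8, seed=0):
    """Toy per-atom feature builder: embeds each atomic number and
    concatenates the global electronic state (total charge, spin) to
    every atom, so identical geometries with different charge or spin
    map to different inputs. Illustrative only, not SpookyNet."""
    rng = np.random.default_rng(seed)
    embed = rng.standard_normal((119, d))       # one row per element
    per_atom = embed[np.asarray(z)]             # (n_atoms, d)
    state = np.tile([total_charge, spin], (len(z), 1))
    return np.concatenate([per_atom, state], axis=1)  # (n_atoms, d + 2)

# Without the extra state columns, the neutral molecule and its anion
# would present identical inputs to a purely geometric model:
f_neutral = atomic_features([8, 1, 1], total_charge=0, spin=0)
f_anion   = atomic_features([8, 1], total_charge=-1, spin=0)
```

A real model would feed such state-augmented features through message-passing layers; the point here is only that the electronic state must be visible at the input.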
30 Citations
Orbital Mixer: Using Atomic Orbital Features for Basis Dependent Prediction of Molecular Wavefunctions
- Chemistry
- 2022
Leveraging ab initio data at scale has enabled the development of machine learning models capable of extremely accurate and fast molecular property prediction. A central paradigm of many previous…
Towards Universal Neural Network Potential for Material Discovery Applicable to Arbitrary Combination of 45 Elements
- Materials Science
- 2021
Computational material discovery is under intense study owing to its ability to explore the vast space of chemical systems. Neural network potentials (NNPs) have been shown to be particularly…
Accurate Machine Learned Quantum-Mechanical Force Fields for Biomolecular Simulations
- Chemistry
- 2022
Molecular dynamics (MD) simulations allow atomistic insights into chemical and biological processes. Accurate MD simulations require computationally demanding quantum-mechanical calculations, being…
Accurate and efficient molecular dynamics based on machine learning and non von Neumann architecture
- Computer Science, npj Computational Materials
- 2022
By testing on different molecules and bulk systems, it is shown that the proposed molecular dynamics (MD) methodology is generally applicable to various MD tasks.
Atomistic Simulations for Reactions and Vibrational Spectroscopy in the Era of Machine Learning─Quo Vadis?
- Physics, The Journal of Physical Chemistry B
- 2022
Atomistic simulations using accurate energy functions can provide molecular-level insight into functional motions of molecules in the gas and in the condensed phase. This Perspective delineates the…
Automatic Identification of Chemical Moieties
- Computer Science, arXiv
- 2022
This work introduces a method to automatically identify chemical moieties (molecular building blocks) from atomic representations, enabling a variety of applications beyond property prediction, which otherwise rely on expert knowledge.
E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials
- Computer Science, Nature Communications
- 2022
The NequIP method achieves state-of-the-art accuracy on a challenging and diverse set of molecules and materials while exhibiting remarkable data efficiency, challenging the widely held belief that deep neural networks require massive training sets.
Efficient force field and energy emulation through partition of permutationally equivalent atoms.
- Computer Science, The Journal of Chemical Physics
- 2022
A new approach, the atomized force field (AFF) model, is introduced that integrates both force and energy in the emulator with many fewer computational operations, and provides uncertainty assessment of the predicted atomic forces and energies, useful for developing a sequential design over the chemical input space.
Equivariant Graph Attention Networks for Molecular Property Prediction
- Computer Science, arXiv
- 2022
This work proposes an equivariant GNN that operates with Cartesian coordinates to incorporate directionality and implements a novel attention mechanism, acting as a content and spatial dependent filter when propagating information between nodes.
Graph Neural Networks Accelerated Molecular Dynamics
- Computer Science, The Journal of Chemical Physics
- 2022
A GNN Accelerated MD (GAMD) model is developed that directly predicts forces given the state of the system, bypassing the evaluation of potential energy, and is scale-agnostic, allowing it to generalize to much larger systems at test time.
References
Showing 1–10 of 137 references
Incompleteness of Atomic Structure Representations.
- Computer Science, Physical Review Letters
- 2020
It is shown that any classification, regression, or embedding model for atom-centered properties that uses three- (or four-)body features will incorrectly give identical results for certain distinct configurations; the issues that will arise as the desired accuracy increases are discussed, and potential solutions are suggested.
Solving the electronic Schrödinger equation for multiple nuclear geometries with weight-sharing deep neural networks
- Computer Science, Nature Computational Science
- 2022
This work restricts the optimization process such that up to 95 percent of weights in a neural network model are in fact equal across varying molecular geometries, which opens a promising route towards pre-trained neural network wavefunctions that yield high accuracy even across different molecules.
A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer
- Physics, Computer Science, Nature Communications
- 2021
This work introduces a fourth-generation high-dimensional neural network potential that combines a charge equilibration scheme employing environment-dependent atomic electronegativities with accurate atomic energies and substantially extends the applicability of modern machine learning potentials.
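The charge-equilibration scheme mentioned in this entry can be understood as a constrained quadratic minimization: given per-atom electronegativities and hardnesses, solve for the partial charges that minimize the electrostatic energy subject to a fixed total charge. The snippet below is a generic Qeq-style sketch with a bare 1/r kernel and hand-set parameters; the fourth-generation HDNNP of the paper instead predicts environment-dependent electronegativities with neural networks and uses screened interactions.

```python
import numpy as np

def charge_equilibration(chi, hardness, coords, total_charge=0.0):
    """Generic Qeq-style sketch: minimize
        E(q) = sum_i chi_i q_i + 1/2 q^T A q
    subject to sum(q) = total_charge, where A has atomic hardness on
    the diagonal and a bare 1/r Coulomb kernel off-diagonal. The KKT
    conditions give one linear solve for the charges plus a Lagrange
    multiplier enforcing the charge constraint."""
    n = len(chi)
    coords = np.asarray(coords, float)
    r = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    off = 1.0 / np.where(r == 0.0, 1.0, r)      # guard the diagonal zeros
    A = np.where(np.eye(n, dtype=bool), np.asarray(hardness, float), off)
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = A
    M[:n, n] = M[n, :n] = 1.0                   # constraint row/column
    rhs = np.concatenate([-np.asarray(chi, float), [total_charge]])
    return np.linalg.solve(M, rhs)[:n]

# Water-like toy system (hypothetical parameters): the atom with the
# highest electronegativity accumulates negative charge.
q = charge_equilibration(chi=[3.0, 1.0, 1.0],
                         hardness=[2.0, 1.5, 1.5],
                         coords=[[0, 0, 0], [1, 0, 0], [0, 1, 0]])
```

Because the charges come from a global linear solve, a change in electronegativity anywhere shifts charge everywhere, which is how such schemes capture the nonlocal charge transfer the entry refers to.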
Combining Machine Learning and Computational Chemistry for Predictive Insights Into Chemical Systems
- Computer Science, Chemical Reviews
- 2021
Noteworthy applications demonstrating how computational chemistry and machine learning can be used together to provide insightful predictions in molecular and materials modeling, retrosynthesis, catalysis, and drug design are critically reviewed.
Equivariant message passing for the prediction of tensorial properties and molecular spectra
- Computer Science, ICML
- 2021
This work proposes the polarizable atom interaction neural network (PaiNN), which improves on common molecule benchmarks over previous networks while reducing model size and inference time, and leverages the equivariant atomwise representations obtained by PaiNN for the prediction of tensorial properties.
Machine Learning Force Fields
- Computer Science, Chemistry, Chemical Reviews
- 2021
An overview of applications of ML-FFs and the chemical insights that can be obtained from them is given, and a step-by-step guide for constructing and testing them from scratch is given.
QM7-X, a comprehensive dataset of quantum-mechanical properties spanning the chemical space of small organic molecules
- Chemistry, Scientific Data 8, 43
- 2021
QM7-X is introduced, a comprehensive dataset of 42 physicochemical properties for ≈4.2 million equilibrium and non-equilibrium structures of small organic molecules with up to seven non-hydrogen atoms that will play a critical role in the development of next-generation machine-learning based models for exploring greater swaths of CCS and performing in silico design of molecules with targeted properties.
Rethinking Attention with Performers
- Computer Science, ICLR
- 2021
Performers are introduced: Transformer architectures that can estimate regular (softmax) full-rank attention with provable accuracy, using only linear space and time complexity, without relying on priors such as sparsity or low-rankness.
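The linear-complexity attention that Performers provide rests on a kernel trick: positive random features whose inner products approximate the softmax kernel exp(q·k) in expectation, so attention can be computed without ever forming the n×n matrix. The sketch below is a minimal illustration of that idea; it omits the orthogonal random features and redrawing used by the full FAVOR+ algorithm.

```python
import numpy as np

def favor_features(x, w):
    """Positive random features phi(x) = exp(w @ x - ||x||^2 / 2) / sqrt(m).
    For w ~ N(0, I), E[phi(q) . phi(k)] = exp(q . k), the softmax kernel."""
    m = w.shape[0]
    return np.exp(x @ w.T - 0.5 * np.sum(x * x, axis=-1, keepdims=True)) / np.sqrt(m)

def linear_attention(Q, K, V, m=4096, seed=0):
    """Performer-style attention: linear in sequence length, since the
    n x n attention matrix is never materialized."""
    d = Q.shape[-1]
    w = np.random.default_rng(seed).standard_normal((m, d))
    # absorb the 1/sqrt(d) softmax temperature symmetrically into Q and K
    q = favor_features(Q / d ** 0.25, w)
    k = favor_features(K / d ** 0.25, w)
    num = q @ (k.T @ V)          # (n, d_v): feature dims contract first
    den = q @ k.sum(axis=0)      # (n,): row-wise softmax normalizer
    return num / den[:, None]
```

Because the features are strictly positive, the normalizer is always well defined, which is what makes this estimator stable compared with naive random-feature softmax approximations.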
SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials
- Computer Science, arXiv
- 2021
The NequIP method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency, challenging the widely held belief that deep neural networks require massive training sets.