Corpus ID: 4106658

SchNet: A continuous-filter convolutional neural network for modeling quantum interactions

@inproceedings{Schtt2017SchNetAC,
  title={SchNet: A continuous-filter convolutional neural network for modeling quantum interactions},
  author={Kristof Sch{\"u}tt and Pieter-Jan Kindermans and Huziel Enoc Sauceda Felix and Stefan Chmiela and Alexandre Tkatchenko and Klaus-Robert M{\"u}ller},
  booktitle={NIPS},
  year={2017}
}
Deep learning has the potential to revolutionize quantum chemistry, as it is ideally suited to learning representations for structured data and to speeding up the exploration of chemical space. While convolutional neural networks have proven to be the first choice for image, audio, and video data, the atoms in molecules are not restricted to a grid. Instead, their precise locations contain essential physical information that would be lost if discretized. Thus, we propose to use continuous-filter…
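The continuous-filter idea can be sketched as follows: instead of indexing a discrete filter bank, each atom aggregates its neighbors' features weighted element-wise by filters *generated* from the continuous interatomic distances. This is a minimal illustrative sketch, assuming a radial-basis distance expansion and a tiny two-layer filter-generating network; the shapes, cutoff, and tanh nonlinearities are assumptions, not the paper's exact architecture.

```python
import numpy as np

def rbf_expand(d, centers, gamma=10.0):
    """Expand a scalar distance into radial basis features."""
    return np.exp(-gamma * (d[..., None] - centers) ** 2)

def cfconv(x, positions, W1, W2):
    """One continuous-filter convolution (sketch): each atom i sums the
    features of every other atom j, scaled element-wise by a filter
    generated from the distance r_ij."""
    n, f = x.shape
    centers = np.linspace(0.0, 5.0, W1.shape[0])  # RBF centers (illustrative)
    out = np.zeros_like(x)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = np.linalg.norm(positions[i] - positions[j])
            # filter-generating network: RBF expansion -> dense -> dense
            h = np.tanh(rbf_expand(np.array(d), centers) @ W1)
            filt = np.tanh(h @ W2)   # one filter value per feature channel
            out[i] += x[j] * filt    # element-wise (not matrix) product
    return out
```

Because the filter is a smooth function of the distance, the output varies continuously with atomic positions, which is what a grid-based discrete convolution would destroy.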

Papers citing this work

A deep neural network for molecular wave functions in quasi-atomic minimal basis representation.

This work presents an adaptation of the recently proposed SchNet for Orbitals (SchNOrb) deep convolutional neural network model for electronic wave functions in an optimized quasi-atomic minimal basis representation and discusses the future potential of this approach in quantum chemical workflows.

ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations

This work carefully designs a scalable and expressive GNN model, ForceNet, and applies it to OC20 (Chanussot et al., 2020), an unprecedentedly large dataset of quantum physics calculations; ForceNet predicts atomic forces more accurately than state-of-the-art physics-based GNNs while being faster in both training and inference.


Quantum-chemical insights from interpretable atomistic neural networks

With the rise of deep neural networks for quantum chemistry applications, there is a pressing need for architectures that, beyond delivering accurate predictions of chemical properties, are readily interpretable.

Flexible dual-branched message passing neural network for quantum mechanical property prediction with molecular conformation

The proposed dual-branched neural network for molecular property prediction, based on a message-passing framework, outperforms other recent models with sparser representations; the results indicate that in chemical property prediction tasks, the diverse chemical nature of the targets should be carefully considered for both model performance and generalizability.

Cormorant: Covariant Molecular Neural Networks

Cormorant significantly outperforms competing algorithms in learning molecular Potential Energy Surfaces from conformational geometries in the MD-17 dataset, and is competitive with other methods at learning geometric, energetic, electronic, and thermodynamic properties of molecules on the GDB-9 dataset.

Solving the electronic Schrödinger equation for multiple nuclear geometries with weight-sharing deep neural networks

This work restricts the optimization process such that up to 95 percent of weights in a neural network model are in fact equal across varying molecular geometries, which opens a promising route towards pre-trained neural network wavefunctions that yield high accuracy even across different molecules.

Informing geometric deep learning with electronic interactions to accelerate quantum chemistry.

By developing a physics-inspired equivariant neural network, this work introduces a method to learn molecular representations based on the electronic interactions among atomic orbitals, which outperforms traditional semiempirical and machine learning-based methods on comprehensive downstream benchmarks that encompass diverse main-group chemical processes.

Rotation Invariant Graph Neural Networks using Spin Convolutions

A novel approach to modeling angular information between sets of neighboring atoms in a graph neural network is introduced and rotation invariance is achieved for the network’s edge messages through the use of a per-edge local coordinate frame and a novel spin convolution over the remaining degree of freedom.

Accurate and transferable multitask prediction of chemical properties with an atoms-in-molecules neural network

With AIMNet, a modular and chemically inspired deep neural network potential, a new dimension of transferability is shown: the ability to learn new targets using multimodal information from previous training.
...

References

Showing 1-10 of 42 references

Quantum-chemical insights from deep tensor neural networks

An efficient deep learning approach is developed that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems, and unifies concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate chemical space predictions.

Neural Message Passing for Quantum Chemistry

Using MPNNs, state of the art results on an important molecular property prediction benchmark are demonstrated and it is believed future work should focus on datasets with larger molecules or more accurate ground truth labels.
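The message-passing scheme underlying MPNNs can be sketched as one round of neighbor aggregation followed by a node update. This is a toy dense-layer instantiation; `W_msg`, `W_upd`, and the ReLU/tanh choices are illustrative assumptions, not the paper's specific message and update functions.

```python
import numpy as np

def mpnn_step(h, edges, W_msg, W_upd):
    """One round of message passing (sketch): every directed edge
    src -> dst contributes a message; each node then updates from its
    own state concatenated with its summed incoming messages."""
    msgs = np.zeros_like(h)
    for src, dst in edges:
        msgs[dst] += np.maximum(h[src] @ W_msg, 0.0)  # ReLU-gated message
    # update: new state from (old state, aggregated messages)
    return np.tanh(np.concatenate([h, msgs], axis=1) @ W_upd)
```

Stacking several such rounds lets information propagate beyond immediate neighbors, which is the common abstraction the MPNN paper identifies across many graph models.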

Deep Convolutional Networks on Graph-Structured Data

This paper develops an extension of Spectral Networks that incorporates a graph estimation procedure, tested on large-scale classification problems, matching or improving over Dropout Networks with far fewer parameters to estimate.

Machine learning of molecular electronic properties in chemical compound space

The combination of modern scientific computing with electronic structure theory can lead to an unprecedented amount of data amenable to intelligent data analysis for the identification of meaningful…

Machine Learning Predictions of Molecular Properties: Accurate Many-Body Potentials and Nonlocality in Chemical Space

A systematic hierarchy of efficient empirical methods to estimate atomization and total energies of molecules is presented; it is achieved by a vectorized representation of molecules (the so-called Bag of Bonds model) that exhibits strong nonlocality in chemical space.
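The Bag of Bonds vectorization can be sketched as grouping Coulomb-matrix off-diagonal terms Z_i Z_j / r_ij by element pair, sorting each bag in descending order, and zero-padding to a fixed length so different molecules map to vectors of the same shape. The bag sizes and the water geometry below are illustrative assumptions.

```python
import numpy as np

def bag_of_bonds(Z, R, bags):
    """Bag-of-Bonds featurization (sketch).
    Z: list of atomic numbers; R: (n, 3) positions in Angstrom;
    bags: {(z_a, z_b): fixed_length} defining one bag per element pair."""
    feats = []
    for (a, b), size in bags.items():
        terms = []
        for i in range(len(Z)):
            for j in range(i + 1, len(Z)):
                if {Z[i], Z[j]} == {a, b}:  # pair belongs to this bag
                    terms.append(Z[i] * Z[j] / np.linalg.norm(R[i] - R[j]))
        terms.sort(reverse=True)            # canonical order within the bag
        terms += [0.0] * (size - len(terms))  # zero-pad to fixed length
        feats.extend(terms[:size])
    return np.array(feats)
```

Sorting within each bag makes the representation invariant to atom indexing, while the fixed bag lengths make it usable as input to standard regressors.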

Bypassing the Kohn-Sham equations with machine learning

The first molecular dynamics simulation with a machine-learned density functional on malonaldehyde is performed and the authors are able to capture the intramolecular proton transfer process.

Spherical convolutions and their application in molecular modelling

This paper introduces two strategies for conducting convolutions on the sphere, using either a spherical-polar grid or a grid based on the cubed-sphere representation, and demonstrates performance comparable to state-of-the-art methods in the field, which build on decades of domain-specific knowledge.

Machine learning of accurate energy-conserving molecular force fields

The GDML approach enables quantitative molecular dynamics simulations for molecules at a fraction of cost of explicit AIMD calculations, thereby allowing the construction of efficient force fields with the accuracy and transferability of high-level ab initio methods.

Spectral Networks and Locally Connected Networks on Graphs

This paper considers possible generalizations of CNNs to signals defined on more general domains without the action of a translation group, and proposes two constructions, one based upon a hierarchical clustering of the domain, and another based on the spectrum of the graph Laplacian.
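The spectral construction can be sketched as filtering a graph signal in the eigenbasis of the graph Laplacian L = D - A: transform into the spectral domain, scale by a filter defined on the eigenvalues, and transform back. Here `theta` stands in for a hypothetical learned spectral filter; an identity filter recovers the input exactly.

```python
import numpy as np

def spectral_conv(x, A, theta):
    """Spectral graph convolution (sketch).
    x: (n, c) signal on n nodes; A: (n, n) symmetric adjacency matrix;
    theta: function mapping Laplacian eigenvalues to filter coefficients."""
    L = np.diag(A.sum(axis=1)) - A       # combinatorial graph Laplacian
    w, U = np.linalg.eigh(L)             # eigendecomposition (graph "Fourier" basis)
    g = theta(w)                         # filter evaluated on the spectrum
    return U @ (g[:, None] * (U.T @ x))  # transform, filter, inverse transform
```

The eigendecomposition costs O(n^3), which is one reason later work (including the hierarchical-clustering construction this paper also proposes, and spatial methods like SchNet) avoids the explicit spectral form.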

How to represent crystal structures for machine learning: Towards fast prediction of electronic properties

It is found that conventional representations of the input data, such as the Coulomb matrix, are not suitable for training learning machines on periodic solids. A novel crystal structure representation is proposed for which learning and competitive prediction accuracies become possible within an unrestricted class of spd systems of arbitrary unit-cell size.