Representations of molecules and materials for interpolation of quantum-mechanical simulations via machine learning

@article{Langer2022RepresentationsOM,
  title={Representations of molecules and materials for interpolation of quantum-mechanical simulations via machine learning},
  author={Marcel F. Langer and Alex Goessmann and Matthias Rupp},
  journal={npj Computational Materials},
  year={2022},
  volume={8},
  pages={1-14}
}
Computational study of molecules and materials from first principles is a cornerstone of physics, chemistry, and materials science, but limited by the cost of accurate and precise simulations. In settings involving many simulations, machine learning can reduce these costs, often by orders of magnitude, by interpolating between reference simulations. This requires representations that describe any molecule or material and support interpolation. We comprehensively review and discuss current… 
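The review's premise, cheap interpolation between expensive reference simulations via a fixed-length representation, can be illustrated with a generic regression pipeline. The sketch below assumes a sorted-Coulomb-matrix-eigenvalue descriptor, a synthetic pairwise "energy" as a stand-in for real simulation outputs, and scikit-learn kernel ridge regression; the helper names coulomb_eigenvalues and toy_energy are illustrative, and none of these choices are the specific methods reviewed in the paper.

```python
# Minimal sketch of representation-based interpolation of reference simulations.
# Descriptor, synthetic target, and regressor are generic illustrative choices,
# not the specific methods benchmarked in the reviewed paper.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def coulomb_eigenvalues(charges, positions, size):
    """Sorted Coulomb-matrix eigenvalues, zero-padded to a fixed descriptor length."""
    n = len(charges)
    M = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                M[i, j] = 0.5 * charges[i] ** 2.4
            else:
                M[i, j] = charges[i] * charges[j] / np.linalg.norm(positions[i] - positions[j])
    return np.pad(np.sort(np.linalg.eigvalsh(M))[::-1], (0, size - n))

def toy_energy(positions):
    """Synthetic pairwise 'simulation result' standing in for an expensive calculation."""
    e = 0.0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            e += 1.0 / np.linalg.norm(positions[i] - positions[j])
    return e

# Toy reference data: random 5-atom carbon clusters and their "computed" energies.
rng = np.random.default_rng(0)
charges = np.full(5, 6.0)
structures = [rng.uniform(0.0, 4.0, size=(5, 3)) for _ in range(300)]
X = np.array([coulomb_eigenvalues(charges, pos, size=8) for pos in structures])
y = np.array([toy_energy(pos) for pos in structures])

# Fit on reference structures, then predict (interpolate) for held-out ones.
model = make_pipeline(StandardScaler(), KernelRidge(kernel="rbf", alpha=1e-6, gamma=0.05))
model.fit(X[:250], y[:250])
mae = np.mean(np.abs(model.predict(X[250:]) - y[250:]))
print(f"held-out MAE: {mae:.4f}")
```

The same pattern, compute a representation once per structure and let a regressor interpolate between reference results, underlies the representations compared in the review.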
Entangling Solid Solutions: Machine Learning of Tensor Networks for Materials Property Prediction
It is argued that architectures based on tensor networks are well-suited to machine learning on Hilbert-space representations of atomic structures, and it is shown that certain standard tensor-network topologies exhibit strong generalizability even on small training datasets while being parametrically efficient.
GPUMD: A package for constructing accurate machine-learned potentials and performing highly efficient atomistic simulations
We present our latest advancements of machine-learned potentials (MLPs) based on the neuroevolution potential (NEP) framework introduced in [Fan et al., Phys. Rev. B 104, 104309 (2021)] and their…
Ab initio machine learning of phase space averages.
This work demonstrates AIML for predicting Boltzmann-averaged structures after training on hundreds of MD trajectories, and contextualizes the findings by comparison to state-of-the-art methods, resulting in a Pareto plot of accuracy versus time for free-energy-of-solvation predictions.
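For context, the Boltzmann averages targeted here are ordinary canonical ensemble averages; the statement below is the textbook definition, not a detail taken from the cited work.

```latex
\langle O \rangle_T
  = \frac{\sum_i O_i \, e^{-E_i / k_\mathrm{B} T}}
         {\sum_i e^{-E_i / k_\mathrm{B} T}}
```

where the sum runs over sampled configurations i with energies E_i (an integral over phase space in the continuous case).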
Cutting Force Prediction of Ti6Al4V using a Machine Learning Model of SPH Orthogonal Cutting Process Simulations
The prediction of machining processes is a challenging task and usually requires a large experimental basis. These experiments are time-consuming and require manufacturing and testing of different…
Predicting tensorial molecular properties with equivariant machine learning models
A scalable equivariant machine-learning model based on local atomic environment descriptors is formulated and applied to a series of molecules; accurate predictions are achieved for a comprehensive list of dielectric and magnetic tensorial properties of different ranks.
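As background, equivariance for a rank-2 tensorial property means that the prediction co-rotates with the input structure; this is the standard condition, not a formula quoted from the cited paper.

```latex
T\!\left(\{R\,\mathbf{r}_i\}\right) = R \, T\!\left(\{\mathbf{r}_i\}\right) R^{\mathsf{T}}
\qquad \text{for all rotations } R \in SO(3)
```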
KLIFF: A framework to develop physics-based and machine learning interatomic potentials
From mechanism-based to data-driven approaches in materials science
A time-honored approach in theoretical materials science revolves around the search for basic mechanisms that should incorporate key features of the phenomenon under investigation. Recent years have…
KLIFF: A framework to develop analytic and machine learning interatomic potentials
The KIM-based learning-integrated fitting framework (KLIFF), a package that facilitates the entire IP development process, is introduced, and its use is demonstrated by fitting an analytic Stillinger–Weber potential and a machine-learning neural network potential for silicon.
Efficient Gaussian process regression for prediction of molecular crystals harmonic free energies
We present a method to accurately predict the Helmholtz harmonic free energies of molecular crystals in high-throughput settings. This is achieved by devising a computationally efficient framework…
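For reference, the harmonic Helmholtz free energy targeted by such models is the standard phonon expression; the formula below is textbook material, not a detail of the cited framework.

```latex
F_\mathrm{harm}(T) = U_0 + \sum_{k}
  \left[ \frac{\hbar\omega_k}{2}
       + k_\mathrm{B} T \,\ln\!\left(1 - e^{-\hbar\omega_k / k_\mathrm{B} T}\right) \right]
```

with U_0 the static lattice energy and ω_k the harmonic vibrational frequencies.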
...
...

References

Fingerprint-Based Detection of Non-Local Effects in the Electronic Structure of a Simple Single Component Covalent System
Using fingerprints employed mainly in machine-learning schemes for the potential energy surface, long-range effects on local physical properties in a simple covalent system of carbon atoms are detected in a fully algorithmic way.
Through the eyes of a descriptor: Constructing complete, invertible descriptions of atomic environments
  • M. Uhrin, Physical Review B (2021)
The combined ability to both decode and make property predictions from a representation that does not need to be learned lays the foundations for a novel way of building generative models that are tasked with solving the inverse problem of predicting atomic arrangements that are statistically likely to have certain desired properties.
Performant implementation of the atomic cluster expansion (PACE) and application to copper and silicon
The atomic cluster expansion is a general polynomial expansion of the atomic energy in multi-atom basis functions. Here we implement the atomic cluster expansion in the performant C++ code PACE that…
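For orientation, the linear atomic cluster expansion writes each atomic energy as a linear combination of body-ordered, symmetry-invariant basis functions; the schematic form below is the generic ACE ansatz, not a statement of PACE implementation details.

```latex
E = \sum_i E_i, \qquad
E_i = \sum_{\nu} c_{\nu} \, B_{i\nu}
```

where the B_{iν} are permutation- and rotation-invariant multi-atom basis functions of the neighborhood of atom i and the c_ν are fitted coefficients.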
Efficient implementation of atom-density representations.
The librascal library is presented; its modular design lends itself both to developing refinements of the density-based formalism and to rapid prototyping of new rotationally equivariant atomistic representations.
Physics-Inspired Structural Representations for Molecules and Materials.
This review summarizes the current understanding of the nature and characteristics of the most commonly used structural and chemical descriptions of atomistic structures, highlighting the deep underlying connections between different frameworks and the ideas that lead to computationally efficient and universally applicable models.
Analytical gradients for molecular-orbital-based machine learning.
The derivation, implementation, and numerical demonstration of MOB-ML analytical nuclear gradients, which are formulated in a general Lagrangian framework to enforce orthogonality, localization, and Brillouin constraints on the molecular orbitals, are presented.
Ab Initio Machine Learning in Chemical Compound Space
While state-of-the-art approximations to quantum problems impose severe computational bottlenecks, recent QML-based developments indicate the possibility of substantial acceleration without sacrificing the predictive power of quantum mechanics.
Machine learning of solvent effects on molecular spectra and reactions
The deep neural network FieldSchNet is introduced for modeling the interaction of molecules with arbitrary external fields; an external environment capable of significantly lowering the activation barrier of the rearrangement reaction is designed, demonstrating promising avenues for inverse chemical design.
Improved accuracy and transferability of molecular-orbital-based machine learning: Organics, transition-metal complexes, non-covalent interactions, and transition states.
It is shown that MOB-ML also works well for extrapolating to transition-state structures, predicting the barrier region for malonaldehyde intramolecular proton-transfer to within 0.35 kcal/mol when only trained on reactant/product-like structures.
A fourth-generation high-dimensional neural network potential with accurate electrostatics including non-local charge transfer
This work introduces a fourth-generation high-dimensional neural network potential that combines a charge-equilibration scheme employing environment-dependent atomic electronegativities with accurate atomic energies, substantially extending the applicability of modern machine-learning potentials.
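For context, charge equilibration assigns atomic charges by minimizing a second-order electrostatic energy subject to total-charge conservation; the expression below is the generic Qeq form with a bare Coulomb kernel for brevity, a sketch rather than the cited paper's exact functional. In the fourth-generation scheme the electronegativities χ_i are environment-dependent, as stated above.

```latex
E_\mathrm{Qeq}(\{q_i\}) = \sum_i \left( \chi_i q_i + \tfrac{1}{2} J_i q_i^2 \right)
  + \tfrac{1}{2} \sum_{i \neq j} \frac{q_i q_j}{r_{ij}},
\qquad \text{minimized over } \{q_i\} \text{ subject to } \sum_i q_i = Q_\mathrm{tot}
```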
...
...