Generalized neural-network representation of high-dimensional potential-energy surfaces.

@article{Behler2007GeneralizedNR,
  title={Generalized neural-network representation of high-dimensional potential-energy surfaces},
  author={J{\"o}rg Behler and Michele Parrinello},
  journal={Physical Review Letters},
  year={2007},
  volume={98},
  number={14},
  pages={146401}
}
The accurate description of chemical processes often requires the use of computationally demanding methods like density-functional theory (DFT), making long simulations of large systems unfeasible. In this Letter we introduce a new kind of neural-network representation of DFT potential-energy surfaces, which provides the energy and forces as a function of all atomic positions in systems of arbitrary size and is several orders of magnitude faster than DFT. The high accuracy of the method is… 
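As a rough, illustrative sketch of the representation described in the abstract (not code from the paper), a Behler-Parrinello-type potential writes the total energy as a sum of atomic contributions, each predicted by a small feed-forward network from a descriptor of that atom's local environment; forces then follow as the negative gradient of this sum with respect to the atomic positions. The descriptor, network sizes, and parameter values below are arbitrary placeholders.

```python
import numpy as np

def cutoff(r, r_c=6.0):
    # Smooth cosine cutoff: 0.5*(cos(pi*r/r_c)+1) inside r_c, zero outside.
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def descriptor(positions, i, eta=0.5, r_s=0.0):
    # Toy one-component radial descriptor of atom i's local environment.
    r_ij = np.delete(np.linalg.norm(positions - positions[i], axis=1), i)
    return np.array([np.sum(np.exp(-eta * (r_ij - r_s) ** 2) * cutoff(r_ij))])

def atomic_energy(g, weights):
    # Tiny feed-forward network mapping the descriptor to an atomic energy.
    w1, b1, w2, b2 = weights
    return float(np.tanh(g @ w1 + b1) @ w2 + b2)

def total_energy(positions, weights):
    # Behler-Parrinello-style ansatz: E_total = sum over atoms i of E_i(G_i).
    return sum(atomic_energy(descriptor(positions, i), weights)
               for i in range(len(positions)))

rng = np.random.default_rng(0)
weights = (rng.normal(size=(1, 5)), np.zeros(5), rng.normal(size=5), 0.0)
positions = rng.normal(size=(3, 3)) * 2.0
print(total_energy(positions, weights))   # untrained weights, so the value is meaningless
```

In the actual method each chemical element has its own network and the descriptor is a vector of many symmetry functions, but it is the additive per-atom structure above that lets the cost grow linearly with system size.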

Atom-centered symmetry functions for constructing high-dimensional neural network potentials.
  • J. Behler
  • Computer Science
    The Journal of Chemical Physics
  • 2011
TLDR
Neural networks offer an unbiased and numerically very accurate approach to represent high-dimensional ab initio potential-energy surfaces, and a transformation to symmetry functions is required to enable molecular dynamics simulations of large systems (one such function is sketched below).
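As an illustrative sketch (mine, not taken from the paper), the snippet below evaluates a radial symmetry function of the G2 type, a sum of distance Gaussians damped by a cosine cutoff, and checks that it is unchanged under rotation, translation, and permutation of the neighbouring atoms. That invariance is what makes such functions suitable network inputs; the parameter values eta, r_s, and r_c are arbitrary placeholders.

```python
import numpy as np

def cutoff(r, r_c=6.0):
    # Cosine cutoff that decays smoothly to zero at r_c.
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def g2(positions, i, eta=0.5, r_s=0.0, r_c=6.0):
    # Radial symmetry function for atom i: sum over neighbours of a
    # distance Gaussian multiplied by the cutoff.
    r_ij = np.delete(np.linalg.norm(positions - positions[i], axis=1), i)
    return np.sum(np.exp(-eta * (r_ij - r_s) ** 2) * cutoff(r_ij, r_c))

rng = np.random.default_rng(1)
pos = rng.normal(size=(5, 3)) * 2.0

# Apply a random rigid rotation (orthogonal matrix from a QR decomposition),
# a translation, and a permutation that keeps atom 0 first: the descriptor
# of atom 0 must not change.
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
perm = np.array([0, 3, 1, 4, 2])
pos_transformed = (pos @ q + 2.0)[perm]

print(g2(pos, 0), g2(pos_transformed, 0))   # equal up to floating-point round-off
```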
High-Dimensional Neural Network Potentials for Atomistic Simulations
High-dimensional neural network potentials, proposed by Behler and Parrinello in 2007, have become an established method to calculate potential energy surfaces with first-principles accuracy…
Construction of high-dimensional neural network potentials using environment-dependent atom pairs.
TLDR
This work presents an implementation of an NN method based on atom pairs, and its accuracy and performance are compared to the atom-based NN approach, with the pair-based method yielding a slightly higher accuracy, making it a competitive alternative for addressing complex systems in MD simulations.
Deep Potential: a general representation of a many-body potential energy surface
TLDR
Deep Potential is able to reproduce the original model, whether empirical or quantum mechanics based, within chemical accuracy, and the computational cost of this new model is not substantially larger than that of empirical force fields.
Representing molecule-surface interactions with symmetry-adapted neural networks.
TLDR
This work builds the NN on a new type of symmetry function, which makes it possible to take the symmetry of the surface exactly into account, and is illustrated by an application to a six-dimensional PES describing the interaction of oxygen molecules with the Al(111) surface.
Neural network potential-energy surfaces in chemistry: a tool for large-scale simulations.
  • J. Behler
  • Physics
    Physical Chemistry Chemical Physics (PCCP)
  • 2011
TLDR
In this Perspective, the current status of NN potentials is reviewed, and their advantages and limitations are discussed.
A novel approach to describe chemical environments in high-dimensional neural network potentials.
TLDR
A set of invariant, orthogonal, and differentiable descriptors for an atomic environment is proposed, implemented in a neural network potential for solid-state silicon, and tested in molecular dynamics simulations.
Representing potential energy surfaces by high-dimensional neural network potentials.
  • J. Behler
  • Materials Science
    Journal of Physics: Condensed Matter
  • 2014
TLDR
The basic methodology of high-dimensional NNPs will be presented with a special focus on the scope and the remaining limitations of this approach, e.g. for addressing problems in materials science, for investigating properties of interfaces, and for studying solvation processes.
High-dimensional potential energy surfaces for molecular simulations: from empiricism to machine learning
TLDR
An overview of computational methods to describe high-dimensional potential energy surfaces suitable for atomistic simulations is given, including empirical force fields, representations based on reproducing kernels, using permutationally invariant polynomials, and neural network-learned representations and combinations thereof.
...
