Corpus ID: 15927758

Learning Potential Energy Landscapes using Graph Kernels

@article{Ferr2016LearningPE,
  title={Learning Potential Energy Landscapes using Graph Kernels},
  author={Gr{\'e}goire Ferr{\'e} and Terry S. Haut and Kipton Marcos Barros},
  journal={ArXiv},
  year={2016},
  volume={abs/1612.00193}
}
Recent machine learning methods make it possible to model the potential energy of atomic configurations with chemical-level accuracy (as calculated by ab initio methods) and at speeds suitable for molecular dynamics simulation. Best performance is achieved when the known physical constraints are encoded in the machine learning models. For example, the atomic energy is invariant under global translations and rotations; it is also invariant to permutations of same-species atoms. Although… 
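As an illustration of the invariances mentioned in the abstract, the sketch below is a minimal Python example, assuming a per-species-pair distance-histogram descriptor and a squared-exponential kernel; it is not the graph-kernel construction of the paper. It shows one way to build a translation-, rotation-, and permutation-invariant representation of an atomic configuration and to regress energies on it with kernel ridge regression.

```python
# Illustrative sketch (not the paper's method): an invariant descriptor of an
# atomic configuration plus kernel ridge regression of configuration energies.
import numpy as np
from itertools import combinations_with_replacement

def pair_distance_histogram(positions, species, r_max=6.0, n_bins=32):
    """Histogram of interatomic distances for each species pair.

    Distances are invariant to global translations and rotations; histogramming
    (rather than listing atoms in order) makes the descriptor invariant to
    permutations of same-species atoms.
    """
    positions = np.asarray(positions, dtype=float)
    species = np.asarray(species)
    pairs = sorted(set(combinations_with_replacement(sorted(set(species)), 2)))
    features = []
    for a, b in pairs:
        idx_a = np.flatnonzero(species == a)
        idx_b = np.flatnonzero(species == b)
        d = np.linalg.norm(positions[idx_a, None, :] - positions[None, idx_b, :], axis=-1)
        if a == b:
            d = d[np.triu_indices_from(d, k=1)]  # unique pairs, drop self-distances
        else:
            d = d.ravel()
        hist, _ = np.histogram(d, bins=n_bins, range=(0.0, r_max))
        features.append(hist.astype(float))
    return np.concatenate(features)

def gaussian_kernel(X, Y, sigma=1.0):
    """Squared-exponential kernel between rows of two descriptor matrices."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-sq / (2.0 * sigma**2))

def fit_energies(descriptors, energies, sigma=1.0, reg=1e-6):
    """Kernel ridge regression: solve (K + reg*I) alpha = E for the weights."""
    K = gaussian_kernel(descriptors, descriptors, sigma)
    return np.linalg.solve(K + reg * np.eye(len(K)), energies)

def predict_energy(new_descriptor, train_descriptors, alpha, sigma=1.0):
    """Predicted energy of a new configuration as a kernel-weighted sum."""
    k = gaussian_kernel(new_descriptor[None, :], train_descriptors, sigma)
    return float(k @ alpha)
```

The histogram step is the simplest way to obtain permutation invariance; richer descriptors (and the graph kernels studied in the paper) refine this idea while preserving the same symmetries.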


Towards exact molecular dynamics simulations with machine-learned force fields
TLDR
A flexible machine-learning force field with high-level accuracy for molecular dynamics simulations of flexible molecules with up to a few dozen atoms is developed, and insights into the dynamical behavior of these molecules are provided.

References

SHOWING 1-10 OF 49 REFERENCES
Learning over Molecules: Representations and Kernels
TLDR
On a subset of the Harvard Clean Energy Project (CEP) database, a simple fingerprint similarity kernel is found to be the fastest and most accurate for predicting HOMO-LUMO energy gap values.
Gaussian Processes for Machine Learning
TLDR
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
TLDR
This book is an excellent choice for readers who wish to familiarize themselves with computational intelligence techniques or for an overview/introductory course in the field of computational intelligence.
Learning Theory and Kernel Machines
  • B. Schölkopf
  • Computer Science
    Lecture Notes in Computer Science
  • 2003
TLDR
The tightest game-theoretic solution concept to which Φ-no-regret algorithms (provably) converge is correlated equilibrium, and Nash equilibrium is not a necessary outcome of learning via any Φ-no-regret learning algorithm.
Proceedings of the 24th international conference on Machine learning
This volume contains the papers accepted to the 24th International Conference on Machine Learning (ICML 2007), which was held at Oregon State University in Corvallis, Oregon, from June 20th to 24th, 2007.
The Elements of Statistical Learning: Data Mining, Inference, and Prediction
This section will review those books whose content and level reflect the general editorial policy of Technometrics. Publishers should send books for review to Ejaz Ahmed, Department of Mathematics and
Advances in Social Network Analysis and Mining
Committee: Reda Alhajj, Talel Abdessalem, Hamid AL-Naimiy, Frans Stokman, Rosa Benito, Noshir Contractor, Jon Rokne, Ebrahim Bagheri, Fuzheng Zhang, Jiabin Zhao, Piotr Brodka, I-Hsien Ting, Ingmar
Learning with kernels
TLDR
This book provides an introduction to support vector machines and related kernel methods for learning from data.
J. Behler and M. Parrinello
  • Phys. Rev. Lett. 98
  • 2007