MLMOD Package: Machine Learning Methods for Data-Driven Modeling in LAMMPS

  • Paul J. Atzberger
  • Published 2021
  • Computer Science, Physics, Mathematics
  • arXiv
We discuss a software package for incorporating models obtained from machine learning methods into simulations. These models can be used for (i) modeling dynamics and time-step integration, (ii) modeling interactions between system components, and (iii) computing quantities of interest characterizing system state. The package allows for the use of machine learning methods with general model classes, including Neural Networks, Gaussian Process Regression, Kernel Models, and other approaches. We discuss in…
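As a minimal sketch of what one of these learned model classes looks like in isolation (Gaussian process regression with a squared-exponential kernel; the data, parameter values, and function names below are illustrative and not part of the MLMOD API), consider fitting an interaction-force profile from noisy samples:

```python
import numpy as np

def rbf_kernel(x1, x2, length=0.3, variance=1.0):
    """Squared-exponential kernel matrix between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

def gpr_predict(x_train, y_train, x_test, noise=1e-3):
    """Posterior mean of standard GP regression at the test inputs."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)
    return K_s @ np.linalg.solve(K, y_train)

# Illustrative training data: noisy samples of a smooth repulsion profile 1/r^2
rng = np.random.default_rng(0)
r_train = np.linspace(1.0, 2.5, 50)
f_obs = 1.0 / r_train**2 + 0.02 * rng.standard_normal(r_train.shape)

# Predict the force profile at new separations
r_test = np.linspace(1.1, 2.4, 15)
f_pred = gpr_predict(r_train, f_obs, r_test)
```

In a workflow of the kind the package describes, a model fitted this way would then be evaluated inside the simulation's time-stepping loop, for example to supply interaction forces at each step.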

References
Discovering governing equations from data by sparse identification of nonlinear dynamical systems
This work develops a framework for discovering the governing equations of a dynamical system directly from data measurements, leveraging advances in sparsity techniques and machine learning, using sparse regression to determine the fewest terms in the governing equations required to accurately represent the data.
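The sparse-regression step at the heart of this approach can be sketched in a few lines (a toy reconstruction on assumed data, not the authors' code): sequentially thresholded least squares selects the fewest candidate-library terms that fit the measured derivatives.

```python
import numpy as np

def stlsq(Theta, dX, threshold=0.1, iters=10):
    """Sequentially thresholded least squares: the sparse regression step of SINDy."""
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for j in range(dX.shape[1]):
            big = ~small[:, j]
            if big.any():
                Xi[big, j] = np.linalg.lstsq(Theta[:, big], dX[:, j], rcond=None)[0]
    return Xi

# Illustrative data: noisy samples of the cubic system dx/dt = x - x^3
rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, size=200)
dx = x - x**3 + 0.01 * rng.standard_normal(x.shape)

# Candidate library of terms: [1, x, x^2, x^3]
Theta = np.column_stack([np.ones_like(x), x, x**2, x**3])
Xi = stlsq(Theta, dx[:, None])
print(Xi.ravel())  # roughly [0, 1, 0, -1] for this data
```

The thresholding zeroes out the spurious constant and quadratic terms, recovering the two-term governing equation from the four-term library.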
Auto-Encoding Variational Bayes
Introduces a stochastic variational inference and learning algorithm that scales to large datasets and, under mild differentiability conditions, works even in the intractable case.
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
Chapters 2–7 make up Part II of the book: artificial neural networks. After introducing the basic concepts of neurons and artificial neuron learning rules in Chapter 2, Chapter 3 describes a…
Importance of the Mathematical Foundations of Machine Learning Methods for Scientific and Engineering Applications
There is a strong need for further mathematical developments on the foundations of machine learning methods to increase the level of rigor of employed methods and to ensure more reliable and interpretable results.
Reducing the Dimensionality of Data with Neural Networks
This work describes an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool to reduce the dimensionality of data.
Variational Inference: A Review for Statisticians
This review covers variational inference (VI), a method from machine learning that approximates probability densities through optimization, and derives a variant that uses stochastic optimization to scale to massive data.
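A hedged toy sketch of this optimization view: here the target density is itself a tractable Gaussian (unlike realistic VI settings, where the posterior is intractable and one maximizes the ELBO instead), so KL(q || p) and its gradients are available in closed form and plain gradient descent recovers the target's parameters.

```python
# Target density p = N(mu, sigma^2); variational family q = N(m, s^2).
mu, sigma = 2.0, 0.5

# For two Gaussians, KL(q || p) = log(sigma/s) + (s^2 + (m - mu)^2)/(2 sigma^2) - 1/2,
# so its gradients in the variational parameters (m, s) are exact.
m, s = 0.0, 1.0
lr = 0.05
for _ in range(500):
    grad_m = (m - mu) / sigma**2        # d KL / d m
    grad_s = -1.0 / s + s / sigma**2    # d KL / d s
    m -= lr * grad_m
    s -= lr * grad_s
```

After optimization, (m, s) converge to the target's (mu, sigma); in real applications the same descent is driven by stochastic gradient estimates of the ELBO rather than a closed-form KL.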
The Elements of Statistical Learning
  • E. Ziegel
  • Computer Science, Mathematics
  • Technometrics
  • 2003
Chapter 11 includes more case studies in other areas, ranging from manufacturing to marketing research, and a detailed comparison with other diagnostic tools, such as logistic regression and tree-based methods.
Distilling Free-Form Natural Laws from Experimental Data
This work proposes a principle for identifying nontrivial relations and demonstrates the approach by automatically searching motion-tracking data captured from physical systems ranging from simple harmonic oscillators to chaotic double pendula, discovering Hamiltonians, Lagrangians, and other laws of geometric and momentum conservation.
CHARMM: A program for macromolecular energy, minimization, and dynamics calculations
CHARMM (Chemistry at HARvard Macromolecular Mechanics) is a highly flexible computer program which uses empirical energy functions to model macromolecular systems. The program can read or model build…
The computational future for climate and Earth system models: on the path to petaflop and beyond
  • W. Washington, Lawrence Buja, A. Craig
  • Environmental Science, Medicine
  • Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2008
Some of the climate model history will be presented, along with some of the successes and difficulties encountered with present-day supercomputer systems.