Corpus ID: 237532722

Differentiable Physics: A Position Piece

Bharath Ramsundar, Dilip Krishnamurthy, Venkatasubramanian Viswanathan
Differentiable physics provides a new approach to modeling and understanding physical systems by pairing the emerging technology of differentiable programming with classical numerical methods for physical simulation. We survey the rapidly growing literature on differentiable physics techniques and highlight methods for parameter estimation, learning representations, solving differential equations, and developing what we call scientific foundation models using data and inductive priors. We argue…
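To make the parameter-estimation idea concrete, here is a minimal sketch (not from the paper; all names and constants are illustrative) of differentiable programming applied to a toy simulator: a forward-mode dual number is pushed through an explicit Euler integration of x' = -kx, and the resulting end-to-end gradient is used to recover the decay rate k from a synthetic observation.

```python
class Dual:
    """Forward-mode autodiff scalar: a value plus its derivative (tangent)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(float(o))
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._lift(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def simulate(k, x0=1.0, dt=0.01, steps=100):
    """Explicit Euler for x' = -k*x; differentiable in k via Dual."""
    x = Dual(x0)
    for _ in range(steps):
        x = x - dt * k * x
    return x

# Synthetic observation generated from the "true" parameter k = 2.0.
target = simulate(Dual(2.0)).val

def loss_and_grad(k):
    xT = simulate(Dual(k, 1.0))           # seed dk/dk = 1
    resid = xT.val - target
    return resid * resid, 2.0 * resid * xT.dot

# Gradient descent through the simulator recovers k ~ 2.0.
k = 1.5
for _ in range(200):
    _, g = loss_and_grad(k)
    k -= 25.0 * g
```

In practice a reverse-mode framework such as JAX or PyTorch replaces the hand-rolled Dual class, but the structure — simulate, compare to data, differentiate, update — is the same.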

The imperative of physics-based modeling and inverse theory in computational science
To best learn from data about large-scale complex systems, physics-based models representing the laws of nature must be integrated into the learning process. Inverse theory provides a crucial…
Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
We introduce physics-informed neural networks: neural networks trained to solve supervised learning tasks while respecting any given laws of physics described by general nonlinear partial differential equations…
Machine learning–accelerated computational fluid dynamics
It is shown that using machine learning inside traditional fluid simulations can improve both accuracy and speed, even on examples very different from the training data, opening the door to applying machine learning to large-scale physical modeling tasks like airplane design and climate prediction.
Differentiable thermodynamic modeling
A new framework of thermodynamic modeling is proposed by introducing the concept of differentiable programming, where all the thermodynamic observables, including both thermochemical quantities and…
Simulating Continuum Mechanics with Multi-Scale Graph Neural Networks
MultiScaleGNN is a novel multi-scale graph neural network for learning to infer unsteady continuum mechanics; it generalises from uniform advection fields to high-gradient fields on complex domains at test time and infers long-term Navier-Stokes solutions across a range of Reynolds numbers.
Fourier Neural Operator for Parametric Partial Differential Equations
This work formulates a new neural operator by parameterizing the integral kernel directly in Fourier space, yielding an expressive and efficient architecture that shows state-of-the-art performance compared to existing neural network methodologies.
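The core mechanism can be sketched in a few lines (a hedged illustration, not the paper's implementation): transform a signal to Fourier space, scale a truncated set of low-frequency modes by learnable complex weights, zero the rest, and transform back. A naive O(n²) DFT stands in for the FFT, and the per-mode weights here are supplied rather than learned.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform of a real 1-D signal."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                for j in range(n)) for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * j / n)
                for k in range(n)).real / n for j in range(n)]

def spectral_layer(x, weights, modes):
    """FNO-style spectral convolution on a 1-D signal: keep the lowest
    `modes` frequencies, scale each by a weight, zero out the rest,
    and return to physical space."""
    X = dft(x)
    n = len(X)
    out = [0j] * n
    for k in range(modes):
        out[k] = weights[k] * X[k]
        if k != 0:
            # Mirror with the conjugate weight to keep the output real.
            out[n - k] = weights[k].conjugate() * X[n - k]
    return idft(out)

# With unit weights and all n//2 + 1 modes kept, the layer is the identity.
x = [0.1, 0.5, -0.2, 0.9, 0.0, -0.7, 0.3, 0.4]
y = spectral_layer(x, [1 + 0j] * 5, 5)
```

Keeping fewer modes (e.g. `spectral_layer(x, [1 + 0j] * 2, 2)`) low-pass filters the signal, which is exactly the resolution-independent smoothing the operator exploits.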
Hamiltonian Neural Networks
Drawing inspiration from Hamiltonian mechanics, models are trained to learn and respect exact conservation laws in an unsupervised manner; the resulting model trains faster and generalizes better than a regular neural network.
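The structural idea — derive dynamics from a scalar H(q, p) via q' = ∂H/∂p, p' = -∂H/∂q so that conservation is built in — can be illustrated without any learning (an illustrative sketch: the paper's H is a neural network, here it is the known harmonic-oscillator Hamiltonian). A symplectic integrator keeps the energy error bounded, while naive Euler integration lets it grow without limit.

```python
def dHdq(q, p):   # partial of H = (p**2 + q**2) / 2 w.r.t. q
    return q

def dHdp(q, p):   # partial of H w.r.t. p
    return p

def leapfrog(q, p, dt, steps):
    """Symplectic (kick-drift-kick) integration of Hamilton's equations."""
    for _ in range(steps):
        p -= 0.5 * dt * dHdq(q, p)
        q += dt * dHdp(q, p)
        p -= 0.5 * dt * dHdq(q, p)
    return q, p

def euler(q, p, dt, steps):
    """Naive forward Euler; energy drifts without bound."""
    for _ in range(steps):
        q, p = q + dt * dHdp(q, p), p - dt * dHdq(q, p)
    return q, p

H = lambda q, p: 0.5 * (p * p + q * q)
q0, p0, dt, steps = 1.0, 0.0, 0.1, 1000
drift_leapfrog = abs(H(*leapfrog(q0, p0, dt, steps)) - H(q0, p0))
drift_euler = abs(H(*euler(q0, p0, dt, steps)) - H(q0, p0))
```

A Hamiltonian Neural Network replaces `dHdq`/`dHdp` with automatic derivatives of a learned H; the integrator structure is unchanged.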
DiffTaichi: Differentiable Programming for Physical Simulation
We present DiffTaichi, a new differentiable programming language tailored for building high-performance differentiable physical simulators. Based on an imperative programming language, DiffTaichi…
Solving high-dimensional partial differential equations using deep learning
A deep learning-based approach handles general high-dimensional parabolic PDEs by reformulating them as backward stochastic differential equations; the gradient of the unknown solution is approximated by neural networks, much in the spirit of deep reinforcement learning, with the gradient playing the role of the policy function.
Physics Guided RNNs for Modeling Dynamical Systems: A Case Study in Simulating Lake Temperature Profiles
It is shown that a PGRNN can improve prediction accuracy over that of physical models while generating outputs consistent with physical laws and achieving good generalizability.