# Using Machine Learning to Augment Coarse-Grid Computational Fluid Dynamics Simulations

@article{Pathak2020UsingML,
  title={Using Machine Learning to Augment Coarse-Grid Computational Fluid Dynamics Simulations},
  author={Jaideep Pathak and Mustafa Mustafa and Karthik Kashinath and Emmanuel Motheau and Thorsten Kurth and Marcus S. Day},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.00072}
}

Simulation of turbulent flows at high Reynolds number is a computationally challenging task relevant to a large number of engineering and scientific applications in diverse fields such as climate science, aerodynamics, and combustion. Turbulent flows are typically modeled by the Navier-Stokes equations. Direct Numerical Simulation (DNS) of the Navier-Stokes equations with sufficient numerical resolution to capture all the relevant scales of the turbulent motions can be prohibitively expensive…
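The abstract describes augmenting a cheap coarse-grid solver with a learned correction applied to the coarse state. A minimal sketch of that pattern, using a toy 1D viscous Burgers step as the coarse solver and a placeholder linear "network" (the stencil `weights` and all function names here are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def coarse_step(u, dt=0.01, nu=0.001):
    """One explicit step of 1D viscous Burgers on a coarse periodic grid
    (central differences; an illustrative stand-in for a CFD solver)."""
    dx = 1.0 / u.size
    dudx = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    d2udx2 = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    return u + dt * (-u * dudx + nu * d2udx2)

def learned_correction(u, weights):
    """Hypothetical stand-in for a trained network: a local linear
    stencil applied to the coarse state."""
    return sum(w * np.roll(u, k) for k, w in weights.items())

def augmented_step(u, weights):
    """Advance the coarse solver, then add the learned correction."""
    u_next = coarse_step(u)
    return u_next + learned_correction(u_next, weights)

u0 = np.sin(2 * np.pi * np.linspace(0, 1, 64, endpoint=False))
weights = {-1: 0.0, 0: 0.0, 1: 0.0}  # untrained: correction is zero
u1 = augmented_step(u0, weights)
```

In the actual paper the correction is a trained neural network rather than a linear stencil; the structure of the hybrid step (cheap solver, then learned update) is the point of the sketch.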

## 29 Citations

### Machine learning–accelerated computational fluid dynamics

- Computer Science · Proceedings of the National Academy of Sciences
- 2021

It is shown that using machine learning inside traditional fluid simulations can improve both accuracy and speed, even on examples very different from the training data, which opens the door to applying machine learning to large-scale physical modeling tasks like airplane design and climate prediction.

### Learned discretizations for passive scalar advection in a two-dimensional turbulent flow

- Computer Science
- 2020

A machine learning approach is used to learn a numerical discretization that retains high accuracy even when the solution is under-resolved with classical methods to solve passive scalar advection in a two-dimensional turbulent flow.

### Learned Coarse Models for Efficient Turbulence Simulation

- Computer Science · ArXiv
- 2021

Broadly, the proposed model can simulate turbulent dynamics more accurately than classical numerical solvers at comparably low resolutions across various scientifically relevant metrics.

### Multiscale Neural Operator: Learning Fast and Grid-independent PDE Solvers

- Computer Science · ArXiv
- 2022

This work proposes a hybrid, flexible surrogate model that exploits known physics for simulating large-scale dynamics and limits learning to the hard-to-model term, which is called parametrization or closure and captures the effect of fine- onto large-scale dynamics.

### Machine learning accelerated particle-in-cell plasma simulations

- Computer Science, Physics
- 2021

This work investigates how amortized solvers can be incorporated with PIC methods for simulations of plasmas and finds that this approach reduces the average number of required solver iterations by about 25% when simulating electron plasma oscillations.
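The iteration saving described here comes from warm-starting an iterative field solve with a learned initial guess instead of a cold start. A minimal sketch, assuming a Jacobi solve of a 1D Poisson system and mocking the learned predictor as a small perturbation of the true solution (none of this is from the cited paper's code):

```python
import numpy as np

def jacobi_solve(b, x0, tol=1e-8, max_iter=200000):
    """Jacobi iteration for the 1D Poisson system A x = b, where
    A = tridiag(-1, 2, -1) with zero Dirichlet boundaries.
    Returns the solution and the number of iterations used."""
    x = x0.copy()
    for k in range(1, max_iter + 1):
        xp = np.pad(x, 1)                     # zero boundary values
        x_new = 0.5 * (xp[:-2] + xp[2:] + b)  # Jacobi update
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, k
        x = x_new
    return x, max_iter

n = 64
rng = np.random.default_rng(0)
x_true = rng.standard_normal(n)
xp = np.pad(x_true, 1)
b = 2.0 * x_true - xp[:-2] - xp[2:]           # b = A @ x_true

# Cold start: zero initial guess.
_, iters_cold = jacobi_solve(b, np.zeros(n))

# "Amortized" start: a mock learned predictor, here simply a small
# perturbation of the true solution standing in for a trained network.
guess = x_true + 0.01 * rng.standard_normal(n)
_, iters_warm = jacobi_solve(b, guess)
```

The warm start converges in fewer iterations because the initial residual is already small; in the cited work the guess comes from a network amortized over previous PIC solves rather than from the known answer.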

### Learning General-Purpose CNN-Based Simulators for Astrophysical Turbulence

- Computer Science
- 2021

It is found that the learned models outperform coarsened solvers on certain metrics, particularly in their ability to preserve high-frequency information at low resolution, and describe ways to improve generalization beyond the training distribution.

### Stable a posteriori LES of 2D turbulence using convolutional neural networks: Backscattering analysis and generalization to higher Re via transfer learning

- Computer Science · J. Comput. Phys.
- 2022

### Meta-PDE: Learning to Solve PDEs Quickly Without a Mesh

- Computer Science · ArXiv
- 2022

A meta-learning based method which learns to rapidly solve problems from a distribution of related PDEs; it can be trained without supervision from expensive ground-truth data, does not require a mesh, and can even be used when the geometry and topology vary between tasks.

### TNT: Vision Transformer for Turbulence Simulations

- Computer Science
- 2022

The Turbulence Neural Transformer (TNT) is a learned simulator based on the transformer architecture that predicts turbulent dynamics on coarsened grids; TNT is shown to outperform the state-of-the-art U-net simulator on several metrics.

### Error-Correcting Neural Networks for Semi-Lagrangian Advection in the Level-Set Method

- Computer Science · J. Comput. Phys.
- 2022
