Corpus ID: 244920615

Emulating Spatio-Temporal Realizations of Three-Dimensional Isotropic Turbulence via Deep Sequence Learning Models

Mohammadreza Momenifar, Enmao Diao, Vahid Tarokh, Andrew D. Bragg
We use a data-driven approach to model a three-dimensional turbulent flow using cutting-edge Deep Learning techniques. The deep learning framework incorporates physical constraints on the flow, such as preserving incompressibility and the global statistical invariants of the velocity gradient tensor. The accuracy of the model is assessed using statistical and physics-based metrics. The data set comes from a Direct Numerical Simulation of an incompressible, statistically stationary, isotropic turbulent… 

Dimension reduced turbulent flow data from deep vector quantisers
A physics-informed deep learning technique based on vector quantisation is applied to generate a discrete, low-dimensional representation of data from simulations of three-dimensional turbulent flows, offering an attractive solution where fast, high-quality, low-overhead encoding and decoding of large data sets are required.
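The core of a vector quantiser of the kind this abstract describes is a nearest-neighbour assignment of continuous latent vectors to a learned codebook, so each latent is stored as a single integer index. A minimal sketch of that assignment step, assuming a generic VQ-VAE-style setup (the function names and NumPy formulation are illustrative, not the paper's code):

```python
import numpy as np

def vq_encode(latents, codebook):
    """Map each latent vector (N, d) to the index of its nearest
    codebook entry (K, d) -- the discrete representation."""
    d2 = ((latents[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return d2.argmin(axis=1)

def vq_decode(indices, codebook):
    """Recover quantised latents from the stored integer indices."""
    return codebook[indices]
```

Storing indices instead of floating-point latents is what gives the low-overhead compression: each vector of dimension d collapses to one integer in [0, K).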
Wavelet-Powered Neural Networks for Turbulence
This paper takes advantage of the balance between the strong mathematical foundations and the physical interpretability of wavelet theory to build a spatio-temporally reduced dynamical map that fuses wavelet-based spatial decomposition with spatio-temporal modelling based on the Convolutional Long Short-Term Memory (ConvLSTM) architecture.
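The wavelet-based spatial decomposition referred to above splits a field into a coarse approximation plus detail subbands at each level. A single-level 2D Haar split is the simplest instance and can be written in a few lines of NumPy; this is a generic illustration of the idea, not the specific wavelet family or implementation used in the paper.

```python
import numpy as np

def haar_decompose_2d(field):
    """One-level 2D Haar split of a field with even side lengths into
    approximation (LL) and detail (LH, HL, HH) subbands."""
    a, b = field[0::2, :], field[1::2, :]
    lo, hi = (a + b) / 2.0, (a - b) / 2.0            # row pass
    ll = (lo[:, 0::2] + lo[:, 1::2]) / 2.0           # column pass
    lh = (lo[:, 0::2] - lo[:, 1::2]) / 2.0
    hl = (hi[:, 0::2] + hi[:, 1::2]) / 2.0
    hh = (hi[:, 0::2] - hi[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def haar_reconstruct_2d(ll, lh, hl, hh):
    """Invert one Haar level (exact for this normalisation)."""
    lo = np.empty((ll.shape[0], 2 * ll.shape[1]))
    hi = np.empty_like(lo)
    lo[:, 0::2], lo[:, 1::2] = ll + lh, ll - lh
    hi[:, 0::2], hi[:, 1::2] = hl + hh, hl - hh
    out = np.empty((2 * ll.shape[0], lo.shape[1]))
    out[0::2], out[1::2] = lo + hi, lo - hi
    return out
```

In a wavelet-ConvLSTM hybrid, the temporal model would then operate on the subbands (typically the coarse LL channel plus selected details) rather than on the full-resolution field.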
Compressed Convolutional LSTM: An Efficient Deep Learning framework to Model High Fidelity 3D Turbulence
High-fidelity modeling of turbulent flows is one of the major challenges in computational physics, with diverse applications in engineering, earth sciences and astrophysics, among many others.
Towards Physics-informed Deep Learning for Turbulent Flow Prediction
This paper proposes a hybrid approach that predicts turbulent flow by learning its highly nonlinear dynamics from spatiotemporal velocity fields of large-scale fluid-flow simulations relevant to turbulence and climate modeling, marrying two well-established turbulent-flow simulation techniques with deep learning.
Subgrid modelling for two-dimensional turbulence using neural networks
The proposed methodology successfully establishes a mapping from inputs given by stencils of the vorticity and the streamfunction, along with information from two well-known eddy-viscosity kernels; this represents a promising step towards a formal framework for generating heuristic-free turbulence closures from data.
Super-resolution reconstruction of turbulent flows with machine learning
We use machine learning to perform super-resolution analysis of grossly under-resolved turbulent flow-field data to reconstruct the high-resolution flow field.
Reynolds averaged turbulence modelling using deep neural networks with embedded invariance
This paper presents a method of using deep neural networks to learn a model for the Reynolds stress anisotropy tensor from high-fidelity simulation data, and proposes a novel neural network architecture that uses a multiplicative layer with an invariant tensor basis to embed Galilean invariance into the predicted anisotropy tensor.
Deep learning for in situ data compression of large turbulent flow simulations
A deep learning approach to in situ compression using a novel autoencoder architecture customized for three-dimensional turbulent flows is examined; improved compression and reconstruction are demonstrated, particularly with respect to important statistical quantities.
Fourier Neural Operator for Parametric Partial Differential Equations
This work formulates a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture, and shows state-of-the-art performance compared to existing neural network methodologies.
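The Fourier-space kernel parameterization at the heart of this operator amounts to: transform the input, multiply a limited number of low-frequency modes by learned complex weights, zero the rest, and transform back. A minimal 1D sketch of one such spectral layer (untrained weights; the function name and NumPy formulation are illustrative assumptions, not the authors' code):

```python
import numpy as np

def fourier_layer(x, weights, modes):
    """One spectral-convolution layer in the FNO style.
    x: (n,) real signal; weights: (modes,) complex learned multipliers.
    Frequencies beyond `modes` are truncated to zero."""
    xh = np.fft.rfft(x)              # forward transform
    out_h = np.zeros_like(xh)
    out_h[:modes] = xh[:modes] * weights   # pointwise multiply kept modes
    return np.fft.irfft(out_h, n=x.shape[0])  # back to physical space
```

Because the multiplication is pointwise per retained frequency, the layer's cost is dominated by the FFTs, and the parameter count depends on `modes` rather than on the grid resolution.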
Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting
This paper proposes the convolutional LSTM (ConvLSTM) and uses it to build an end-to-end trainable model for the precipitation nowcasting problem, and shows that it captures spatiotemporal correlations better and consistently outperforms FC-LSTM and the state-of-the-art operational ROVER algorithm.
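A ConvLSTM replaces the fully connected gate transforms of an ordinary LSTM with convolutions, so the hidden and cell states stay spatial maps. The single-channel, bias-free cell step below is a minimal sketch of that idea in NumPy (the helper names and 3x3 kernel choice are assumptions for illustration):

```python
import numpy as np

def conv2d_same(x, k):
    """'Same'-size 2D cross-correlation with a 3x3 kernel."""
    p = np.pad(x, 1)
    return sum(k[i, j] * p[i:i + x.shape[0], j:j + x.shape[1]]
               for i in range(3) for j in range(3))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def convlstm_step(x, h, c, Wx, Wh):
    """One ConvLSTM update: all four gates are convolutions over the
    input frame x and previous hidden state h (single channel, no biases).
    Wx, Wh: dicts of 3x3 kernels keyed by gate name 'i', 'f', 'o', 'g'."""
    gate = lambda g: conv2d_same(x, Wx[g]) + conv2d_same(h, Wh[g])
    i, f, o = sigmoid(gate("i")), sigmoid(gate("f")), sigmoid(gate("o"))
    g = np.tanh(gate("g"))                 # candidate memory content
    c_new = f * c + i * g                  # forget old memory, write new
    h_new = o * np.tanh(c_new)             # gated output / next hidden map
    return h_new, c_new
```

Because every gate is a local convolution, each grid point's update depends only on its spatial neighbourhood, which is what lets the cell track moving structures across frames.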