Structured Deep Kernel Networks for Data-Driven Closure Terms of Turbulent Flows

Tizian Wenzel, Marius Kurz, Andrea D. Beck, Gabriele Santin, Bernard Haasdonk
Standard kernel methods for machine learning usually struggle when dealing with large datasets. We review the recently introduced Structured Deep Kernel Network (SDKN) approach, which is capable of dealing with high-dimensional and very large datasets and enjoys the approximation properties typical of standard machine learning approaches. We extend the SDKN to combine it with standard machine learning modules and compare it with neural networks on the scientific challenge of data-driven prediction of closure terms of…

Universality and Optimality of Structured Deep Kernel Networks

A recent deep kernel representer theorem is leveraged to connect the two approaches and understand their interplay, showing that the use of special types of kernels yields models reminiscent of neural networks that are founded in the same theoretical framework as classical kernel methods, while enjoying many computational properties of deep neural networks.

Symplectic Model Reduction of Hamiltonian Systems on Nonlinear Manifolds

This work provides a novel projection technique called symplectic manifold Galerkin (SMG), which projects the Hamiltonian system onto a nonlinear symplectic trial manifold such that the reduced model is again a Hamiltonian system, and derives analytical results such as stability, energy preservation, and a rigorous a-posteriori error bound.

A representer theorem for deep kernel learning

This paper provides a representer theorem for the concatenation of (linear combinations of) kernel functions of reproducing kernel Hilbert spaces and shows how concatenated machine learning problems can be reformulated as neural networks and how this result applies to a broad class of state-of-the-art deep learning methods.

A machine learning framework for LES closure terms

The present study can be seen as a starting point for the investigation of data-based modeling approaches for LES, which not only include the physical closure terms, but account for the discretization effects in implicitly filtered LES as well.

A perspective on machine learning methods in turbulence modeling

This article surveys current data-driven model concepts and methods, highlights important developments, and puts them into the context of the discussed challenges, mostly from the perspective of large eddy simulation and related techniques.

Kernel Methods for Surrogate Modeling

This chapter deals with kernel methods as a special class of techniques for surrogate modeling. These methods are meshless, i.e., they do not require or depend on a grid, and hence are less prone to the curse of dimensionality, even for high-dimensional problems.
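As an illustration of the meshless nature of kernel methods, here is a minimal sketch of Gaussian-kernel interpolation on scattered 2D data. This is a generic textbook construction, not tied to the chapter's specific algorithms; the function names, the shape parameter `eps`, and the regularization value are illustrative choices:

```python
import numpy as np

def gaussian_kernel(x, y, eps=4.0):
    # Gaussian RBF kernel: k(x, y) = exp(-(eps * ||x - y||)^2)
    d = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    return np.exp(-(eps * d) ** 2)

def kernel_interpolant(X, f, eps=4.0, reg=1e-10):
    # Solve (K + reg*I) alpha = f for the coefficients; the small
    # regularization guards against an ill-conditioned kernel matrix.
    K = gaussian_kernel(X, X, eps)
    alpha = np.linalg.solve(K + reg * np.eye(len(X)), f)
    # The surrogate is a linear combination of kernels centered at X.
    return lambda Xq: gaussian_kernel(Xq, X, eps) @ alpha

# Scattered (meshless) data sites in 2D -- no grid is required.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(80, 2))
f = np.sin(np.pi * X[:, 0]) * X[:, 1]

s = kernel_interpolant(X, f)
print(np.max(np.abs(s(X) - f)))  # small residual at the training sites
```

Because the construction only uses pairwise distances between data sites, the same code works unchanged for any input dimension; only the number of sites needed to resolve the target function changes.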

On the Properties of Neural Machine Translation: Encoder–Decoder Approaches

It is shown that the neural machine translation performs relatively well on short sentences without unknown words, but its performance degrades rapidly as the length of the sentence and the number of unknown words increase.

A Correspondence Between Bayesian Estimation on Stochastic Processes and Smoothing by Splines

This report presents classes of prior distributions for which the Bayes estimate of an unknown function, given certain observations, is a spline function.

Mathematical Principles of Classical Fluid Mechanics

Classical fluid mechanics is a branch of continuum mechanics; that is, it proceeds on the assumption that a fluid is practically continuous and homogeneous in structure. The fundamental property…

Structured Deep Kernel Networks, in preparation

  • 2021

Deep Learning

Deep neural networks for data-driven LES closure models

  • Journal of Computational Physics 398, 108910
  • 2019