Corpus ID: 247614679

Reduced order modeling for flow and transport problems with Barlow Twins self-supervised learning

@inproceedings{Kadeethum2022ReducedOM,
  title={Reduced order modeling for flow and transport problems with Barlow Twins self-supervised learning},
  author={Teeratorn Kadeethum and Francesco Ballarin and Daniel O'Malley and Youngsoo Choi and Nikolaos Bouklas and Hongkyu Yoon},
  year={2022}
}
We propose a unified data-driven reduced order model (ROM) that bridges the performance gap between linear and nonlinear manifold approaches. Deep learning ROM (DL-ROM) using deep convolutional autoencoders (DC-AE) has been shown to capture nonlinear solution manifolds, but fails to perform adequately when linear subspace approaches such as proper orthogonal decomposition (POD) would be optimal. Moreover, most DL-ROM models rely on convolutional layers, which might limit their application to only a…
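For context on the linear-subspace side that the abstract contrasts with, the sketch below builds a POD reduced basis by SVD of a snapshot matrix; the snapshot matrix shape, mode count, and function names are illustrative and not taken from the paper.

```python
import numpy as np

# Minimal sketch of a linear (POD) reduced basis, assuming a snapshot
# matrix S of shape (n_dof, n_snapshots) collected from full-order solves.
def pod_basis(S, n_modes):
    # Thin SVD of the snapshot matrix; the leading left singular vectors
    # are the POD modes.
    U, _, _ = np.linalg.svd(S, full_matrices=False)
    return U[:, :n_modes]

def project(V, u):
    # Reduced coordinates of a full-order state u in the POD subspace.
    return V.T @ u

def reconstruct(V, q):
    # Map reduced coordinates back to the full-order space.
    return V @ q
```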
1 Citation


gLaSDI: Parametric Physics-informed Greedy Latent Space Dynamics Identification
TLDR
The proposed adaptive greedy sampling algorithm, integrated with a physics-informed residual-based error indicator and random-subset evaluation, searches for the optimal training samples on the fly and outperforms conventional predefined uniform sampling in terms of accuracy.
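As a rough illustration of the greedy-sampling idea summarized above (not gLaSDI's actual implementation), the schematic below retrains a surrogate and adds, at each step, the candidate parameter with the largest indicated error from a random subset; `train_rom` and `error_indicator` are hypothetical placeholders.

```python
import random

# Schematic greedy parameter sampling driven by an error indicator.
# `candidates` is a list of parameter values; `train_rom(samples)` and
# `error_indicator(rom, mu)` are assumed, user-supplied callables.
def greedy_sampling(candidates, n_select, subset_size, train_rom, error_indicator):
    selected = [candidates[0]]              # seed with an arbitrary first sample
    for _ in range(n_select - 1):
        rom = train_rom(selected)           # retrain/update the ROM on current samples
        # Random-subset evaluation: score only a random subset of candidates.
        pool = random.sample(candidates, min(subset_size, len(candidates)))
        worst = max(pool, key=lambda mu: error_indicator(rom, mu))
        selected.append(worst)              # add the worst-resolved parameter
    return selected
```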

References

SHOWING 1-10 OF 76 REFERENCES
A comprehensive deep learning-based approach to reduced order modeling of nonlinear time-dependent parametrized PDEs
TLDR
Numerical results indicate that DL-ROMs whose dimension is equal to the intrinsic dimensionality of the PDE solutions manifold are able to efficiently approximate the solution of parametrized PDEs, especially in cases for which a huge number of POD modes would have been necessary to achieve the same degree of accuracy.
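For illustration of the nonlinear-manifold side referred to above, here is a minimal deep convolutional autoencoder sketch in PyTorch whose latent dimension plays the role of the reduced dimension; the layer sizes, the 64x64 input resolution, and the latent_dim value are assumptions, not the architecture of the cited work.

```python
import torch
import torch.nn as nn

# Minimal sketch of a deep convolutional autoencoder for a DL-ROM-style
# nonlinear manifold; assumes single-channel 64x64 snapshots.
class ConvAutoencoder(nn.Module):
    def __init__(self, latent_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1, output_padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)            # reduced (latent) coordinates
        return self.decoder(z), z      # reconstruction and latent code
```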
Nonlinear proper orthogonal decomposition for convection-dominated flows
TLDR
A nonlinear proper orthogonal decomposition (POD) framework: an end-to-end, Galerkin-free model that combines autoencoders with long short-term memory networks for the dynamics, which not only improves accuracy but also significantly reduces the computational cost of training and testing.
A framework for data-driven solution and parameter estimation of PDEs using conditional generative adversarial networks
TLDR
This work is the first to employ and adapt the image-to-image translation concept based on conditional generative adversarial networks towards learning forward and inverse solution operators of partial differential equations (PDEs), and provides a speed-up of 120000 times compared to a Gaussian prior-based inverse modeling approach while also delivering more accurate inverse results.
Barlow Twins: Self-Supervised Learning via Redundancy Reduction
TLDR
This work proposes an objective function that naturally avoids collapse by measuring the cross-correlation matrix between the outputs of two identical networks fed with distorted versions of a sample, and making it as close to the identity matrix as possible.
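A compact PyTorch sketch of the Barlow Twins objective as described in the summary above: the empirical cross-correlation matrix between two batches of embeddings (from two distorted views of the same samples) is pushed toward the identity. The normalization epsilon and the off-diagonal weight `lambda_off` are illustrative hyperparameters.

```python
import torch

# z_a, z_b: embeddings of two distorted views, each of shape (batch, dim).
def barlow_twins_loss(z_a, z_b, lambda_off=5e-3):
    # Standardize each embedding dimension over the batch.
    z_a = (z_a - z_a.mean(0)) / (z_a.std(0) + 1e-6)
    z_b = (z_b - z_b.mean(0)) / (z_b.std(0) + 1e-6)

    n, _ = z_a.shape
    c = (z_a.T @ z_b) / n                                  # dim x dim cross-correlation

    on_diag = (torch.diagonal(c) - 1).pow(2).sum()         # push diagonal toward 1
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()  # suppress redundancy
    return on_diag + lambda_off * off_diag
```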
Efficient nonlinear manifold reduced order model
TLDR
An efficient nonlinear manifold ROM (NM-ROM) is developed, which can better approximate high-fidelity model solutions with a smaller latent-space dimension than linear subspace ROMs (LS-ROMs), and shows that neural networks can learn a more efficient latent-space representation on advection-dominated data from 2D Burgers' equations with a high Reynolds number.
…