Corpus ID: 246706167

Dimensionally Consistent Learning with Buckingham Pi

Joseph Bakarji, Jared L. Callaham, Steven L. Brunton, J. Nathan Kutz
In the absence of governing equations, dimensional analysis is a robust technique for extracting insights and finding symmetries in physical systems. Given measurement variables and parameters, the Buckingham Pi theorem provides a procedure for finding a set of dimensionless groups that spans the solution space, although this set is not unique. We propose an automated approach using the symmetric and self-similar structure of available measurement data to discover the dimensionless groups that… 
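The Buckingham Pi procedure mentioned above can be sketched numerically: the exponents of candidate dimensionless groups are null-space vectors of the dimension matrix (base units by variables). A minimal illustration, using an assumed pipe-flow example (rho, U, D, mu) rather than the data-driven method proposed in the paper:

```python
import sympy as sp

# Dimension matrix for pipe-flow variables (illustrative example, not from the paper).
# Columns: rho, U, D, mu; rows: exponents of mass M, length L, time T.
dim_matrix = sp.Matrix([
    [1,  0, 0,  1],   # M
    [-3, 1, 1, -1],   # L
    [0, -1, 0, -1],   # T
])

# Each null-space basis vector gives the exponents of one dimensionless group.
for v in dim_matrix.nullspace():
    print(v.T)  # proportional to (1, 1, 1, -1), i.e. rho*U*D/mu, the Reynolds number
```

Because the null space is only determined up to a basis, the groups recovered this way are not unique, which is exactly the ambiguity the paper's data-driven approach is meant to resolve.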


Dimensionless machine learning: Imposing exact units equivariance
The approach imposes units equivariance across a broad range of machine learning methods that are equivariant to rotations and other groups, and the authors discuss the in-sample and out-of-sample prediction accuracy gains obtainable in contexts such as symbolic regression and emulation, where symmetry is important.
Discrepancy Modeling Framework: Learning missing physics, modeling systematic residuals, and disambiguating between deterministic and random effects
This work introduces a discrepancy modeling framework to resolve deterministic model-measurement mismatch with two distinct approaches: (i) learning a model for the evolution of the systematic state-space residual, and (ii) discovering a model for the missing deterministic physics.
Constrained sparse Galerkin regression
The sparse identification of nonlinear dynamics (SINDy) is a recently proposed data-driven modelling framework that uses sparse regression techniques to identify nonlinear low-order models; this work extends it to enforce physical constraints in the regression, e.g. energy-preserving quadratic nonlinearities.
Data-driven discovery of dimensionless numbers and scaling laws from experimental measurements
This study embeds the principle of dimensional invariance into a two-level machine learning scheme to automatically discover dominant and unique dimensionless numbers and scaling laws from data, and shows that the proposed approach can identify dimensionally homogeneous differential equations with minimal parameters by leveraging sparsity-promoting techniques.
Learning normal form autoencoders for data-driven discovery of universal, parameter-dependent governing equations
This work introduces deep learning autoencoders to discover coordinate transformations that capture the underlying parametric dependence of a dynamical system in terms of its canonical normal form, allowing for a simple representation of the parametric dependency and bifurcation structure.
Discovering Governing Equations from Partial Measurements with Deep Delay Autoencoders
It is shown that it is possible to simultaneously learn a closed-form model and the associated coordinate system for partially observed dynamics; the framework combines deep learning, to uncover effective coordinates, with the sparse identification of nonlinear dynamics (SINDy) for interpretable modeling.
Reconstruction of normal forms by learning informed observation geometries from data
This work discusses a geometric/analytic learning algorithm capable of creating minimal descriptions of parametrically dependent unknown nonlinear dynamical systems, along with an informed observation geometry that enables the formulation of models without first principles or closed-form equations.
Data-driven discovery of coordinates and governing equations
A custom deep autoencoder network is designed to discover a coordinate transformation into a reduced space where the dynamics may be sparsely represented, and the governing equations and the associated coordinate system are simultaneously learned.
Data-driven discovery of partial differential equations
The sparse regression method is computationally efficient, robust, and demonstrated to work on a variety of canonical problems spanning a number of scientific domains including Navier-Stokes, the quantum harmonic oscillator, and the diffusion equation.
Learning dominant physical processes with data-driven balance models
This work automates and generalizes the approach to non-asymptotic regimes by introducing the idea of an equation space, in which different local balances appear as distinct subspace clusters, and shows that this approach uncovers key mechanistic models in turbulence, combustion, nonlinear optics, geophysical fluids, and neuroscience.
Improving neural network models of physical systems through dimensional analysis