High-order differentiable autoencoder for nonlinear model reduction

@article{Shen2021HighorderDA,
  title={High-order differentiable autoencoder for nonlinear model reduction},
  author={Siyuan Shen and Yin Yang and Tianjia Shao and He Wang and Chenfanfu Jiang and Lei Lan and Kun Zhou},
  journal={ACM Transactions on Graphics (TOG)},
  year={2021},
  volume={40},
  pages={1--15}
}
This paper provides a new avenue for exploiting deep neural networks to improve physics-based simulation. Specifically, we integrate the classic Lagrangian mechanics with a deep autoencoder to accelerate elastic simulation of deformable solids. Due to the inertia effect, the dynamic equilibrium cannot be established without evaluating the second-order derivatives of the deep autoencoder network. This is beyond the capability of off-the-shelf automatic differentiation packages and algorithms… 
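The abstract's central point — that reduced-space dynamics requires second-order derivatives of the decoder network — can be illustrated with a minimal sketch. This is not the paper's implementation (the authors' contribution is precisely that standard autodiff is insufficient at scale); it only shows, in JAX and with made-up weights, which derivative objects arise: the decoder Jacobian dx/dq and the third-order tensor d²x/dq².

```python
import jax
import jax.numpy as jnp

# Hypothetical tiny decoder mapping a reduced coordinate q (dim 2)
# to full-space vertex positions x (dim 6). Weights are illustrative,
# not from any trained model.
k1, k2 = jax.random.split(jax.random.PRNGKey(0))
W1 = jax.random.normal(k1, (8, 2)) * 0.5
W2 = jax.random.normal(k2, (6, 8)) * 0.5

def decoder(q):
    return W2 @ jnp.tanh(W1 @ q)

q = jnp.ones(2)
J = jax.jacobian(decoder)(q)                 # first order: dx/dq, shape (6, 2)
H = jax.jacobian(jax.jacobian(decoder))(q)   # second order: d2x/dq2, shape (6, 2, 2)
print(J.shape, H.shape)  # (6, 2) (6, 2, 2)
```

The inertia term in the reduced Euler–Lagrange equations couples velocities through exactly this second-derivative tensor, which is why first-order autodiff alone does not close the dynamic equilibrium.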
Model reduction for the material point method via learning the deformation map and its spatial-temporal gradients
TLDR
This work proposes a model-reduction approach for the material point method on nonlinear manifolds that supports resolution changes in the reduced simulation, attaining the challenging task of zero-shot super-resolution by generating material points unseen in the training data.
Neural Jacobian Fields: Learning Intrinsic Mappings of Arbitrary Meshes
TLDR
By operating in the intrinsic gradient domain of each individual mesh, the framework predicts highly accurate mappings; its accuracy and versatility are demonstrated by applying it to the above scenarios without any changes to the framework.
Fine-grained differentiable physics: a yarn-level model for fabrics
Differentiable physics modeling combines physics models with gradient-based learning to provide model explicability and data efficiency. It has been used to learn dynamics, solve inverse problems and…
Model reduction for the material point method via an implicit neural representation of the deformation map
TLDR
A model-reduction approach for the material point method on nonlinear manifolds that approximates the kinematics with an implicit neural representation of the deformation map, restricting deformation trajectories to a low-dimensional manifold via optimal-projection-based dynamics.

References

Showing 1-10 of 85 references
Higher Order Contractive Auto-Encoder
TLDR
A novel regularizer for training an autoencoder for unsupervised feature extraction yields representations that are significantly better suited for initializing deep architectures than previously proposed approaches, beating state-of-the-art performance on a number of datasets.
Descent methods for elastic body simulation on the GPU
TLDR
A new gradient descent method using Jacobi preconditioning and Chebyshev acceleration is proposed, with convergence comparable to that of L-BFGS or nonlinear conjugate gradient; unlike those methods, it requires no dot product operations, making it suitable for GPU implementation.
Latent‐space Dynamics for Reduced Deformable Simulation
TLDR
This work proposes the first reduced model simulation framework for deformable solid dynamics using autoencoder neural networks and solves the true equations of motion in the latent‐space using a variational formulation of implicit integration.
Data-driven fluid simulations using regression forests
TLDR
This paper proposes a novel machine-learning-based approach that formulates physics-based fluid simulation as a regression problem, estimating the acceleration of every particle for each frame, and designs a feature vector directly modelling individual forces and constraints from the Navier-Stokes equations.
Accelerated complex-step finite difference for expedient deformable simulation
TLDR
This paper grafts a new finite difference scheme, namely the complex-step finite difference (CSFD), with physics-based animation and demonstrates the accuracy, convenience, and efficiency of this new numerical routine in the context of deformable simulation.
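The complex-step finite difference (CSFD) scheme named in the TLDR above is a standard numerical technique, so a short self-contained sketch is possible (the test function below is illustrative, not from the paper). Unlike ordinary finite differences, CSFD evaluates the function at a complex-perturbed point and reads the derivative off the imaginary part, avoiding subtractive cancellation entirely:

```python
import numpy as np

def csfd(f, x, h=1e-30):
    # Complex-step finite difference: f'(x) ≈ Im(f(x + i*h)) / h.
    # No subtraction of nearly equal values occurs, so h can be made
    # arbitrarily small and the result stays accurate to machine
    # precision (for real-analytic f).
    return np.imag(f(x + 1j * h)) / h

f = lambda x: np.exp(x) * np.sin(x)   # illustrative test function
x0 = 0.7
exact = np.exp(x0) * (np.sin(x0) + np.cos(x0))
print(abs(csfd(f, x0) - exact))       # error near machine epsilon
```

Central differences with the same function bottom out around 1e-8 error because shrinking h amplifies round-off; CSFD has no such trade-off, which is what makes it attractive for differentiating stiff elastic energies.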
Subspace fluid re-simulation
TLDR
This paper proposes a novel importance sampling-based fitting algorithm that asymptotically accelerates the precomputation stage, and shows that the Iterated Orthogonal Projection method can be used to elegantly incorporate moving internal boundaries into a subspace simulation.
Contractive Auto-Encoders: Explicit Invariance During Feature Extraction
TLDR
It is found empirically that this penalty helps to carve a representation that better captures the local directions of variation dictated by the data, corresponding to a lower-dimensional non-linear manifold, while being more invariant to the vast majority of directions orthogonal to the manifold.
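The penalty referred to above is the squared Frobenius norm of the encoder's Jacobian, Ω(x) = ||∂h(x)/∂x||²_F. A minimal JAX sketch, with illustrative (untrained) encoder weights rather than the paper's setup:

```python
import jax
import jax.numpy as jnp

# Toy one-layer encoder: 10-dim input -> 4-dim code. Weights are
# random placeholders for illustration only.
W = jax.random.normal(jax.random.PRNGKey(0), (4, 10)) * 0.1
b = jnp.zeros(4)

def encoder(x):
    return jax.nn.sigmoid(W @ x + b)

def contractive_penalty(x):
    J = jax.jacobian(encoder)(x)   # encoder Jacobian, shape (4, 10)
    return jnp.sum(J ** 2)         # squared Frobenius norm

x = jnp.ones(10)
print(contractive_penalty(x))
```

Adding this term to the reconstruction loss penalizes sensitivity of the code to input perturbations, which is what pushes the learned representation to contract along directions orthogonal to the data manifold.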
Latent Space Physics: Towards Learning the Temporal Evolution of Fluid Flow
TLDR
It is demonstrated for the first time that dense 3D+time functions of physics systems can be predicted within the latent spaces of neural networks, and the method arrives at a neural-network-based simulation algorithm with significant practical speed-ups.
Incremental Deformation Subspace Reconstruction
TLDR
It is shown that the subspace of a modified body can be efficiently obtained from the sub space of its original version, if mesh changes are small, and a hybrid approach to calculate modal derivatives from both new and original linear modes is presented.
Updated sparse Cholesky factors for corotational elastodynamics
We present warp-canceling corotation, a nonlinear finite element formulation for elastodynamic simulation that achieves fast performance by making only partial or delayed changes to the simulation's…
...