Corpus ID: 231985467

Neural Delay Differential Equations

@article{Zhu2021NeuralDD,
  title={Neural Delay Differential Equations},
  author={Qunxi Zhu and Yao Guo and Wei Lin},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.10801}
}
Neural Ordinary Differential Equations (NODEs), a framework of continuous-depth neural networks, have been widely applied, showing exceptional efficacy on several representative datasets. Recently, an augmented framework was developed to overcome limitations that emerged in applications of the original framework. Here we propose a new class of continuous-depth neural networks with delay, named Neural Delay Differential Equations (NDDEs), and, for computing the…
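
The delayed vector field the abstract introduces can be made concrete with a short sketch. This is an illustrative toy, not the authors' implementation: it assumes a single constant delay tau, a constant initial history, and plain forward-Euler integration, and the names NDDEFunc and ndde_euler are hypothetical.

import torch
import torch.nn as nn

class NDDEFunc(nn.Module):
    """f(z(t), z(t - tau)): the vector field sees current and delayed state."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))

    def forward(self, z_t, z_delayed):
        return self.net(torch.cat([z_t, z_delayed], dim=-1))

def ndde_euler(func, z0, tau, t1, dt=0.01):
    # Integrate dz/dt = f(z(t), z(t - tau)) by forward Euler, reading the
    # delayed state from the stored trajectory (constant history for t < 0).
    n_delay = int(round(tau / dt))
    traj, z = [z0], z0
    for step in range(int(round(t1 / dt))):
        z_delayed = traj[step - n_delay] if step >= n_delay else z0
        z = z + dt * func(z, z_delayed)
        traj.append(z)
    return z

out = ndde_euler(NDDEFunc(dim=2), torch.randn(16, 2), tau=0.5, t1=1.0)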

Neural Piecewise-Constant Delay Differential Equations

TLDR
It is shown that Neural PCDDEs outperform several existing continuous-depth neural frameworks on one-dimensional piecewise-constant delay population dynamics and on real-world datasets, including MNIST, CIFAR10, and SVHN.
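
Under the same toy conventions as the NDDE sketch above (hypothetical names, forward Euler, a func with the NDDEFunc signature), the piecewise-constant variant only changes where the delayed state is read: the delayed argument is z at the floor of t, so it stays frozen within each unit interval.

import math

def pcdde_euler(func, z0, t1, dt=0.01):
    # Integrate dz/dt = f(z(t), z(floor(t))) by forward Euler.
    traj, z = [z0], z0
    for step in range(int(round(t1 / dt))):
        idx = int(round(math.floor(step * dt) / dt))  # index of z(floor(t))
        z = z + dt * func(z, traj[min(idx, len(traj) - 1)])
        traj.append(z)
    return z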

Neural Laplace: Learning diverse classes of differential equations in the Laplace domain

TLDR
This work proposes Neural Laplace, a unified framework for learning diverse classes of DEs, including all the aforementioned ones, in which history dependencies and discontinuities in time can be represented as summations of complex exponentials in the Laplace domain.
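
The representation this TLDR points to can be written down directly. Here is a toy sketch with hand-picked poles and residues; in Neural Laplace these quantities would come from a network and the inverse Laplace transform is evaluated numerically, so the values below are purely illustrative.

import numpy as np

# x(t) ~ Re( sum_k c_k * exp(p_k * t) ): each complex pole p_k contributes a
# damped oscillation, which is how history effects and jumps are encoded.
poles = np.array([-0.5 + 3.0j, -0.1 + 1.0j])   # p_k
residues = np.array([1.0 + 0.0j, 0.5 - 0.5j])  # c_k
t = np.linspace(0.0, 10.0, 200)
x = np.real((residues[:, None] * np.exp(poles[:, None] * t)).sum(axis=0))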

Delay Differential Neural Networks

TLDR
This paper proposes a novel model inspired by delay differential equations (DDEs), the delay differential neural network (DDNN), which treats the derivative of the hidden feature vector as a function of both the current feature vector and past feature vectors (the history).

Characteristic Neural Ordinary Differential Equations

TLDR
It is proved that the C-NODE framework extends the classical NODE on classification tasks by exhibiting explicit C-NODE-representable functions that NODEs cannot express, and empirical evidence shows that the learned curves improve efficiency, using fewer parameters and function evaluations than the baselines.

Tractable Dendritic RNNs for Reconstructing Nonlinear Dynamical Systems

TLDR
Motivated by the emerging principles of dendritic computation, this work shows that the dendritically expanded PLRNN achieves better reconstructions with fewer parameters and dimensions on various dynamical-systems benchmarks and compares favorably to other methods, while retaining a tractable and interpretable structure.

Learning Time Delay Systems with Neural Ordinary Differential Equations

TLDR
A novel way of using neural networks to learn the dynamics of time-delay systems from sequential data of chaotic behavior is proposed, and it is demonstrated that the bifurcation diagram of the neural network matches that of the original system.

Stateful ODE-Nets using Basis Function Expansions

TLDR
This work reconsiders formulations of the weights as continuous-in-depth functions using linear combinations of basis functions, which enables parameter transformations such as function projections and yields a novel stateful ODE-Block that handles stateful layers.

Compressing Deep ODE-Nets using Basis Function Expansions

TLDR
This work reconsiders formulations of the weights as continuous-depth functions using linear combinations of basis functions, which allows the weights to be compressed through a change of basis, without retraining, while maintaining near state-of-the-art performance.
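
The weight parameterization shared by this and the previous entry admits a compact sketch. Assumed form only: W(t) = sum_k C_k * phi_k(t) with Legendre polynomials as the basis; the coefficient tensor C and the helper names are hypothetical. Compression then amounts to truncating or re-projecting C.

import numpy as np

def legendre_basis(t, K):
    # First K Legendre polynomials at scalar t, via the three-term recurrence
    # k P_k = (2k - 1) t P_{k-1} - (k - 1) P_{k-2}.
    P = [1.0, t]
    for k in range(2, K):
        P.append(((2 * k - 1) * t * P[-1] - (k - 1) * P[-2]) / k)
    return np.array(P[:K])

K, d_out, d_in = 4, 8, 8
C = 0.1 * np.random.randn(K, d_out, d_in)    # learned coefficients C_k

def W(t):
    # Depth-continuous weight matrix W(t) = sum_k C_k * phi_k(t).
    phi = legendre_basis(2.0 * t - 1.0, K)   # map depth [0, 1] to [-1, 1]
    return np.tensordot(phi, C, axes=1)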

Machine learning for closure models

TLDR
The value of ML for closure terms is clear, since the accuracy of a numerical method can be improved significantly by supplementing it with a relatively small neural network; however, more research is needed on how such learned closure terms perform compared with purely numerical methods.
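
A hedged sketch of the setup this summary has in mind: a coarse solver's right-hand side is corrected by a small network, du/dt = F_coarse(u) + NN(u), with the network trained so that the corrected coarse model tracks fine-grained reference data. All names below are illustrative.

import torch
import torch.nn as nn

closure = nn.Sequential(nn.Linear(32, 16), nn.Tanh(), nn.Linear(16, 32))

def rhs(u, F_coarse):
    # Numerical term plus learned closure correction.
    return F_coarse(u) + closure(u)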

Beyond Predictions in Neural ODEs: Identification and Interventions

TLDR
It is shown that combining simple regularization schemes with flexible neural ODEs can robustly recover the dynamics and causal structures from time-series data and can also make accurate predictions under interventions on variables or the system itself.

References

SHOWING 1-10 OF 39 REFERENCES

ANODEV2: A Coupled Neural ODE Framework

TLDR
Results are reported showing that the coupled ODE-based framework is indeed trainable and that it achieves higher accuracy than the baseline ResNet and the recently proposed Neural ODE approach.
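
A sketch of the coupled structure the TLDR describes, under assumed toy dynamics: both the activations z and the parameters w evolve in depth and are integrated jointly; the specific f and g below are arbitrary choices, not the paper's parameterization.

import torch

def coupled_euler(f, g, z0, w0, t1=1.0, dt=0.1):
    # dz/dt = f(z, w), dw/dt = g(w), advanced together by forward Euler.
    z, w = z0, w0
    for _ in range(int(round(t1 / dt))):
        z, w = z + dt * f(z, w), w + dt * g(w)
    return z, w

f = lambda z, w: torch.tanh(z @ w)   # toy activation dynamics
g = lambda w: -0.1 * w               # toy weight dynamics
z, w = coupled_euler(f, g, torch.randn(4, 8), torch.randn(8, 8))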

Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equations

TLDR
It is shown that many effective networks, such as ResNet, PolyNet, FractalNet, and RevNet, can be interpreted as different numerical discretizations of differential equations, and a connection is established between stochastic control and noise injection in the training process that helps to improve the generalization of the networks.
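
The core correspondence is one line: a residual update x_{n+1} = x_n + h * f(x_n) is exactly a forward-Euler step of dx/dt = f(x). A minimal sketch, with the block body an arbitrary choice:

import torch
import torch.nn as nn

class EulerResBlock(nn.Module):
    def __init__(self, dim, h=1.0):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.h = h                        # step size; h = 1 is a plain ResNet block

    def forward(self, x):
        return x + self.h * self.f(x)     # forward-Euler update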

Neural Ordinary Differential Equations

TLDR
This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
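The adjoint training this paper introduces is available in the authors' torchdiffeq package as odeint_adjoint; a minimal usage sketch, where the toy dynamics function is an arbitrary choice:

import torch
import torch.nn as nn
from torchdiffeq import odeint_adjoint as odeint

class ODEFunc(nn.Module):
    def __init__(self, dim=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 16), nn.Tanh(), nn.Linear(16, dim))

    def forward(self, t, y):              # torchdiffeq expects f(t, y)
        return self.net(y)

func, y0 = ODEFunc(), torch.randn(8, 2, requires_grad=True)
t = torch.linspace(0.0, 1.0, 10)
yT = odeint(func, y0, t)[-1]              # gradients flow through the adjoint ODE
yT.sum().backward()                       # constant memory in solver steps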

Deep Neural Networks Motivated by Partial Differential Equations

TLDR
A new PDE interpretation is established for a class of deep convolutional neural networks (CNNs) commonly used to learn from speech, image, and video data, and three new ResNet architectures are derived that fall into two new classes: parabolic and hyperbolic CNNs.

Solving high-dimensional partial differential equations using deep learning

TLDR
A deep learning-based approach is presented that can handle general high-dimensional parabolic PDEs using backward stochastic differential equations; the gradient of the unknown solution is approximated by neural networks, very much in the spirit of deep reinforcement learning, with the gradient acting as the policy function.

SNODE: Spectral Discretization of Neural ODEs for System Identification

TLDR
Experimental comparison to standard methods, such as backpropagation through explicit solvers and the adjoint technique, on training surrogate models of small and medium-scale dynamical systems shows that it is at least one order of magnitude faster at reaching a comparable value of the loss function.

PDE-Net: Learning PDEs from Data

TLDR
Numerical experiments show that the PDE-Net has the potential to uncover the hidden PDE of the observed dynamics, and predict the dynamical behavior for a relatively long time, even in a noisy environment.

NeuPDE: Neural Network Based Ordinary and Partial Differential Equations for Modeling Time-Dependent Data

TLDR
A neural network-based approach is presented for extracting models from dynamic data using ordinary and partial differential equations, and it is shown that, for MNIST and Fashion MNIST, the approach lowers the parameter cost compared with other deep neural networks.

Augmented Neural ODEs

TLDR
Augmented Neural ODEs are introduced which, in addition to being more expressive models, are empirically more stable, generalize better, and have a lower computational cost than Neural ODEs.
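
The augmentation itself is a one-liner (sketch): pad the state with extra zero channels before integrating, so the flow lives in a higher-dimensional space where trajectories that would have to cross in the original space can pass around each other.

import torch

def augment(x, extra_dims=4):
    zeros = torch.zeros(x.shape[0], extra_dims, dtype=x.dtype)
    return torch.cat([x, zeros], dim=-1)  # integrate the ODE on this augmented state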