Interpolation Technique to Speed Up Gradients Propagation in Neural ODEs

@article{Daulbaev2020InterpolationTT,
  title={Interpolation Technique to Speed Up Gradients Propagation in Neural ODEs},
  author={Talgat Daulbaev and A. Katrutsa and L. Markeeva and Julia Gusak and A. Cichocki and I. Oseledets},
  journal={arXiv: Neural and Evolutionary Computing},
  year={2020}
}
We propose a simple interpolation-based method for the efficient approximation of gradients in neural ODE models. We compare it with the reverse dynamic method (known in the literature as the "adjoint method") for training neural ODEs on classification, density estimation, and inference approximation tasks. We also give a theoretical justification of our approach using the logarithmic norm formalism. As a result, our method allows faster model training than the reverse dynamic method, which we confirm by numerical experiments.
1 Citation

Neural Closure Models for Dynamical Systems

References

Showing 1-10 of 45 references:
• ANODE: Unconditionally Accurate Memory-Efficient Gradients for Neural ODEs (Highly Influential)
• ANODEV2: A Coupled Neural ODE Evolution Framework
• SNODE: Spectral Discretization of Neural ODEs for System Identification
• Neural Ordinary Differential Equations (Highly Influential)
• Augmented Neural ODEs
• FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models (Highly Influential)
• Stochastic Gradient VB and the Variational Auto-Encoder
• Towards Understanding Normalization in Neural ODEs
• Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equations
• Deep Neural Networks Motivated by Partial Differential Equations