Interpolation Technique to Speed Up Gradients Propagation in Neural ODEs
@article{Daulbaev2020InterpolationTT,
  title   = {Interpolation Technique to Speed Up Gradients Propagation in Neural ODEs},
  author  = {Talgat Daulbaev and A. Katrutsa and L. Markeeva and Julia Gusak and A. Cichocki and I. Oseledets},
  journal = {arXiv: Neural and Evolutionary Computing},
  year    = {2020}
}
We propose a simple interpolation-based method for the efficient approximation of gradients in neural ODE models. We compare it with the reverse dynamic method (known in the literature as the "adjoint method") for training neural ODEs on classification, density estimation, and inference approximation tasks. We also provide a theoretical justification of our approach using the logarithmic norm formalism. As a result, our method allows faster model training than the reverse dynamic method, which was confirmed…
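The abstract only names the mechanism, so a minimal sketch may help. The idea is to solve the state ODE forward once, store the trajectory at a small set of interpolation nodes, and then, during the backward (adjoint) pass, read z(t) off an interpolant instead of re-solving the ODE in reverse. The abstract does not spell out the interpolant; the sketch below assumes barycentric Lagrange interpolation on a Chebyshev grid as a natural choice, and the toy dynamics f(z) = tanh(Wz), the node count, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative sketch of an interpolation-based adjoint pass.
# The dynamics f(z) = tanh(W z) and every name below are assumptions,
# not the paper's code.

def f(z, W):
    return np.tanh(W @ z)

def df_dz(z, W):
    # Jacobian of tanh(W z) w.r.t. z: diag(1 - tanh(W z)^2) @ W
    s = 1.0 - np.tanh(W @ z) ** 2
    return s[:, None] * W

def rk4_step(func, y, t, h):
    # One classical Runge-Kutta 4 step for dy/dt = func(y, t).
    k1 = func(y, t)
    k2 = func(y + 0.5 * h * k1, t + 0.5 * h)
    k3 = func(y + 0.5 * h * k2, t + 0.5 * h)
    k4 = func(y + h * k3, t + h)
    return y + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def forward(z0, W, T, n, substeps=8):
    # Solve the state ODE once, recording z(t) at Chebyshev-Lobatto nodes
    # t_j = (T/2) * (1 - cos(pi j / n)), j = 0..n, in increasing order.
    nodes = 0.5 * T * (1.0 - np.cos(np.pi * np.arange(n + 1) / n))
    rhs = lambda z, t: f(z, W)
    zs = [z0]
    for t0, t1 in zip(nodes[:-1], nodes[1:]):
        z, h = zs[-1], (t1 - t0) / substeps
        for k in range(substeps):
            z = rk4_step(rhs, z, t0 + k * h, h)
        zs.append(z)
    return nodes, np.stack(zs)

def interp(t, nodes, zs):
    # Barycentric Lagrange interpolation; for Chebyshev-Lobatto nodes the
    # weights are (-1)^j, halved at both endpoints (Berrut & Trefethen).
    n = len(nodes) - 1
    w = (-1.0) ** np.arange(n + 1)
    w[0] *= 0.5
    w[-1] *= 0.5
    diff = t - nodes
    j = int(np.argmin(np.abs(diff)))
    if abs(diff[j]) < 1e-13:          # t coincides with a node
        return zs[j]
    c = w / diff
    return (c @ zs) / c.sum()

def backward(a_T, nodes, zs, W, steps=200):
    # Adjoint ODE da/dt = -(df/dz)^T a, integrated from t = T down to t = 0.
    # The state z(t) is read off the interpolant instead of being re-solved
    # backward in time, which is the speed-up the abstract refers to.
    rhs = lambda a, t: -df_dz(interp(t, nodes, zs), W).T @ a
    T = nodes[-1]
    h = -T / steps
    a, t = a_T.copy(), T
    for _ in range(steps):
        a = rk4_step(rhs, a, t, h)
        t += h
    return a                           # approximates dL/dz(0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 4
    W = rng.standard_normal((d, d)) / np.sqrt(d)
    z0 = rng.standard_normal(d)
    nodes, zs = forward(z0, W, T=1.0, n=16)
    # Suppose the loss is L = ||z(T)||^2 / 2, so dL/dz(T) = z(T).
    grad_z0 = backward(zs[-1], nodes, zs, W)
    print("dL/dz0 ≈", grad_z0)
```

For parameter gradients one would additionally accumulate the integral of a(t)^T ∂f/∂W along the same backward sweep; the sketch stops at dL/dz(0) to stay short.

The "logarithmic norm formalism" mentioned in the abstract refers, in the standard Dahlquist sense, to a quantity that bounds how a perturbation of the trajectory (here, the interpolation error) can grow under the linearized dynamics:

```latex
\mu(A) = \lim_{h \to 0^{+}} \frac{\lVert I + hA \rVert - 1}{h},
\qquad
\lVert \delta z(t) \rVert \le
\exp\!\Big( \int_{t_0}^{t} \mu\big(J(s)\big)\, ds \Big)\, \lVert \delta z(t_0) \rVert,
\quad J = \frac{\partial f}{\partial z}.
```

If μ(J) stays moderate along the trajectory, a small interpolation error in z(t) translates into a small error in the computed gradients, which is presumably the kind of guarantee the paper's theoretical justification establishes.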