Corpus ID: 219792967

A Shooting Formulation of Deep Learning

@article{Vialard2020ASF,
  title={A Shooting Formulation of Deep Learning},
  author={François-Xavier Vialard and R. Kwitt and Susan Wei and M. Niethammer},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.10330}
}
Continuous-depth neural networks can be viewed as deep limits of discrete neural networks whose dynamics resemble a discretization of an ordinary differential equation (ODE). Although important steps have been taken to realize the advantages of such continuous formulations, most current techniques are not truly continuous-depth as they assume identical layers. Indeed, existing works throw into relief the myriad difficulties presented by an infinite-dimensional parameter space in learning a…
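To make the ODE view in the abstract concrete, here is a minimal sketch (not the authors' implementation): a residual block x_{k+1} = x_k + h·f(x_k, θ_k) is a forward-Euler step of dx/dt = f(x(t), θ(t)), and letting the parameters vary with the depth variable t is what distinguishes a truly continuous-depth model from one with identical layers. The vector field f, the tanh parametrization, the step size, and the toy parameters below are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only: a ResNet as a forward-Euler discretization of an ODE.
# The vector field `f` and all parameters are assumptions made for this example.
import numpy as np

def f(x, theta):
    """A simple parametrized vector field: tanh of an affine map."""
    W, b = theta
    return np.tanh(W @ x + b)

def discrete_resnet(x, thetas, h):
    """Residual network with one parameter set per layer (non-identical layers)."""
    for theta in thetas:
        x = x + h * f(x, theta)          # forward-Euler step
    return x

def continuous_depth_forward(x, theta_of_t, t0=0.0, t1=1.0, n_steps=100):
    """Euler integration of dx/dt = f(x, theta(t)); as n_steps grows,
    the discrete network approaches its continuous-depth (ODE) limit."""
    h = (t1 - t0) / n_steps
    t = t0
    for _ in range(n_steps):
        x = x + h * f(x, theta_of_t(t))  # parameters vary with "depth" t
        t += h
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 4
    x0 = rng.standard_normal(d)
    # Depth-varying parameters theta(t); identical layers would use a constant theta.
    theta_of_t = lambda t: (np.eye(d) * (1.0 - 0.5 * t), np.zeros(d))
    print(continuous_depth_forward(x0, theta_of_t))
```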
Depth-Adaptive Neural Networks from the Optimal Control viewpoint
Differentiable Multiple Shooting Layers
Diffeomorphic Learning (L. Younes, Computer Science, Mathematics, ArXiv, 2018)
