Evaluation of Continuous Image Features Learned by ODE Nets

@inproceedings{Carrara2019EvaluationOC,
  title={Evaluation of Continuous Image Features Learned by ODE Nets},
  author={Fabio Carrara and Giuseppe Amato and F. Falchi and Claudio Gennaro},
  booktitle={ICIAP},
  year={2019}
}
Deep-learning approaches to data-driven modeling rely on learning a finite number of transformations (and representations) of the data that are structured in a hierarchy and are often instantiated as deep neural networks (and their internal activations). State-of-the-art models for visual data usually implement deep residual learning: the network learns to predict a finite number of discrete updates that are applied to the internal network state to enrich it. Pushing the residual learning…
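The residual-to-continuous idea in the abstract can be sketched in a few lines: a residual block's update h ← h + f(h) is exactly one forward Euler step of the ODE dh/dt = f(h), and shrinking the step size while adding steps pushes the network toward its continuous (ODE-Net) limit. The following is a minimal toy sketch with a hypothetical fixed dynamics function f, not the paper's learned network:

```python
import numpy as np

def f(h):
    # Hypothetical "learned" dynamics; here a fixed contraction toward 1.0.
    return 0.5 * (1.0 - h)

def residual_forward(h0, num_blocks, step=1.0):
    """Apply num_blocks residual updates, i.e. Euler steps of size `step`."""
    h = h0
    for _ in range(num_blocks):
        h = h + step * f(h)  # residual connection = one Euler step
    return h

# Coarse: 4 residual blocks with step 1.0 (total "time" t = 4).
coarse = residual_forward(0.0, 4, step=1.0)
# Finer discretization of the same dynamics: 40 steps of size 0.1 (same t = 4).
fine = residual_forward(0.0, 40, step=0.1)

# The exact ODE solution h(t) = 1 - exp(-0.5 t) gives h(4) ≈ 0.865;
# the finer discretization tracks it more closely than the coarse one.
print(coarse, fine)
```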
1 Citation
Continuous ODE-defined Image Features for Adaptive Retrieval
TLDR: This work experiments with the recently proposed continuous neural networks defined by parametric ordinary differential equations, dubbed ODE-Nets, for adaptive extraction of image representations: the exact feature extraction is approximated, at reduced computational cost, by taking a previous "near-in-time" hidden state as the features.
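The adaptive-extraction idea summarized above can be illustrated with a toy solve (hypothetical dynamics, not the citing paper's actual network): while integrating the feature ODE up to t = 1, every intermediate hidden state is available for free, so a "near-in-time" state, e.g. at t = 0.6, can serve as an approximate feature vector at roughly 40% less cost.

```python
import numpy as np

def dynamics(h, t):
    # Hypothetical learned dynamics of an ODE feature extractor.
    return -h + np.tanh(h)

def solve_with_trajectory(h0, t_end=1.0, n_steps=10):
    """Forward Euler solve that keeps every intermediate hidden state."""
    h, dt = h0, t_end / n_steps
    states = [(0.0, h)]
    for i in range(n_steps):
        h = h + dt * dynamics(h, i * dt)
        states.append(((i + 1) * dt, h))
    return states

states = solve_with_trajectory(np.array([1.0, -0.5]))
exact_features = states[-1][1]  # full solve up to t = 1.0
cheap_features = states[6][1]   # earlier state at t = 0.6, cheaper to reach
print(np.linalg.norm(exact_features - cheap_features))  # small approximation gap
```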

References

Showing 1–10 of 37 references
Stable Architectures for Deep Neural Networks
TLDR: New forward propagation techniques inspired by systems of ordinary differential equations (ODEs) are proposed that overcome this challenge and lead to well-posed learning problems for arbitrarily deep networks.
Neural Ordinary Differential Equations
TLDR: This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
Reversible Architectures for Arbitrarily Deep Residual Neural Networks
TLDR: From this interpretation, a theoretical framework on the stability and reversibility of deep neural networks is developed, and three reversible neural network architectures that can go arbitrarily deep in theory are derived.
Deep Neural Networks Motivated by Partial Differential Equations
TLDR: A new PDE interpretation of a class of deep convolutional neural networks (CNNs) commonly used to learn from speech, image, and video data is established, and three new ResNet architectures are derived that fall into two new classes: parabolic and hyperbolic CNNs.
Identity Mappings in Deep Residual Networks
TLDR: The propagation formulations behind the residual building blocks suggest that the forward and backward signals can be directly propagated from one block to any other block, when using identity mappings as the skip connections and after-addition activation.
Convolutional Networks with Adaptive Inference Graphs
TLDR: This work proposes convolutional networks with adaptive inference graphs (ConvNet-AIG) that adaptively define their network topology conditioned on the input image; ConvNet-AIG also shows higher robustness than ResNets, complementing other known defense mechanisms.
Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equations
TLDR: It is shown that many effective networks, such as ResNet, PolyNet, FractalNet, and RevNet, can be interpreted as different numerical discretizations of differential equations, and a connection is established between stochastic control and noise injection in the training process that helps to improve the generalization of the networks.
End-to-End Learning of Deep Visual Representations for Image Retrieval
TLDR: This article uses a large-scale but noisy landmark dataset and develops an automatic cleaning method that produces a suitable training set for deep retrieval; it builds on the recent R-MAC descriptor, which can be interpreted as a deep and differentiable architecture, and presents improvements to enhance it.
From generic to specific deep representations for visual recognition
TLDR: This paper thoroughly investigates the transferability of ConvNet representations w.r.t. several factors, and shows that different visual recognition tasks can be categorically ordered based on their distance from the source task.
Multi-level Residual Networks from Dynamical Systems View
TLDR: This paper adopts the dynamical systems point of view, analyzes the lesioning properties of ResNet both theoretically and experimentally, and proposes a novel method for accelerating ResNet training.