DelugeNets: Deep Networks with Efficient and Flexible Cross-Layer Information Inflows

@inproceedings{Kuen2017DelugeNetsDN,
  title={DelugeNets: Deep Networks with Efficient and Flexible Cross-Layer Information Inflows},
  author={Jason Kuen and Xiangfei Kong and Gang Wang and Yap-Peng Tan},
  booktitle={2017 IEEE International Conference on Computer Vision Workshops (ICCVW)},
  year={2017},
  pages={958-966}
}
Deluge Networks (DelugeNets) are deep neural networks which efficiently facilitate massive cross-layer information inflows from preceding layers to succeeding layers. The connections between layers in DelugeNets are established through cross-layer depthwise convolutional layers with learnable filters, acting as a flexible yet efficient selection mechanism. DelugeNets can propagate information across many layers with greater flexibility and utilize network parameters more effectively compared to…
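The cross-layer depthwise convolution described above can be sketched in NumPy. This is a minimal illustration, not the paper's exact implementation: each output channel is a learned weighted combination of the same channel taken from every preceding layer, so the weights act as the per-channel selection mechanism the abstract describes. The function name and shapes are assumptions.

```python
import numpy as np

def cross_layer_depthwise(feats, weights):
    """Combine feature maps from preceding layers, per channel.

    feats:   shape (L, C, H, W) -- outputs of L preceding layers
    weights: shape (L, C)       -- one learnable scalar per
             (preceding layer, channel) pair, acting as a selection gate
    returns: shape (C, H, W)
    """
    # einsum broadcasts each (layer, channel) weight over H and W,
    # then sums over the layer axis -- a depthwise combination
    return np.einsum('lchw,lc->chw', feats, weights)
```

With one-hot weights this reduces to picking a single preceding layer per channel; intermediate weights blend layers, which is the "flexible selection" the snippet refers to.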
ADNet: Adaptively Dense Convolutional Neural Networks
TLDR
This paper presents a layer attention based Adaptively Dense Network (ADNet) that adaptively determines the reuse status of hierarchical preceding features, fusing multi-level internal representations in an effective manner.
Deep convolutional neural network based on densely connected squeeze-and-excitation blocks
  • Yu Wu
  • Computer Science
  • AIP Advances
  • 2019
TLDR
The method proposed in this paper is based on the idea of a residual network: if shortcut connections are added between network layers, the network can be made deeper, the accuracy higher, and the training more efficient.
Generalizing Deep Models for Overhead Image Segmentation Through Getis-Ord Gi* Pooling
TLDR
A novel feature pooling method for convolutional neural networks using Getis-Ord Gi* analysis from geostatistics that exploits the fact that there are certain fundamental rules as to how things are distributed on the surface of the Earth and these rules do not vary substantially between locations.
Convolutional Neural Networks combined with Runge-Kutta Methods
A convolutional neural network for image classification can be constructed mathematically, since it is inspired by the ventral stream in the visual cortex, which can be regarded as a multi-period dynamical…
S-DenseNet: A DenseNet Compression Model Based on Convolution Grouping Strategy Using Skyline Method
TLDR
S-DenseNet is proposed, a compact variant of DenseNet that extracts features more comprehensively while reducing parameter redundancy, and achieves higher or similar Top-1 accuracy with less complexity on the ImageNet dataset.
Convolutional Neural Networks combined with Runge-Kutta Methods
TLDR
The success of the experiments shows that Runge-Kutta methods can be used to construct convolutional neural networks for image classification efficiently, and that future network models might be structured more rationally based on RKNet and prior knowledge.
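The Runge-Kutta view mentioned above can be illustrated with a short sketch (my own simplified reading, not the paper's architecture): a plain residual block computes one explicit Euler step of x' = f(x), and a second-order Runge-Kutta (Heun) update reuses f to build a more accurate step from the same ingredients.

```python
import numpy as np

def euler_block(x, f):
    # A plain residual block: one explicit Euler step of x' = f(x)
    return x + f(x)

def rk2_block(x, f):
    # Heun's method (second-order Runge-Kutta): predictor-corrector
    # update built from two evaluations of the same residual function
    k1 = f(x)
    k2 = f(x + k1)
    return x + 0.5 * (k1 + k2)
```

For a linear test function f(x) = 0.1 * x (exact flow e^0.1 after unit time), the RK2 step lands noticeably closer to the exact value than the Euler step, which is the intuition behind structuring blocks this way.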
A survey of the recent architectures of deep convolutional neural networks
TLDR
This survey focuses on the intrinsic taxonomy present in the recently reported deep CNN architectures and classifies the recent innovations in CNN architectures into seven different categories, based on spatial exploitation, depth, multi-path, width, feature-map exploitation, channel boosting, and attention.
Non-linear Convolution Filters for CNN-Based Learning
TLDR
This work addresses the issue of developing a convolution method in the context of a computational model of the visual cortex, exploring quadratic forms through the Volterra kernels, and shows that a network which combines linear and non-linear filters in its convolutional layers can outperform networks that use standard linear filters with the same architecture.
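The quadratic forms mentioned in the snippet can be sketched as a second-order Volterra response on a single flattened patch. This is an illustrative sketch only (the function name and shapes are assumptions): the linear term is the ordinary convolution filter, and the quadratic term adds learnable cross-products between patch elements.

```python
import numpy as np

def volterra_response(x, w1, w2):
    """Second-order Volterra response for one flattened patch x:
    y = w1 . x + x^T W2 x  -- a linear filter plus quadratic cross-terms
    between every pair of inputs in the receptive field."""
    return w1 @ x + x @ w2 @ x
```

Setting W2 to zero recovers a standard linear filter, which is why such non-linear filters strictly generalize the usual convolution.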
TheLNet270v1 – A Novel Deep-Network Architecture for the Automatic Classification of Thermal Images for Greenhouse Plants
TLDR
This research aimed to develop a deep learning technique for the automatic segmentation of thermal images to continuously monitor the canopy surface temperature inside a greenhouse with higher accuracy than previous systems.
Classification of tumor type from histopathological images
In recent years, machine learning has been used increasingly often in most areas of science and engineering. One such area is the analysis of data from the medical environment. This master…

References

Wide Residual Networks
TLDR
This paper conducts a detailed experimental study on the architecture of ResNet blocks and proposes a novel architecture in which the depth of residual networks is decreased and their width increased; the resulting network structures, called wide residual networks (WRNs), are far superior to their commonly used thin and very deep counterparts.
Densely Connected Convolutional Networks
TLDR
The Dense Convolutional Network (DenseNet) connects each layer to every other layer in a feed-forward fashion and has several compelling advantages: it alleviates the vanishing-gradient problem, strengthens feature propagation, encourages feature reuse, and substantially reduces the number of parameters.
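The dense connectivity pattern can be sketched in a few lines of NumPy. This is a structural illustration under assumed shapes, not DenseNet's full implementation (no batch norm, convolutions, or growth-rate hyperparameters): each layer receives the channel-wise concatenation of the block input and every preceding layer's output.

```python
import numpy as np

def dense_block(x, layers):
    """DenseNet-style connectivity: layer i sees the concatenation
    (along the channel axis) of the block input and the outputs of
    all layers before it; the block returns the full concatenation."""
    feats = [x]
    for layer in layers:
        feats.append(layer(np.concatenate(feats, axis=0)))
    return np.concatenate(feats, axis=0)
```

If each layer emits k channels (the "growth rate"), a block with n layers on a c-channel input returns c + n * k channels, which is why feature reuse comes at a small parameter cost.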
Swapout: Learning an ensemble of deep architectures
TLDR
This work describes Swapout, a new stochastic training method that outperforms ResNets of identical network structure, yielding impressive results on CIFAR-10 and CIFAR-100, and proposes a parameterization that reveals connections to existing architectures and suggests a much richer set of architectures to be explored.
Deep Networks with Stochastic Depth
TLDR
Stochastic depth is proposed, a training procedure that enables the seemingly contradictory setup to train short networks and use deep networks at test time and reduces training time substantially and improves the test error significantly on almost all data sets that were used for evaluation.
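The "short at training time, deep at test time" idea can be sketched as follows. This is a simplified illustration with assumed names: during training each residual block is skipped entirely with some probability, leaving only the identity path; at test time every block runs, scaled by its survival probability so expectations match.

```python
import numpy as np

def stochastic_depth_forward(x, blocks, survival_probs, rng, train=True):
    """Residual network where each block may be randomly dropped.

    blocks:         list of residual functions f(x)
    survival_probs: per-block probability p of keeping the block
    """
    for f, p in zip(blocks, survival_probs):
        if train:
            if rng.random() < p:   # block survives this pass
                x = x + f(x)
            # else: identity skip only -- the network is shorter
        else:
            x = x + p * f(x)       # expected-value scaling at test time
    return x
```

Skipping blocks shortens the expected depth during training (speeding it up and easing gradient flow), while the full depth is available at test time.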
Deep Residual Networks with Exponential Linear Unit
TLDR
This paper proposes to replace the combination of ReLU and Batch Normalization with the Exponential Linear Unit (ELU) in Residual Networks, and shows that this not only speeds up learning in Residual Networks, but also improves the classification performance as the depth increases.
Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification
TLDR
This work proposes a Parametric Rectified Linear Unit (PReLU) that generalizes the traditional rectified unit and derives a robust initialization method that particularly considers the rectifier nonlinearities.
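PReLU itself is a one-line function, sketched here for concreteness (in the paper the slope a is a learned per-channel parameter; here it is just a scalar argument):

```python
import numpy as np

def prelu(x, a):
    """Parametric ReLU: identity for positive inputs, learnable
    slope a for negative inputs. a = 0 recovers plain ReLU,
    a = 1 recovers the identity."""
    return np.where(x > 0, x, a * x)
```

Because a is learned rather than fixed, the network can choose how much negative signal to let through in each channel, at negligible extra parameter cost.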
Highway and Residual Networks learn Unrolled Iterative Estimation
TLDR
It is demonstrated that an alternative viewpoint based on unrolled iterative estimation -- a group of successive layers iteratively refine their estimates of the same features instead of computing an entirely new representation -- leads to the construction of Highway and Residual networks.
Network In Network
TLDR
With enhanced local modeling via the micro network, the proposed deep network structure NIN is able to utilize global average pooling over feature maps in the classification layer, which is easier to interpret and less prone to overfitting than traditional fully connected layers.
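The global average pooling used by NIN in place of fully connected layers is simple enough to sketch directly (an illustrative reduction; function name and shapes are assumed):

```python
import numpy as np

def global_avg_pool(feature_maps):
    """Collapse each (H, W) feature map to its spatial mean, giving one
    scalar per map -- in NIN these feed softmax directly, so each final
    feature map can be read as a confidence map for one class."""
    return feature_maps.mean(axis=(1, 2))
```

Since pooling has no parameters, it removes the large fully connected classifier head, which is where the interpretability and overfitting benefits in the snippet come from.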
Identity Mappings in Deep Residual Networks
TLDR
The propagation formulations behind the residual building blocks suggest that the forward and backward signals can be directly propagated from one block to any other block, when using identity mappings as the skip connections and after-addition activation.
Going deeper with convolutions
We propose a deep convolutional neural network architecture codenamed Inception that achieves the new state of the art for classification and detection in the ImageNet Large-Scale Visual Recognition Challenge 2014 (ILSVRC14).