Aggregated Residual Transformations for Deep Neural Networks

@inproceedings{Xie2016AggregatedRT,
  title={Aggregated Residual Transformations for Deep Neural Networks},
  author={Saining Xie and Ross B. Girshick and Piotr Doll{\'a}r and Zhuowen Tu and Kaiming He},
  booktitle={2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2017},
  pages={5987-5995}
}
We present a simple, highly modularized network architecture for image classification. Our network is constructed by repeating a building block that aggregates a set of transformations with the same topology. Our simple design results in a homogeneous, multi-branch architecture that has only a few hyper-parameters to set. This strategy exposes a new dimension, which we call cardinality (the size of the set of transformations), as an essential factor in addition to the dimensions of depth and…
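
The aggregated transformations described above are typically realized with a grouped convolution inside a bottleneck residual block. Below is a minimal PyTorch sketch of such a block, not the authors' reference implementation; the module name AggregatedBlock, the channel sizes (which follow the ResNeXt-50 32x4d template), and the use of groups=cardinality as the aggregation mechanism are illustrative assumptions.

import torch
import torch.nn as nn

class AggregatedBlock(nn.Module):
    """Bottleneck block that aggregates `cardinality` transformations of the
    same topology, realized here as one grouped 3x3 convolution (a sketch,
    not the paper's reference code)."""

    def __init__(self, in_channels=256, bottleneck_width=4, cardinality=32):
        super().__init__()
        inner = cardinality * bottleneck_width  # e.g. 32 * 4 = 128 channels
        self.transform = nn.Sequential(
            nn.Conv2d(in_channels, inner, kernel_size=1, bias=False),
            nn.BatchNorm2d(inner),
            nn.ReLU(inplace=True),
            # groups=cardinality splits this 3x3 convolution into `cardinality`
            # parallel paths that all share the same topology.
            nn.Conv2d(inner, inner, kernel_size=3, padding=1,
                      groups=cardinality, bias=False),
            nn.BatchNorm2d(inner),
            nn.ReLU(inplace=True),
            nn.Conv2d(inner, in_channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(in_channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Sum of the aggregated transformations plus the identity shortcut.
        return self.relu(x + self.transform(x))

if __name__ == "__main__":
    block = AggregatedBlock()
    y = block(torch.randn(1, 256, 56, 56))
    print(y.shape)  # torch.Size([1, 256, 56, 56])

With groups equal to the cardinality (32 here), the 3x3 convolution behaves as 32 parallel 4-channel branches with identical topology, which is the "cardinality" dimension the abstract introduces alongside depth and width.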

Citations

Publications citing this paper.
SHOWING 1-10 OF 1,138 CITATIONS, ESTIMATED 100% COVERAGE

CBNet: A Novel Composite Backbone Network Architecture for Object Detection

Yudong Liu, Yongtao Wang, +4 authors Haibin Ling
  • ArXiv
  • 2019
CITES BACKGROUND & METHODS · HIGHLY INFLUENCED · 7 EXCERPTS

Deep Attentive Features for Prostate Segmentation in 3D Transrectal Ultrasound

  • IEEE Transactions on Medical Imaging
  • 2019
CITES METHODS · HIGHLY INFLUENCED · 5 EXCERPTS

Dynamic Multi-path Neural Network

CITES METHODS & BACKGROUND · HIGHLY INFLUENCED · 6 EXCERPTS

Feature Selective Anchor-Free Module for Single-Shot Object Detection

CITES BACKGROUND & METHODS · HIGHLY INFLUENCED · 6 EXCERPTS

FoodAI: Food Image Recognition via Deep Learning for Smart Food Logging

CITES BACKGROUND & METHODS · HIGHLY INFLUENCED · 5 EXCERPTS

Greedy AutoAugment

CITES METHODS · HIGHLY INFLUENCED · 6 EXCERPTS

Late Fusion via Subspace Search With Consistency Preservation

  • IEEE Transactions on Image Processing
  • 2019
CITES METHODS · HIGHLY INFLUENCED · 8 EXCERPTS

Libra R-CNN: Towards Balanced Learning for Object Detection

  • CVPR
  • 2019
CITES METHODS · HIGHLY INFLUENCED · 4 EXCERPTS

CITATION STATISTICS

  • 255 Highly Influenced Citations

  • Averaged 377 Citations per year from 2017 through 2019

  • 16% Increase in citations per year in 2019 over 2018

References

Publications referenced by this paper.
SHOWING 1-10 OF 40 REFERENCES

Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning

C. Szegedy, S. Ioffe, V. Vanhoucke
  • ICLR Workshop
  • 2016
HIGHLY INFLUENTIAL · 9 EXCERPTS

Training and investigating Residual Nets

S. Gross, M. Wilber
  • https://github.com/facebook/fb.resnet.torch
  • 2016
HIGHLY INFLUENTIAL · 4 EXCERPTS

ImageNet Large Scale Visual Recognition Challenge

  • International Journal of Computer Vision
  • 2014
HIGHLY INFLUENTIAL · 4 EXCERPTS

Neural Machine Translation in Linear Time

N. Kalchbrenner, L. Espeholt, K. Simonyan, A. v. d. Oord, A. Graves, K. Kavukcuoglu
  • arXiv:1610.10099
  • 2016
1 EXCERPT

Deep Roots: Improving CNN Efficiency with Hierarchical Filter Groups

  • 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2016
3 EXCERPTS