Corpus ID: 203902337

ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks

@article{Wang2019ECANetEC,
  title={ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks},
  author={Qilong Wang and Banggu Wu and Pengfei Zhu and Peihua Li and Wangmeng Zuo and Qinglei Hu},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.03151}
}
  • Qilong Wang, Banggu Wu, Pengfei Zhu, Peihua Li, Wangmeng Zuo, Qinglei Hu
  • Published 2019
  • Computer Science
  • ArXiv
  • Channel attention has recently been shown to offer great potential for improving the performance of deep convolutional neural networks (CNNs). [...] Key Method: In particular, we propose an Efficient Channel Attention (ECA) module, which involves only $k$ ($k < 9$) parameters but brings a clear performance gain. By revisiting the channel attention module in SENet, we empirically show that avoiding dimensionality reduction and appropriate cross-channel interaction are important for learning effective channel attention.
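
  The mechanism described above (global average pooling, then local cross-channel interaction without dimensionality reduction, realized in the paper as a fast 1D convolution of kernel size k, followed by a sigmoid gate) fits in a very small module. The PyTorch sketch below is illustrative only: the class name, the fixed default k=3, and the omission of the paper's adaptive kernel-size rule are assumptions of this sketch, not the authors' reference implementation.

    import torch
    import torch.nn as nn

    class ECA(nn.Module):
        """Minimal Efficient Channel Attention sketch: GAP + 1D conv across channels + sigmoid gate."""

        def __init__(self, k: int = 3):  # k = 3 is an illustrative default
            super().__init__()
            # A single 1D convolution over the channel dimension models local
            # cross-channel interaction with only k weights (bias disabled),
            # avoiding the dimensionality reduction used in SENet's bottleneck MLP.
            self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
            self.sigmoid = nn.Sigmoid()

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (B, C, H, W)
            y = x.mean(dim=(2, 3))              # global average pooling -> (B, C)
            y = self.conv(y.unsqueeze(1))       # channels as a 1D sequence -> (B, 1, C)
            w = self.sigmoid(y).squeeze(1)      # per-channel weights in (0, 1) -> (B, C)
            return x * w.view(x.size(0), -1, 1, 1)  # rescale the input channel-wise

    # Usage: attach after any convolutional block.
    feats = torch.randn(2, 64, 32, 32)
    out = ECA(k=3)(feats)
    assert out.shape == feats.shape

  With k = 3 the module adds exactly three weights per attention block, consistent with the "$k$ ($k < 9$) parameters" claim above; the published paper derives k adaptively from the channel count rather than fixing it.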

    Citations

    Publications citing this paper.
    Showing 6 of 16 citations.

    HMANet: Hybrid Multiple Attention Network for Semantic Segmentation in Aerial Images (cites background)

    Sampled Training and Node Inheritance for Fast Evolutionary Neural Architecture Search (cites background & methods; highly influenced)

    Lossless Attention in Convolutional Networks for Facial Expression Recognition in the Wild (cites background; highly influenced)

    Robust License Plate Recognition With Shared Adversarial Training Network (cites methods & background; highly influenced)

    Weakly Supervised Lesion Co-Segmentation on CT Scans

    PRI Modulation Recognition Based on Squeeze-and-Excitation Networks (cites results; highly influenced)

    References

    Publications referenced by this paper.
    Showing 7 of 50 references.

    Squeeze-and-Excitation Networks. Jie Hu, Li Shen, Samuel Albanie, Gang Sun, Enhua Wu. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020.

    A2-Nets: Double Attention Networks (highly influential)

    CBAM: Convolutional Block Attention Module (highly influential)

    clcNet: Improving the Efficiency of Convolutional Neural Network Using Channel Local Convolutions. Dong-Qing Zhang. IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2018.

    Factorized Bilinear Models for Image Recognition

    Deep Roots: Improving CNN Efficiency with Hierarchical Filter Groups

    Attention Augmented Convolutional Networks (highly influential)