Corpus ID: 85498136

Decomposed Attention: Self-Attention with Linear Complexities

@inproceedings{Shen2018DecomposedAS,
  title={Decomposed Attention: Self-Attention with Linear Complexities},
  author={Zhuoran Shen and Mingyuan Zhang and Haiyu Zhao and Shuai Yi and Hongsheng Li},
  year={2018}
}
Recent works have applied self-attention to various fields in computer vision and natural language processing. However, the memory and computational demands of existing self-attention operations grow quadratically with the spatiotemporal size of the input. This prohibits the application of self-attention to large inputs, e.g., long sequences, high-definition images, or large videos. To remedy this drawback, this paper proposes a novel decomposed attention (DA) module with substantially…
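The abstract contrasts standard self-attention, whose n × n attention map costs O(n²) memory, with the proposed linear-complexity variant. As a rough illustration of how reordering the matrix products removes the quadratic term, here is a minimal NumPy sketch; the exact normalization used by the paper's DA module is not given in this excerpt, so the choice below (softmax over the feature axis for queries and over the position axis for keys) is an assumption, not the authors' definition.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def standard_attention(Q, K, V):
    # Materializes the full (n, n) attention map: O(n^2) memory and compute.
    A = softmax(Q @ K.T / np.sqrt(Q.shape[1]), axis=1)  # (n, n)
    return A @ V                                        # (n, d_v)

def decomposed_attention(Q, K, V):
    # Hypothetical sketch of the linear-complexity idea: normalize Q and K
    # separately, then compute K^T V first so no (n, n) map is ever formed.
    # Cost is O(n * d_k * d_v), linear in sequence length n.
    q = softmax(Q, axis=1)   # (n, d_k), normalized over features (assumption)
    k = softmax(K, axis=0)   # (n, d_k), normalized over positions (assumption)
    context = k.T @ V        # (d_k, d_v) global context summary
    return q @ context       # (n, d_v)
```

Both functions return an output of the same shape, but only the decomposed version avoids allocating memory quadratic in the input size, which is what makes attention over long sequences or large feature maps feasible.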
