Recurrent Transformer Networks for Semantic Correspondence

@article{Kim2018RecurrentTN,
  title={Recurrent Transformer Networks for Semantic Correspondence},
  author={Seungryong Kim and Stephen Lin and Sangryul Jeon and Dongbo Min and Kwanghoon Sohn},
  journal={CoRR},
  year={2018},
  volume={abs/1810.12155}
}
Abstract

We present recurrent transformer networks (RTNs) for obtaining dense correspondences between semantically similar images. Our networks accomplish this through an iterative process of estimating spatial transformations between the input images and using these transformations to generate aligned convolutional activations. By directly estimating the transformations between an image pair, rather than employing spatial transformer networks to independently normalize each individual image, we show…
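
The iterative alignment loop described in the abstract can be pictured with a short sketch. The PyTorch snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the feature extractor, the residual-flow estimator, the use of grid_sample for warping, and all layer sizes are choices made for the example, and the paper's own transformation parameterization and training objective are not reproduced here.

# Minimal sketch of the recurrent idea from the abstract: extract features from
# both images, estimate a spatial transformation directly between the pair,
# warp the source features with it, and repeat the estimation on the aligned
# activations. Module names and the flow-field parameterization are assumptions
# for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentAligner(nn.Module):
    def __init__(self, feat_dim=256, iterations=4):
        super().__init__()
        self.iterations = iterations
        # Hypothetical shared feature extractor (stand-in for a CNN backbone).
        self.features = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, stride=4, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU(),
        )
        # Hypothetical estimator: takes the concatenated (warped source, target)
        # features and predicts a residual 2-D displacement field for the pair.
        self.estimator = nn.Sequential(
            nn.Conv2d(2 * feat_dim, 128, 3, padding=1), nn.ReLU(),
            nn.Conv2d(128, 2, 3, padding=1),
        )

    def forward(self, src_img, tgt_img):
        f_src = self.features(src_img)
        f_tgt = self.features(tgt_img)
        b, _, h, w = f_src.shape
        # Identity sampling grid in normalized [-1, 1] coordinates.
        identity = torch.eye(2, 3, device=src_img.device).unsqueeze(0).expand(b, -1, -1)
        grid = F.affine_grid(identity, size=f_src.shape, align_corners=False)
        flow = torch.zeros(b, 2, h, w, device=src_img.device)
        for _ in range(self.iterations):
            # Warp the source features with the current transformation estimate.
            warped = F.grid_sample(
                f_src, grid + flow.permute(0, 2, 3, 1), align_corners=False)
            # Re-estimate the transformation from the aligned pair (recurrence).
            flow = flow + self.estimator(torch.cat([warped, f_tgt], dim=1))
        return flow

In this sketch the transformation is re-estimated from already-warped source features at every step, which mirrors the abstract's point of estimating transformations between the image pair directly rather than normalizing each image independently.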


