Recurrent Transformer Networks for Semantic Correspondence

Seungryong Kim, Stephen Lin, Sangryul Jeon, Dongbo Min, Kwanghoon Sohn
We present recurrent transformer networks (RTNs) for obtaining dense correspondences between semantically similar images. Our networks accomplish this through an iterative process of estimating spatial transformations between the input images and using these transformations to generate aligned convolutional activations. By directly estimating the transformations between an image pair, rather than employing spatial transformer networks to independently normalize each individual image, we show…
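The iterative alignment idea described above can be illustrated with a deliberately simplified sketch. The paper's RTNs regress transformation fields from learned convolutional features; the toy version below is an assumption-laden stand-in that uses raw arrays as "features", restricts the transformation to integer translations, and refines the estimate greedily over a small search window at each recurrent step. None of the function names come from the paper.

```python
import numpy as np

def estimate_offset(src, tgt, search=1):
    """Toy transformation estimator: pick the integer (dy, dx) in
    [-search, search]^2 whose circular shift of src best correlates with tgt."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            score = np.sum(np.roll(src, (dy, dx), axis=(0, 1)) * tgt)
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

def recurrent_align(src, tgt, iters=5):
    """Recurrently refine the transformation: at each step, estimate a small
    residual offset between the currently warped source and the target, fold it
    into the running estimate, and re-warp the source with the updated total."""
    total = np.array([0, 0])
    warped = src
    for _ in range(iters):
        dy, dx = estimate_offset(warped, tgt, search=1)
        total += (dy, dx)
        warped = np.roll(src, tuple(total), axis=(0, 1))
    return tuple(total), warped

# Example: a smooth feature map shifted by (2, 3) is recovered in a few steps,
# even though each step only searches offsets of magnitude 1.
yy, xx = np.mgrid[0:16, 0:16]
src = np.exp(-((yy - 8) ** 2 + (xx - 8) ** 2) / 20.0)
tgt = np.roll(src, (2, 3), axis=(0, 1))
offset, warped = recurrent_align(src, tgt, iters=5)
```

The point of the sketch is the recurrence: each iteration estimates only a small residual transformation, but composing residuals across iterations recovers a displacement larger than any single search window — the same divide-and-refine principle the abstract describes, minus the learned feature extraction and the richer (affine/TPS-style) transformation model.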
