Recurrent Transformer Networks for Semantic Correspondence

@article{Kim2018RecurrentTN,
  title={Recurrent Transformer Networks for Semantic Correspondence},
  author={Seungryong Kim and Stephen Lin and Sangryul Jeon and Dongbo Min and Kwanghoon Sohn},
  journal={CoRR},
  year={2018},
  volume={abs/1810.12155}
}
We present recurrent transformer networks (RTNs) for obtaining dense correspondences between semantically similar images. Our networks accomplish this through an iterative process of estimating spatial transformations between the input images and using these transformations to generate aligned convolutional activations. By directly estimating the transformations between an image pair, rather than employing spatial transformer networks to independently normalize each individual image, we show…
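The iterative estimate-and-warp loop described in the abstract can be illustrated in miniature. The sketch below is a toy analogue, not the paper's architecture: it replaces the learned transformation regressor with a brute-force 1-D shift search over feature signals (the names `estimate_shift` and `recurrent_align` are hypothetical), but it shows the core idea of repeatedly estimating a residual transformation between source and target activations and warping the source with it, so that the accumulated alignment can exceed what a single estimation step could reach.

```python
import numpy as np

def estimate_shift(f_src, f_tgt, max_disp=3):
    """Brute-force search for the integer shift that best aligns f_src to f_tgt.

    Stand-in for the learned transformation estimator: the search range
    max_disp limits how large a residual transformation one step can find.
    """
    best_d, best_err = 0, np.inf
    for d in range(-max_disp, max_disp + 1):
        err = np.mean((np.roll(f_src, d) - f_tgt) ** 2)
        if err < best_err:
            best_d, best_err = d, err
    return best_d

def recurrent_align(f_src, f_tgt, iters=4):
    """Recurrently estimate a residual shift and warp the source features.

    Each iteration estimates a transformation between the *current* warped
    activations and the target, then applies it -- small per-step estimates
    accumulate into a larger total alignment.
    """
    total_shift = 0
    warped = f_src.copy()
    for _ in range(iters):
        d = estimate_shift(warped, f_tgt)  # estimate residual transformation
        if d == 0:                         # converged: no residual left
            break
        warped = np.roll(warped, d)        # warp activations with the estimate
        total_shift += d
    return total_shift, warped
```

For example, a smooth feature signal displaced by 5 positions is recovered in two iterations even though each `estimate_shift` call only searches shifts up to 3.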

