Adversarial Cross-Modal Retrieval

@inproceedings{Wang2017AdversarialCR,
  title={Adversarial Cross-Modal Retrieval},
  author={Bokun Wang and Yang Yang and Xing Xu and Alan Hanjalic and Heng Tao Shen},
  booktitle={ACM Multimedia},
  year={2017}
}
Cross-modal retrieval aims to enable a flexible retrieval experience across different modalities (e.g., text vs. images). The core of cross-modal retrieval research is to learn a common subspace in which items from different modalities can be compared directly. In this paper, we present a novel Adversarial Cross-Modal Retrieval (ACMR) method, which seeks an effective common subspace based on adversarial learning. Adversarial learning is implemented as an interplay between two…
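The adversarial interplay described above can be sketched as a toy experiment: two linear feature projectors map image and text features into a common subspace, a logistic modality classifier tries to tell the modalities apart there, and the projectors are updated both to align paired items and to confuse that classifier. This is a minimal illustrative sketch, not the authors' implementation; all dimensions, learning rates, and the loss weighting `lam` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d_img, d_txt, d_common = 8, 6, 4
n = 32

# Paired toy data: image and text features derived from a shared latent code.
latent = rng.normal(size=(n, d_common))
X_img = latent @ rng.normal(size=(d_common, d_img)) + 0.1 * rng.normal(size=(n, d_img))
X_txt = latent @ rng.normal(size=(d_common, d_txt)) + 0.1 * rng.normal(size=(n, d_txt))

# Linear feature projectors into the common subspace.
W_img = rng.normal(scale=0.1, size=(d_img, d_common))
W_txt = rng.normal(scale=0.1, size=(d_txt, d_common))
# Logistic modality classifier on the common subspace (label 1 = image, 0 = text).
w_cls = rng.normal(scale=0.1, size=d_common)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, lam = 0.05, 1.0  # assumed step size and adversarial weight
for _ in range(200):
    Z_img, Z_txt = X_img @ W_img, X_txt @ W_txt

    # Classifier step: descend cross-entropy to discriminate modalities.
    p_img, p_txt = sigmoid(Z_img @ w_cls), sigmoid(Z_txt @ w_cls)
    grad_cls = Z_img.T @ (p_img - 1.0) / n + Z_txt.T @ p_txt / n
    w_cls -= lr * grad_cls

    # Projector step: pull paired items together (alignment term) while
    # ascending the classifier loss (confusing the modality classifier).
    diff = Z_img - Z_txt
    g_img = X_img.T @ diff / n
    g_txt = -X_txt.T @ diff / n
    g_img -= lam * X_img.T @ ((p_img - 1.0)[:, None] * w_cls[None, :]) / n
    g_txt -= lam * X_txt.T @ (p_txt[:, None] * w_cls[None, :]) / n
    W_img -= lr * g_img
    W_txt -= lr * g_txt

# After training, paired projections should lie closer together in the
# common subspace than randomly re-paired ones.
Z_img, Z_txt = X_img @ W_img, X_txt @ W_txt
paired = float(np.mean(np.sum((Z_img - Z_txt) ** 2, axis=1)))
shuffled = float(np.mean(np.sum((Z_img - Z_txt[rng.permutation(n)]) ** 2, axis=1)))
print(paired < shuffled)
```

The full ACMR method additionally uses label information and triplet constraints in the feature projector; this sketch keeps only the adversarial projector-vs-classifier structure.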

