Learning to Rank Short Text Pairs with Convolutional Deep Neural Networks

@inproceedings{Severyn2015LearningTR,
  title={Learning to Rank Short Text Pairs with Convolutional Deep Neural Networks},
  author={Aliaksei Severyn and Alessandro Moschitti},
  booktitle={Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval},
  year={2015}
}
  • Aliaksei Severyn, Alessandro Moschitti
  • Published 2015
  • Computer Science
  • Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval
  • Learning a similarity function between pairs of objects is at the core of learning to rank approaches. In information retrieval tasks we typically deal with query-document pairs; in question answering, with question-answer pairs. However, before learning can take place, such pairs need to be mapped from the original space of symbolic words into some feature space encoding various aspects of their relatedness, e.g. lexical, syntactic and semantic. Feature engineering is often a laborious task and…
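
  As a concrete, hedged illustration of the kind of model the abstract alludes to, the PyTorch sketch below encodes each short text with a single convolutional layer followed by max-over-time pooling, scores the pair with a learned bilinear similarity, and feeds both sentence vectors together with that score into a small MLP that outputs a relevance score. All layer sizes, the bilinear join, and the toy usage at the end are illustrative assumptions, not the authors' reference implementation.

    # Minimal sketch, assuming PyTorch; hyperparameters are illustrative only.
    import torch
    import torch.nn as nn

    class SentenceEncoder(nn.Module):
        """Map a padded sequence of word ids to a fixed-size sentence vector."""
        def __init__(self, vocab_size, emb_dim=50, num_filters=100, window=5):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size=window,
                                  padding=window // 2)

        def forward(self, ids):                    # ids: (batch, seq_len)
            x = self.embed(ids).transpose(1, 2)    # (batch, emb_dim, seq_len)
            x = torch.relu(self.conv(x))           # (batch, num_filters, seq_len)
            return x.max(dim=2).values             # max-over-time pooling

    class PairRanker(nn.Module):
        """Score a (query, candidate) pair; higher score means more relevant."""
        def __init__(self, vocab_size, num_filters=100, hidden=200):
            super().__init__()
            self.encoder = SentenceEncoder(vocab_size, num_filters=num_filters)
            # Learned bilinear similarity between the two sentence vectors.
            self.sim = nn.Bilinear(num_filters, num_filters, 1)
            # Join layer: both vectors plus their similarity feed a small MLP.
            self.mlp = nn.Sequential(
                nn.Linear(2 * num_filters + 1, hidden),
                nn.ReLU(),
                nn.Linear(hidden, 1),
            )

        def forward(self, query_ids, cand_ids):
            q = self.encoder(query_ids)
            c = self.encoder(cand_ids)
            joined = torch.cat([q, self.sim(q, c), c], dim=1)
            return self.mlp(joined).squeeze(1)     # (batch,) relevance scores

    # Toy forward pass: score two candidate answers for the same query.
    model = PairRanker(vocab_size=1000)
    query = torch.randint(1, 1000, (2, 20))        # query repeated for 2 candidates
    cands = torch.randint(1, 1000, (2, 30))
    print(model(query, cands))                     # two unnormalized relevance scores

  In a learning-to-rank setting these scores would be trained with a ranking objective (for example a pairwise hinge loss or a cross-entropy over the candidates for each query); the snippet above only demonstrates the forward pass.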

    Citations

    Publications citing this paper.
    SHOWING 1-10 OF 464 CITATIONS

    An Experimental Analysis of Multi-Perspective Convolutional Neural Networks

    CITES METHODS & BACKGROUND
    HIGHLY INFLUENCED

    Attention-based neural network for short-text question answering

    CITES METHODS & BACKGROUND
    HIGHLY INFLUENCED

    Knowledge-aware Attentive Neural Network for Ranking Question Answer Pairs

    CITES METHODS & BACKGROUND
    HIGHLY INFLUENCED

    Ranking Paragraphs for Improving Answer Recall in Open-Domain Question Answering

    CITES METHODS & BACKGROUND
    HIGHLY INFLUENCED

    Temporal Context Modeling for Text Streams

    CITES METHODS & BACKGROUND
    HIGHLY INFLUENCED

    Neural information retrieval: at the end of the early years

    CITES BACKGROUND
    HIGHLY INFLUENCED

    End to End Long Short Term Memory Networks for Non-Factoid Question Answering

    CITES BACKGROUND, METHODS & RESULTS
    HIGHLY INFLUENCED

    Interactive knowledge-enhanced attention network for answer selection

    CITES METHODS & BACKGROUND
    HIGHLY INFLUENCED

    A Stacked BiLSTM Neural Network Based on Coattention Mechanism for Question Answering

    CITES RESULTS & BACKGROUND
    HIGHLY INFLUENCED

    Multi-Scale Deformable CNN for Answer Selection

    CITES METHODS & BACKGROUND
    HIGHLY INFLUENCED

    CITATION STATISTICS

    • 64 Highly Influenced Citations

    • Averaged 113 Citations per year from 2017 through 2019

    References

    Publications referenced by this paper.
    SHOWING 1-6 OF 6 REFERENCES

    Deep Learning for Answer Sentence Selection

    HIGHLY INFLUENTIAL

    ADADELTA: An Adaptive Learning Rate Method

    HIGHLY INFLUENTIAL

    What is the Jeopardy Model? A Quasi-Synchronous Grammar for QA

    HIGHLY INFLUENTIAL

    Convolutional Neural Networks for Sentence Classification

    HIGHLY INFLUENTIAL