DeText: A Deep Text Ranking Framework with BERT

@inproceedings{Guo2020DeTextAD,
  title={DeText: A Deep Text Ranking Framework with BERT},
  author={Weiwei Guo and X. Liu and Sida Wang and Huiji Gao and A. Sankar and Z. Yang and Q. Guo and Libao Zhang and B. Long and Bee-Chung Chen and Deepak Agarwal},
  booktitle={Proceedings of the 29th ACM International Conference on Information \& Knowledge Management},
  year={2020}
}
Ranking is the most important component in a search system. Most search systems deal with large amounts of natural language data, hence an effective ranking system requires a deep understanding of text semantics. Recently, deep learning based natural language processing (deep NLP) models have generated promising results on ranking systems. BERT is one of the most successful models that learn contextual embedding, which has been applied to capture complex query-document relations for search…
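The paper's own models are not reproduced here, but the general idea the abstract describes — embedding the query and each document, then ranking documents by vector similarity — can be sketched in a few lines. This is a toy illustration only: a bag-of-words embedding stands in for a deep encoder such as BERT so the sketch runs without model weights, and all function names are made up for this example.

```python
# Toy sketch of embedding-based query-document relevance ranking.
# A bag-of-words vector stands in for a learned text encoder.
from collections import Counter
import math

def embed(text):
    """Map text to a sparse term-count vector (stand-in for a BERT encoder)."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(u[t] * v[t] for t in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def rank(query, documents):
    """Score each document against the query; return (score, doc) pairs, best first."""
    q = embed(query)
    scored = [(cosine(q, embed(d)), d) for d in documents]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)

docs = ["deep text ranking with BERT",
        "gradient boosted trees for tabular data"]
for score, doc in rank("text ranking", docs):
    print(f"{score:.2f}  {doc}")
```

A contextual encoder would replace `embed` with a model that maps the whole query (or query-document pair, in a cross-encoder) to a dense vector, which is what lets BERT-style rankers capture relations that term overlap misses.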
