Critically Examining the "Neural Hype": Weak Baselines and the Additivity of Effectiveness Gains from Neural Ranking Models

@inproceedings{Yang2019CriticallyET,
  title={Critically Examining the "Neural Hype": Weak Baselines and the Additivity of Effectiveness Gains from Neural Ranking Models},
  author={Wei Yang and Kuang Lu and Peilin Yang and Jimmy J. Lin},
  booktitle={Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval},
  year={2019}
}
  • Wei Yang, Kuang Lu, Peilin Yang, Jimmy J. Lin
  • Published 2019
  • Computer Science
  • Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval
Is neural IR mostly hype? [...] Key Result: A significant improvement was observed for one of the models, demonstrating additivity in gains. While there appears to be merit to neural IR approaches, at least some of the gains reported in the literature appear illusory.
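
The additivity question boils down to whether a neural re-ranker still yields a statistically significant per-topic improvement when applied on top of a strong, well-tuned baseline (e.g., BM25 + RM3) rather than a weak one. Below is a minimal sketch of such a check, assuming per-topic average precision has been exported in trec_eval -q style lines ("map <topic-id> <score>"); the file names are hypothetical placeholders.

```python
# Paired significance test for additivity of neural re-ranking gains.
# Assumption: two trec_eval -q output files with per-topic "map" lines.
from scipy import stats

def per_topic_map(path):
    """Collect per-topic AP scores keyed by topic id, skipping the 'all' summary row."""
    scores = {}
    with open(path) as f:
        for line in f:
            metric, topic, value = line.split()
            if metric == "map" and topic != "all":
                scores[topic] = float(value)
    return scores

baseline = per_topic_map("bm25_rm3.eval")           # strong baseline run (hypothetical file)
reranked = per_topic_map("bm25_rm3_neural.eval")    # baseline + neural re-ranker (hypothetical file)

topics = sorted(set(baseline) & set(reranked))
b = [baseline[t] for t in topics]
r = [reranked[t] for t in topics]

# Gains are "additive" only if the improvement over the strong baseline is
# itself significant, not merely the improvement over a weak baseline.
t_stat, p_value = stats.ttest_rel(r, b)
print(f"mean AP: baseline={sum(b)/len(b):.4f}  +neural={sum(r)/len(r):.4f}  p={p_value:.4f}")
```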
Investigating the case of weak baselines in Ad-hoc Retrieval and Question Answering
Selective Weak Supervision for Neural Information Retrieval
Towards Axiomatic Explanations for Neural Ranking Models
Focal elements of neural information retrieval models. An outlook through a reproducibility study
Length Normalization in the Era of Neural Rankers
Curriculum Learning Strategies for IR: An Empirical Study on Conversation Response Ranking
Are Neural Rankers still Outperformed by Gradient Boosted Decision Trees?
Slice-Aware Neural Ranking
OpenNIR: A Complete Neural Ad-Hoc Ranking Pipeline
