Rethinking the recommender research ecosystem: reproducibility, openness, and LensKit

@inproceedings{ekstrand2011rethinking,
  title={Rethinking the recommender research ecosystem: reproducibility, openness, and LensKit},
  author={Michael D. Ekstrand and M. Ludwig and J. Konstan and J. Riedl},
  booktitle={RecSys '11},
  year={2011}
}
  • Published in RecSys '11, 2011
  • Recommender systems research is being slowed by the difficulty of replicating and comparing research results. Published research uses various experimental methodologies and metrics that are difficult to compare. It also often fails to sufficiently document the details of proposed algorithms or the evaluations employed. Researchers waste time reimplementing well-known algorithms, and the new implementations may miss key details from the original algorithm or its subsequent refinements. When…
    149 Citations
    • Towards reproducibility in recommender-systems research (57 citations; highly influenced)
    • Comparative recommender system evaluation: benchmarking recommendation frameworks (136 citations)
    • Mix and Rank: A Framework for Benchmarking Recommender Systems
    • OpenRec: A Modular Framework for Extensible and Adaptable Recommendation Algorithms (25 citations)
    • Toward identification and adoption of best practices in algorithmic recommender systems research (17 citations)
    • Reproducibility of Experiments in Recommender Systems Evaluation (4 citations)


    References
    • Evaluating collaborative filtering over time (48 citations; highly influential)
    • Evaluating the dynamic properties of recommendation algorithms (30 citations; highly influential)
    • Improving regularized singular value decomposition for collaborative filtering (657 citations; highly influential)
    • Netflix update: Try this at home (~simon/journal/20061211.html, 2006)