
Overview of EIREX 2012: Social Media

Julián Urbano, Mónica Marrero, Diego Martín, Jorge Luis Morato Lara
The third Information Retrieval Education through EXperimentation track (EIREX 2012) was run at the University Carlos III of Madrid during the 2012 spring semester. EIREX 2012 is the third in a series of experiments designed to foster new Information Retrieval (IR) education methodologies and resources, with the specific goal of teaching undergraduate IR courses from an experimental perspective. For an introduction to the motivation behind the EIREX experiments, see the first sections of… 



Overview of EIREX 2010: Computing
This overview paper summarizes the results of the EIREX 2010 track, focusing on the creation of the test collection and the analysis to assess its reliability.
Overview of EIREX 2011: Crowdsourcing
This overview paper summarizes the results of the EIREX 2011 track, focusing on the creation of the test collection and the analysis to assess its reliability.
Bringing undergraduate students closer to a real-world information retrieval setting: methodology and resources
A pilot experiment to update the program of an Information Retrieval course for Computer Science undergraduates shows that this methodology is reliable and feasible; the authors plan to improve it and keep using it in the coming years, leading to a public repository of resources for Information Retrieval courses.
Information Retrieval Meta-Evaluation: Challenges and Opportunities in the Music Domain
A survey of past meta-evaluation work in the context of Text Information Retrieval argues that the music community still needs to address various issues concerning the evaluation of music systems and the IR cycle, pointing out directions for further research and proposals along these lines.
Crawling the web for structured documents
This demo describes a distributed, focused web crawler for any kind of structured document, and shows how to exploit general-purpose resources to gather large amounts of real-world structured documents off the Web.
The University Carlos III of Madrid at TREC 2011 Crowdsourcing Track
The participation of the uc3m team in both tasks of the TREC 2011 Crowdsourcing Track is described; according to the NIST gold labels, the runs performed very well in both tasks, ranking at the top for most measures.
How reliable are the results of large-scale information retrieval experiments?
A detailed empirical investigation of the TREC results shows that the measured relative performance of systems appears to be reliable, but that recall is overestimated: it is likely that many relevant documents have not been found.
The Philosophy of Information Retrieval Evaluation
The fundamental assumptions and appropriate uses of the Cranfield paradigm, especially as they apply in the context of the evaluation conferences, are reviewed.
Recuperación y Acceso a la Información (Information Retrieval and Access), 2010
TREC: Experiment and evaluation in information retrieval
This book presents the history, methodology, and evaluation results of the Text REtrieval Conference (TREC), the long-running series of large-scale information retrieval experiments.