Evaluation Measures for Relevance and Credibility in Ranked Lists
Web content is a primary source of information for many users. However, due to the open nature of today's web, anyone can produce and publish content, which is therefore not always reliable. As such, mechanisms to evaluate the credibility of web content are needed. In this paper, we describe CredibleWeb, a prototype crowdsourcing platform for web content evaluation with a two-fold goal: (1) to build a socially enhanced, large-scale dataset of credibility-labeled web pages that enables the evaluation of different strategies for web credibility prediction, and (2) to investigate how various design elements help engage users in actively evaluating the credibility of web pages. We outline the challenges related to the design of a crowdsourcing platform for web credibility evaluation and describe our initial efforts.