Incentives for Truthful Evaluations

Luca de Alfaro, Marco Faella, Vassilis Polychronopoulos, Michael Shavlovsky
We consider crowdsourcing problems in which workers are asked to provide evaluations of items; the worker evaluations are then used to estimate the true quality of the items. Absent an incentive scheme, workers have no motive to put effort into the evaluations and may instead provide inaccurate answers. We show that a simple approach of providing incentives by assessing randomly chosen workers is not scalable: to guarantee an incentive to be truthful, the number of workers that the…
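The scalability claim about random spot-checks can be illustrated with a back-of-the-envelope model. In the sketch below, every name and number is an illustrative assumption, not taken from the paper: a worker saves effort cost `c` by answering inaccurately, and loses penalty `F` if audited while inaccurate; a supervisor audits `B` of `n` workers uniformly at random, so the audit probability is `p = B / n`. Truthfulness is incentivized only when the expected loss from cheating, `p * F`, is at least `c`.

```python
import math

def min_audits(n_workers: int, effort_cost: float, penalty: float) -> int:
    """Smallest audit count B such that (B / n_workers) * penalty >= effort_cost,
    i.e. the expected loss from cheating outweighs the effort saved.

    Hypothetical model for illustration; parameters are assumptions,
    not values from the paper.
    """
    return math.ceil(n_workers * effort_cost / penalty)

if __name__ == "__main__":
    c, F = 1.0, 10.0  # assumed effort cost and penalty
    for n in (100, 1_000, 10_000):
        print(n, min_audits(n, c, F))
```

Under these assumptions the required number of audits grows linearly with the number of workers (10, 100, 1000 for the populations above), which is the kind of non-scalability the abstract points to: the supervisor's assessment burden cannot stay fixed as the crowd grows.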
