Crowdsourcing is an effective tool for scalable data annotation in both research and enterprise contexts. Because of crowdsourcing's open participation model, quality assurance is critical to the success of any project. Existing methods rely on EM-style post-processing or on manually annotating large gold standard sets. In this paper we present an automated …
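The snippet names EM-style post-processing without further detail. As a point of reference, here is a minimal sketch of that family of methods (a simplified Dawid-Skene-style iteration over worker accuracies), assuming categorical labels; it is not the paper's own algorithm, and all names (`em_aggregate`, `labels`, `truth`) are illustrative.

```python
from collections import defaultdict

def em_aggregate(labels, n_iter=20):
    """EM-style aggregation of crowd labels (simplified Dawid-Skene sketch).

    labels: list of (worker_id, item_id, label) tuples.
    Returns a dict mapping item_id -> estimated true label.
    """
    items = defaultdict(list)
    for worker, item, label in labels:
        items[item].append((worker, label))

    # Initialize item estimates by simple majority vote.
    def majority(votes):
        counts = defaultdict(int)
        for _, label in votes:
            counts[label] += 1
        return max(counts, key=counts.get)

    truth = {item: majority(votes) for item, votes in items.items()}

    for _ in range(n_iter):
        # M-step: estimate each worker's accuracy against the current truth.
        correct, total = defaultdict(float), defaultdict(float)
        for worker, item, label in labels:
            total[worker] += 1
            if label == truth[item]:
                correct[worker] += 1
        accuracy = {w: correct[w] / total[w] for w in total}

        # E-step: re-estimate truth, weighting each vote by worker accuracy.
        new_truth = {}
        for item, votes in items.items():
            scores = defaultdict(float)
            for worker, label in votes:
                scores[label] += accuracy[worker]
            new_truth[item] = max(scores, key=scores.get)

        if new_truth == truth:  # converged
            break
        truth = new_truth
    return truth
```

The design intuition is the one the abstract alludes to: alternating between estimating true labels from weighted votes and estimating worker reliability from agreement with those labels removes the need for a large manually annotated gold set.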
The use of crowdsourcing platforms like Amazon Mechanical Turk for evaluating the relevance of search results has become an effective strategy that yields results quickly and inexpensively. One approach to ensuring the quality of worker judgments is to include an initial training period and the subsequent sporadic insertion of predefined gold standard data (training …
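The mechanism this snippet describes, interleaving hidden gold standard tasks into a worker's stream and gating workers on their gold accuracy, can be sketched as follows. This is a hedged illustration of the general technique, not the platform's implementation; `build_task_stream`, `worker_passes`, `gold_rate`, and `min_accuracy` are all hypothetical names and parameters.

```python
import random

def build_task_stream(real_tasks, gold_tasks, gold_rate=0.1, seed=0):
    """Interleave hidden gold-standard checks into the real task stream.

    gold_rate: approximate fraction of served tasks that are gold checks.
    Returns a list of ("gold" | "real", task) pairs.
    """
    rng = random.Random(seed)
    stream = []
    for task in real_tasks:
        if gold_tasks and rng.random() < gold_rate:
            stream.append(("gold", rng.choice(gold_tasks)))
        stream.append(("real", task))
    return stream

def worker_passes(responses, gold_answers, min_accuracy=0.8):
    """Gate a worker on accuracy over the gold tasks they have answered.

    responses: dict task_id -> worker answer.
    gold_answers: dict task_id -> known correct answer.
    """
    graded = [responses[t] == ans
              for t, ans in gold_answers.items() if t in responses]
    # A worker who has seen no gold tasks yet is treated as unvetted.
    if not graded:
        return False
    return sum(graded) / len(graded) >= min_accuracy
```

Because the gold tasks are indistinguishable from real ones, workers cannot selectively apply effort only when being evaluated, which is what makes the sporadic-insertion scheme effective.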
Crowdsourced crisis response harnesses distributed networks of humans in combination with information and communication technology (ICT) to create scalable, flexible, and rapid communication systems that promote well-being, survival, and recovery during the acute phase of an emergency. In this paper, we analyze a recent experience in which CrowdFlower …