Effective Crowd Annotation for Relation Extraction
TLDR
We show that crowdsourced annotation of training data boosts performance for relation extraction over methods based solely on distant supervision.
  • Citations: 51 · Influence: 9
Crowdsourcing Multi-Label Classification for Taxonomy Creation
TLDR
This paper presents DELUGE, an improved workflow that produces taxonomies with comparable quality using significantly less crowd labor.
  • Citations: 123 · Influence: 7
MicroTalk: Using Argumentation to Improve Crowdsourcing Accuracy
TLDR
We present a new quality-control workflow, called MicroTalk, that requires some workers to Justify their reasoning and asks others to Reconsider their decisions after reading counter-arguments from workers with opposing views.
  • Citations: 46 · Influence: 6
Optimal Testing for Crowd Workers
TLDR
We formulate the problem of balancing between (1) testing workers to determine their accuracy and (2) actually getting work performed as a partially observable Markov decision process (POMDP) and apply reinforcement learning to dynamically calculate the best policy.
  • Citations: 32 · Influence: 3
Parallel Task Routing for Crowdsourcing
TLDR
This paper defines a space of task routing problems, proves that even the simplest is NP-hard, and develops several approximation algorithms for parallel routing problems.
  • Citations: 30 · Influence: 2
Artificial Intelligence and Collective Intelligence
TLDR
The vision of collective intelligence is often manifested through an autonomous software module (agent) operating in a complex and uncertain environment.
  • Citations: 14 · Influence: 1
Subcontracting Microwork
TLDR
We argue that crowdwork platforms can improve their value proposition for all stakeholders by supporting subcontracting within microtasks, i.e., outsourcing one or more aspects of a microtask to additional workers.
  • Citations: 16
Learning on the Job: Optimal Instruction for Crowdsourcing
A large body of crowdsourcing research focuses on using techniques from artificial intelligence to improve estimates of latent answers to questions, assuming fixed (latent) worker quality. Recently, …
  • Citations: 7
Cicero: Multi-Turn, Contextual Argumentation for Accurate Crowdsourcing
TLDR
We present Cicero, a new workflow that improves crowd accuracy on difficult tasks by engaging workers in multi-turn, contextual discussions through real-time, synchronous argumentation.
  • Citations: 13
Sprout: Crowd-Powered Task Design for Crowdsourcing
TLDR
We propose a novel meta-workflow that helps requesters optimize crowdsourcing task designs, and Sprout, our open-source tool, which implements this workflow.
  • Citations: 9