Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets

@article{Chandler2012BreakingMW,
  title={Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets},
  author={Dana Chandler and Adam Kapelner},
  journal={arXiv preprint arXiv:1210.0962},
  year={2012}
}
We conduct the first natural field experiment to explore the relationship between the "meaningfulness" of a task and worker effort. We employed about 2,500 workers from Amazon's Mechanical Turk (MTurk), an online labor market, to label medical images. Although all workers were given an identical task, we experimentally manipulated how the task was framed: subjects in the meaningful treatment were told that they were labeling tumor cells in order to assist medical researchers; subjects in the zero-context…
Citations

Curiosity Killed the Cat, but Makes Crowdwork Better
Improving learning through achievement priming in crowdsourced information finding microtasks
Using targeted design interventions to encourage extra‐role crowdsourcing behavior
Working on Low-Paid Micro-Task Crowdsourcing Platforms: An Existence, Relatedness and Growth View
Labor Allocation in Paid Crowdsourcing: Experimental Evidence on Positioning, Nudges and Prices
Why Individuals Participate in Micro-task Crowdsourcing Work Environment: Revealing Crowdworkers' Perceptions
Context Disclosure as a Source of Player Motivation in Human Computation Games (2019)
