Corpus ID: 53046825

Mechanical Turk and Financial Dependency on Crowdsourcing

@inproceedings{Bogert2018MechanicalTA,
  title={Mechanical Turk and Financial Dependency on Crowdsourcing},
  author={Eric Bogert},
  booktitle={AMCIS},
  year={2018}
}
Some workers on Amazon Mechanical Turk (AMT) depend on income from AMT to pay for their basic needs. This dependency could create two countervailing forces. Workers have an incentive to work quickly to maximize their income; however, if they work too quickly and produce poor-quality results, they lose access to the highest-paying tasks on the platform. This tension is a feature of work that is paid per task rather than per hour. This paper investigates whether workers who are financially dependent on…


References

Showing 1-10 of 15 references
A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk
The characteristics of tasks and working patterns that yield higher hourly wages are explored, and the findings inform platform design and worker tools that could create a more positive future for crowd work.
Who are the Turkers? Worker Demographics in Amazon Mechanical Turk
Amazon Mechanical Turk (MTurk) is a crowdsourcing system in which tasks are distributed to a population of thousands of anonymous workers for completion. This system is becoming increasingly popular…
Running Experiments on Amazon Mechanical Turk
Although Mechanical Turk has recently become popular among social scientists as a source of experimental data, doubts may linger about the quality of data provided by subjects recruited…
Cost-Effective Quality Assurance in Crowd Labeling
This paper considers labeling tasks, develops a comprehensive scheme for managing the quality of crowd labeling, and introduces two novel metrics that can be used to objectively rank the performance of crowdsourced workers.
Demographics and Dynamics of Mechanical Turk Workers
An analysis of the population dynamics and demographics of Amazon Mechanical Turk workers, based on survey results with more than 85K responses from 40K unique participants, indicates that there are more than 100K workers available on Amazon's crowdsourcing platform, that worker participation follows a heavy-tailed distribution, and that at any given time there are over 2K active workers.
Lessons Learned from Crowdsourcing Complex Engineering Tasks
The limits of crowdsourcing were explored by using Mechanical Turk for a more complicated task, the analysis and creation of wind simulations, comparing the skill of anonymous crowd workers from Amazon Mechanical Turk with that of civil engineering graduate students.
Data collection in a flat world: the strengths and weaknesses of mechanical turk samples
MTurk offers a highly valuable opportunity for data collection; it is recommended that researchers using MTurk include screening questions that gauge attention and language comprehension, avoid questions with factual answers, and consider how individual differences in financial and social domains may influence results.
Amazon's Mechanical Turk
Findings indicate that MTurk can be used to obtain high-quality data inexpensively and rapidly, and that the data obtained are at least as reliable as those obtained via traditional methods.
Analyzing the Amazon Mechanical Turk marketplace
An associate professor at New York University's Stern School of Business uncovers answers about who the employers in paid crowdsourcing are, what tasks they post, and how much they pay.
Separating the Shirkers from the Workers? Making Sure Respondents Pay Attention on Self‐Administered Surveys
Good survey and experimental research requires subjects to pay attention to questions and treatments, but many subjects do not. In this article, we discuss “Screeners” as a potential solution to this…