Anatomy of a Crowdsourcing Platform - Using the Example of Microworkers.com

@article{Hirth2011AnatomyOA,
  title={Anatomy of a Crowdsourcing Platform - Using the Example of Microworkers.com},
  author={Matthias Hirth and Tobias Hossfeld and Phuoc Tran-Gia},
  journal={2011 Fifth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing},
  year={2011},
  pages={322-329}
}
  • Matthias Hirth, T. Hossfeld, P. Tran-Gia
  • Published 30 June 2011
  • Computer Science
  • 2011 Fifth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing
Since Jeff Howe first introduced the term "crowdsourcing" in 2006, crowdsourcing has become a growing market on the Internet. Thousands of workers categorize images, write articles, or perform other small tasks on platforms like Amazon Mechanical Turk (MTurk), Microworkers, or ShortTask. In this work, we want to give an inside view of the usage data from Microworkers and show that there are significant differences to the well-studied MTurk. Further, we have a look at…
A Study on the Evolution of Crowdsourcing Websites
TLDR
A study comparing how crowdsourcing platforms have evolved over two time periods shows that several website characteristics are strong indicators of a platform's attempts to attract and optimize the attention of potential workers or requesters.
Modeling of crowdsourcing platforms and granularity of work organization in Future Internet
TLDR
A deterministic fluid model, an extension of the SIR model of epidemics, is developed in order to investigate the platform dynamics of a crowdsourcing platform, using Microworkers.com as an example; a minimal sketch of the underlying SIR dynamics follows.
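The cited paper's exact extension is not reproduced on this page; as context only, here is a minimal LaTeX sketch of the classic SIR dynamics it builds on. The symbols S, I, R, β, and γ are the standard epidemic quantities (susceptible, infected, recovered, infection rate, recovery rate); any mapping of these compartments to platform users or tasks is the cited paper's contribution and is not shown here.

% Classic SIR fluid model that the cited extension builds on (standard
% textbook form, not the paper's extended system).
% S: susceptible, I: infected, R: recovered;
% beta: infection rate, gamma: recovery rate.
\begin{align}
  \frac{dS}{dt} &= -\beta S I \\
  \frac{dI}{dt} &= \beta S I - \gamma I \\
  \frac{dR}{dt} &= \gamma I
\end{align}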
An Empirical Survey on Crowdsourcing-Based Data Management Techniques
TLDR
This research identifies three primary considerations for improving data management in crowdsourcing: recognizing images, sentiment analysis, and enhancing human intellectual capability.
Quality factors of crowdsourcing system: Paper review
TLDR
This paper explores and analyzes how a task, as shaped by the three main components of crowdsourcing (Job Provider, Crowd Workers, and Platform), affects the quality of products and services delivered through the Internet.
Impact of Task Recommendation Systems in Crowdsourcing Platforms
TLDR
It is shown that even simple recommendation systems lead to improvements for most platform users; however, the results also indicate, and should raise awareness, that a small fraction of users is negatively affected by those systems.
Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments
This report documents the program and the outcomes of Dagstuhl Seminar 15481 “Evaluation in the Crowd: Crowdsourcing and Human-Centred Experiments”. Human-centred empirical evaluations play important…
The four pillars of crowdsourcing: A reference model
TLDR
A taxonomy is presented that represents the different configurations of crowdsourcing in its four main pillars: the crowdsourcer, the crowd, the crowdsourced task, and the crowdsourcing platform.
Crowdsourcing Technology to Support Academic Research
TLDR
In this chapter, the possibilities for practical improvement of academic crowdsourced studies through the adoption of technological solutions are discussed.
Task Recommendation in Crowdsourcing Platforms
TLDR
The requirements for task recommendation in task distribution platforms are gathered with a focus on the worker's perspective, the design of appropriate assignment strategies is described, and innovative methods to recommend tasks based on their textual descriptions are provided; a generic sketch of such text-based matching follows.
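The cited paper's actual recommendation methods are not reproduced on this page. As an illustration only, the Python sketch below shows one generic way to match open tasks to a worker via textual similarity (TF-IDF vectors and cosine similarity); the function names and example data are hypothetical and not taken from the paper.

# Illustrative sketch only: a generic text-based matcher, not the
# cited paper's method. Requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def recommend_tasks(worker_history, open_tasks, top_k=3):
    """Rank open task descriptions by textual similarity to the
    descriptions of tasks the worker completed before."""
    vectorizer = TfidfVectorizer(stop_words="english")
    # Fit on all texts so both sides share one vocabulary.
    matrix = vectorizer.fit_transform(worker_history + open_tasks)
    history_vecs = matrix[: len(worker_history)]
    task_vecs = matrix[len(worker_history):]
    # Score each open task by its best match against the history.
    scores = cosine_similarity(task_vecs, history_vecs).max(axis=1)
    ranked = sorted(zip(scores, open_tasks), key=lambda p: -p[0])
    return [task for _, task in ranked[:top_k]]

history = ["categorize product images",
           "write a short article about travel destinations"]
tasks = ["tag photos of animals", "transcribe an audio clip",
         "write a blog post on holiday destinations"]
print(recommend_tasks(history, tasks))

Scoring each open task by its best single match against the worker's history favors specialization over averaged interests; the cited work may weight worker preferences differently.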
...

References

Who are the crowdworkers?: shifting demographics in Mechanical Turk
TLDR
How the worker population has changed over time is described, shifting from a primarily moderate-income, U.S.-based workforce towards an increasingly international group with a significant population of young, well-educated Indian workers.
Who are the Turkers? Worker Demographics in Amazon Mechanical Turk
Amazon Mechanical Turk (MTurk) is a crowdsourcing system in which tasks are distributed to a population of thousands of anonymous workers for completion. This system is becoming increasingly popular…
Cost-Optimal Validation Mechanisms and Cheat-Detection for Crowdsourcing Platforms
TLDR
Two crowd-based approaches to validating submitted work are presented, and their detection quality, costs, and applicability to different types of typical crowdsourcing tasks are evaluated.
Analyzing the Amazon Mechanical Turk marketplace
An associate professor at New York University's Stern School of Business uncovers answers about who the employers in paid crowdsourcing are, what tasks they post, and how much they pay.
WIKIPEDIA
The decentralized participatory architecture of the Internet challenges traditional knowledge authorities and hierarchies. Questions arise about whether lay inclusion helps to ‘democratize’ knowledge…
Definition of Lead
  • en.wikipedia.org/wiki/Pay_per_lead [Accessed: Jan. 14, 2010].
  • 2010
Mechanical Turk: The Demographics
  • [Online]. Available: http://behind-the-enemy-lines.blogspot.com/2008/
  • Mar. 2008
PayPal
  • www.paypal.com [Accessed: Mar. 08, 2011].
  • 2011
OpenStreetMap
Microworkers, " www.microworkers.com [Accessed: Mar
  • Microworkers, " www.microworkers.com [Accessed: Mar
  • 2011
...