Crowd Coach

@article{Chiang2018CrowdC,
  title={Crowd Coach},
  author={Chun-Wei Chiang and Anna Kasunic and Saiph Savage},
  journal={Proceedings of the ACM on Human-Computer Interaction},
  year={2018},
  volume={2},
  pages={1--17}
}
Traditional employment usually provides mechanisms for workers to improve their skills and access better opportunities. To further facilitate crowd workers' skill growth, we present Crowd Coach, a system that enables workers to receive peer coaching while on the job. We conduct a field experiment and real-world deployment to study Crowd Coach in the wild. Hundreds of workers used Crowd Coach in a variety of tasks, including writing, doing surveys, and labeling images. We find that Crowd Coach…

Citations

Crowd-Worker Skill Improvement with AI Co-Learners
TLDR
A workflow in which AI serves as a co-learner, so that workers need neither pre-trained AI models nor training data for them; experimental results show that the self-correction framework with AI as a co-learner is as effective in improving skills as self-correction with a pre-trained AI.
Turker Tales: Integrating Tangential Play into Crowd Work
TLDR
Turker Tales, a Google Chrome extension that uses tangential play to encourage crowd workers to write, share, and view short tales as a side activity to their main job on Amazon Mechanical Turk, is presented.
TurkScanner: Predicting the Hourly Wage of Microtasks
TLDR
This study explores various computational methods for predicting the working times (and thus hourly wages) required for tasks, based on data collected from other workers completing crowd work, and examines the challenge of accurately recording working times both automatically and by asking workers.
Predicting the Working Time of Microtasks Based on Workers' Perception of Prediction Errors
TLDR
A computational technique for predicting microtask working times based on workers' past experiences with similar tasks is described, along with the challenges of defining evaluation and objective functions based on workers' tolerance for prediction errors.
Becoming the Super Turker: Increasing Wages via a Strategy from High Earning Workers
TLDR
This paper explores how novice workers can improve their earnings by following the transparency criteria of Super Turkers, i.e., crowd workers who earn higher salaries on Amazon Mechanical Turk (MTurk). Novices who applied these criteria earned better wages than other novices, highlighting that tool development to support crowd workers should be paired with educational opportunities that teach workers how to use such tools effectively.
"I Hope This Is Helpful"
TLDR
This paper discusses findings from a thematic analysis of 1,064 comments left by Amazon Mechanical Turk workers who used this task design to create captions for images taken by people who are blind.
The Influences of Task Design on Crowdsourced Judgement: A Case Study of Recidivism Risk Evaluation
Crowdsourcing is widely used to solicit judgement from people in diverse applications, ranging from evaluating information quality to rating gig worker performance. To encourage the crowd to put in…
Making AI Machines Work for Humans in FoW
TLDR
Bringing humans back to the frontier of the future of work (FoW) will increase their trust in AI systems, shift their perception of such systems toward a source of self-improvement, ensure better work performance, and positively shape the social and economic outcomes of a society and a nation.
The Tools of Management
TLDR
The historical development of workplace technology design methods in CSCW is analyzed to show how mid-20th century labor responses to scientific management can inform directions in contemporary digital labor advocacy.
…

References

Showing 1-10 of 100 references
Scopist: Building a Skill Ladder into Crowd Transcription
TLDR
Scopist is created, a JavaScript application for learning an efficient text-entry method known as stenotype while doing audio transcription tasks, and demonstrates a new way for workers on crowd platforms to align their work and skill development with the accessibility domain while they work.
Toward a Learning Science for Complex Crowdsourcing Tasks
TLDR
This work explores how crowdworkers can be trained to tackle complex crowdsourcing tasks, and shows that having workers validate the work of their peers can be even more effective than having them review expert examples, provided only solutions above a threshold length are presented.
Combining crowdsourcing and learning to improve engagement and performance
TLDR
A platform that combines learning and crowdsourcing to benefit both the workers and the requesters is described, which found that by using the system workers gained new skills and produced high-quality edits for requested images, even if they had little prior experience editing images.
Collaboratively crowdsourcing workflows with Turkomatic
TLDR
It is argued that Turkomatic's collaborative approach can be more successful than the conventional workflow design process and implications for the design of collaborative crowd planning systems are discussed.
Is Crowdsourcing a Source of Worker Empowerment or Exploitation? Understanding Crowd Workers' Perceptions of Crowdsourcing Career
TLDR
Assessing the degree to which these platforms afford or constrain workers' exercise of personal agency will partially determine whether these new forms of work are a harbinger of worker empowerment or exploitation.
Reviewing versus doing: learning and performance in crowd assessment
TLDR
This work explores whether and how workers learn and improve their performance in a task domain by serving as peer reviewers, and investigates whether peer reviewing may be more effective in teams where the reviewers can reach consensus through discussion.
A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk
TLDR
The characteristics of tasks and working patterns that yield higher hourly wages are explored, informing platform design and worker tools that can create a more positive future for crowd work.
Paid Crowdsourcing as a Vehicle for Global Development
By connecting remote workers to a global marketplace, paid crowdsourcing has the potential to improve earnings and livelihoods in poor communities around the world. However, there is a long way to go…
Atelier: Repurposing Expert Crowdsourcing Tasks as Micro-internships
TLDR
Atelier, a micro-internship platform that connects crowd interns with crowd mentors, guides mentor-intern pairs to break down expert crowdsourcing tasks into milestones, review intermediate output, and problem-solve together, finding that Atelier helped interns maintain forward progress and absorb best practices.
The future of crowd work
TLDR
This paper outlines a framework that will enable crowd work that is complex, collaborative, and sustainable, and lays out research challenges in twelve major areas: workflow, task assignment, hierarchy, real-time response, synchronous collaboration, quality control, crowds guiding AIs, AIs guiding crowds, platforms, job design, reputation, and motivation.
…