An Active Learning Approach for Jointly Estimating Worker Performance and Annotation Reliability with Crowdsourced Data

@article{Zhao2014AnAL,
  title={An Active Learning Approach for Jointly Estimating Worker Performance and Annotation Reliability with Crowdsourced Data},
  author={Liyue Zhao and Yu Lin Zhang and Gita Reese Sukthankar},
  journal={ArXiv},
  year={2014},
  volume={abs/1401.3836}
}
Crowdsourcing platforms offer a practical solution to the problem of affordably annotating large datasets for training supervised classifiers. Unfortunately, poor worker performance frequently threatens to compromise annotation reliability, and requesting multiple labels for every instance can lead to large cost increases without guaranteeing good results. Minimizing the required training samples using an active learning selection procedure reduces the labeling requirement but can jeopardize…
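
To make the two ideas in the abstract concrete, here is a minimal, self-contained Python sketch, not the authors' algorithm: it combines uncertainty-based active learning (query the instance the current classifier is least sure about) with an EM-style, reliability-weighted consensus over simulated crowd labels, loosely in the spirit of Dawid-Skene estimation. The toy data, the simulated workers, and all names (`worker_acc`, `estimate_reliability`, and so on) are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)


def estimate_reliability(labels, n_workers, n_iters=20):
    """EM-style loop: alternate a reliability-weighted majority vote over each
    instance's crowd labels with re-scoring each worker by agreement with the
    resulting consensus. `labels` maps instance index -> {worker: 0/1 label}."""
    reliability = np.full(n_workers, 0.8)  # optimistic prior on worker accuracy
    consensus = {}
    for _ in range(n_iters):
        for i, votes in labels.items():
            w1 = sum(reliability[w] for w, y in votes.items() if y == 1)
            w0 = sum(reliability[w] for w, y in votes.items() if y == 0)
            consensus[i] = int(w1 >= w0)
        hits = np.ones(n_workers)          # Laplace smoothing keeps estimated
        total = np.full(n_workers, 2.0)    # reliability strictly inside (0, 1)
        for i, votes in labels.items():
            for w, y in votes.items():
                total[w] += 1
                hits[w] += (y == consensus[i])
        reliability = hits / total
    return consensus, reliability


# Toy pool: 2-D points whose true label is the sign of the first coordinate;
# three simulated workers with different hidden accuracies supply noisy labels.
X = rng.normal(size=(300, 2))
true_y = (X[:, 0] > 0).astype(int)
worker_acc = [0.95, 0.80, 0.55]  # hidden; the loop below must estimate these

labels, queried, unlabeled = {}, [], list(range(len(X)))
clf, fitted = LogisticRegression(), False

for step in range(40):
    if not fitted:                         # random seed queries until trainable
        i = unlabeled[int(rng.integers(len(unlabeled)))]
    else:                                  # uncertainty sampling: p closest to 0.5
        proba = clf.predict_proba(X[unlabeled])[:, 1]
        i = unlabeled[int(np.argmin(np.abs(proba - 0.5)))]
    unlabeled.remove(i)
    queried.append(i)
    # Each worker answers correctly with probability equal to their accuracy.
    labels[i] = {w: int(true_y[i]) if rng.random() < acc else 1 - int(true_y[i])
                 for w, acc in enumerate(worker_acc)}
    consensus, reliability = estimate_reliability(labels, len(worker_acc))
    y_train = [consensus[j] for j in queried]
    if len(set(y_train)) > 1:              # LogisticRegression needs two classes
        clf.fit(X[queried], y_train)
        fitted = True

print("estimated worker reliability:", np.round(reliability, 2))
print("pool accuracy of the trained classifier:", (clf.predict(X) == true_y).mean())
```

On this toy pool the reliability estimates typically recover the ordering of the hidden worker accuracies, which is what lets the weighted consensus down-weight the noisiest annotator instead of requesting ever more redundant labels.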
