Prototype Tasks: Improving Crowdsourcing Results through Rapid, Iterative Task Design

Bibliographic Details
Published in: arXiv.org 2017-07
Main Authors: Gaikwad, Snehalkumar "Neil" S, Chhibber, Nalin, Sehgal, Vibhor, Ballav, Alipta, Mullings, Catherine, Ahmed, Nasser, Richmond-Fuller, Angela, Gilbee, Aaron, Gamage, Dilrukshi, Whiting, Mark, Zhou, Sharon, Matin, Sekandar, Niranga, Senadhipathige, Goyal, Shirish, Majeti, Dinesh, Srinivas, Preethi, Ginzberg, Adam, Mananova, Kamila, Ziulkoski, Karolina, Regino, Jeff, Sarma, Tejas, Sinha, Akshansh, Paul, Abhratanu, Diemert, Christopher, Murag, Mahesh, Dai, William, Pandey, Manoj, Vaish, Rajan, Bernstein, Michael
Format: Article
Language: English
Description
Summary: Low-quality results have been a long-standing problem on microtask crowdsourcing platforms, driving away requesters and justifying low wages for workers. To date, workers have been blamed for low-quality results: they are said to make as little effort as possible, pay little attention to detail, and lack expertise. In this paper, we hypothesize that requesters may also be responsible for low-quality work: they launch unclear task designs that confuse even earnest workers, under-specify edge cases, and neglect to include examples. We introduce prototype tasks, a crowdsourcing strategy requiring all new task designs to launch a small number of sample tasks. Workers attempt these tasks and leave feedback, enabling the requester to iterate on the design before publishing it. We report a field experiment in which tasks that underwent prototype task iteration produced higher-quality work results than the original task designs. With this research, we suggest that a simple and rapid iteration cycle can improve crowd work, and we provide empirical evidence that requester "quality" directly impacts result quality.
ISSN:2331-8422