Frustration and ennui among Amazon MTurk workers

Craig Fowler, Jian Jiao, Margaret Pitts

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

Academics are increasingly turning to crowdsourcing platforms to recruit research participants. Their endeavors have benefited from a proliferation of studies attesting to the quality of crowdsourced data or offering guidance on managing specific challenges associated with doing crowdsourced research. Thus far, however, relatively little is known about what it is like to be a participant in crowdsourced research. Our analysis of almost 1400 free-text responses provides insight into the frustrations encountered by workers on one widely used crowdsourcing site: Amazon’s MTurk. Some of these frustrations stem from inherent limitations of the MTurk platform and cannot easily be addressed by researchers. Many others, however, concern factors that are directly controllable by researchers and that may also be relevant for researchers using other crowdsourcing platforms such as Prolific or CrowdFlower. Based on participants’ accounts of their experiences as crowdsource workers, we offer recommendations researchers might consider as they seek to design online studies that demonstrate consideration for respondents and respect for their time, effort, and dignity.

Original language: English (US)
Pages (from-to): 3009-3025
Number of pages: 17
Journal: Behavior Research Methods
Volume: 55
Issue number: 6
State: Published - Sep 2023

Keywords

  • Crowdsourcing
  • Digital methods
  • Ethics
  • Internet
  • Job satisfaction
  • Online research
  • Participants

ASJC Scopus subject areas

  • Experimental and Cognitive Psychology
  • Developmental and Educational Psychology
  • Arts and Humanities (miscellaneous)
  • Psychology (miscellaneous)
  • General Psychology
