Investigating the Amazon Mechanical Turk Market Through Tool Design
Abstract
We developed TurkBench to better understand the work of crowdworkers on the Amazon Mechanical Turk (AMT) marketplace. While our aim was to reduce the amount of invisible, unpaid work that crowdworkers perform, the tool also served as a probe into their day-to-day practices. Through this probe we encountered a number of previously unreported difficulties that are representative of what crowdworkers face both in building their own tools and in working on AMT. In this article, we contribute insights into 1) a number of breakdowns that are occurring on AMT and 2) how the AMT platform is being appropriated in ways that mitigate some breakdowns while exacerbating others. The breakdowns we discuss in this paper are the increasing velocity of the market (good HITs are grabbed within seconds), the high degree of flexibility that requesters can and do exercise in specifying their HITs, and the difficulty crowdworkers have in navigating the market due to the large variation in how requesters construct HITs. When the velocity of the market is combined with a poor search interface, wide variation in how HITs are constructed, and little infrastructural support for workers, the resulting work environment is frustrating and difficult to thrive in.
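To make the abstract's point about market velocity and self-built tooling concrete, below is a minimal Python sketch of the kind of worker-side "HIT catcher" loop the article alludes to: a script that polls a HIT listing and flags well-paying tasks before they disappear. The search URL, CSS selectors, reward threshold, and polling interval are all illustrative assumptions, not TurkBench's implementation or AMT's actual markup.

```python
# Hypothetical sketch of a worker-side HIT-catching loop.
# The endpoint, selectors, and thresholds are assumptions for illustration.
import time
import requests
from bs4 import BeautifulSoup

SEARCH_URL = "https://worker.mturk.com/?filters%5Bsearch_term%5D=survey"  # assumed query URL
MIN_REWARD = 0.50   # dollars; only surface HITs deemed worth catching
POLL_SECONDS = 5    # tight polling loops like this drive the market's velocity

def fetch_hits():
    """Scrape the listing page and return (title, reward) pairs."""
    page = requests.get(SEARCH_URL, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    hits = []
    for row in soup.select(".hit-set-row"):        # assumed CSS class
        title = row.select_one(".title")           # assumed CSS class
        reward = row.select_one(".reward")         # assumed CSS class
        if title and reward:
            hits.append((title.get_text(strip=True),
                         float(reward.get_text(strip=True).lstrip("$"))))
    return hits

seen = set()
while True:
    for title, reward in fetch_hits():
        if reward >= MIN_REWARD and title not in seen:
            seen.add(title)
            print(f"New HIT worth catching: {title} (${reward:.2f})")
    time.sleep(POLL_SECONDS)  # good HITs vanish within seconds, so workers poll aggressively
```

A loop like this illustrates both sides of the appropriation the article describes: it mitigates the poor search interface for the worker running it, while the aggregate effect of many such scripts makes good HITs disappear even faster for everyone else.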
Number of citations to item: 7
- Rachel N. Simons, Danna Gurari, Kenneth R. Fleischmann (2020): "I Hope This Is Helpful", In: Proceedings of the ACM on Human-Computer Interaction 4(CSCW2), doi:10.1145/3415176
- Benjamin V. Hanrahan, Anita Chen, JiaHua Ma, Ning F. Ma, Anna Squicciarini, Saiph Savage (2021): The Expertise Involved in Deciding which HITs are Worth Doing on Amazon Mechanical Turk, In: Proceedings of the ACM on Human-Computer Interaction 5(CSCW1), doi:10.1145/3449202
- Haoyu Xie, Alessandro Checco, Efpraxia D. Zamani (2023): The Unintended Consequences of Automated Scripts in Crowdwork Platforms: A Simulation Study in MTurk, In: Information Systems Frontiers 26(1), doi:10.1007/s10796-023-10373-x
- Lars Osterbrink, Paul Alpar (2021): Silence of crowdworkers—reasons and implications for work conditions and quality, In: International Studies of Management & Organization 51(2), doi:10.1080/00208825.2021.1927311
- Anita Chen, Chien-Wen Yuan, Ning F. Ma, Chi-Yang Hsu, Benjamin V. Hanrahan (2019): Navigating Ride-Sharing Regulations, In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, doi:10.1145/3290605.3300366
- Annabel Rothschild, Justin Booker, Christa Davoll, Jessica Hill, Venise Ivey, Carl DiSalvo, Ben Rydal Shapiro, Betsy DiSalvo (2022): Towards fair and pro-social employment of digital pieceworkers for sourcing machine learning training data, In: CHI Conference on Human Factors in Computing Systems Extended Abstracts, doi:10.1145/3491101.3516384
- Uma Rani, Rishabh Kumar Dhir, Nora Gobel (2023): Work on Online Labour Platforms: Does Formal Education Matter?, In: Dynamics of Virtual Work, doi:10.1007/978-3-031-11462-5_3