AuDi: an Auto-Feedback Display for Crowdsourcing
Abstract
While feedback from experts or peers is found to have positive effects on crowdsourcing work, it is costly, as providing it requires additional people or time. This paper explores an automatic feedback display for crowdsourcing called AuDi. AuDi shows the worker’s accuracy rate, automatically calculated by an accuracy algorithm, by changing the background color of the task page. We conducted an experimental field study with AuDi, employing both quantitative and qualitative methods for data collection and analysis. Our study shows that, without introducing additional cost, such an auto-feedback display is well received by our participants, gives them assurance and more confidence, and also contributes positively to work performance by pushing them to study more and better understand the task requirements.
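The abstract does not specify the accuracy algorithm or the color mapping AuDi uses. As a purely illustrative sketch, one simple realization would be to score a worker against gold-standard answers and map the resulting accuracy onto a red-to-green gradient for the page background; the function names and the linear color scheme below are assumptions, not the authors' design.

```python
def accuracy_rate(correct: int, total: int) -> float:
    """Fraction of gold-standard answers the worker got right.

    A hypothetical stand-in for AuDi's accuracy algorithm, which the
    abstract does not detail.
    """
    return correct / total if total else 0.0


def accuracy_to_color(accuracy: float) -> str:
    """Map accuracy in [0, 1] to a hex background color on an assumed
    linear red-to-green gradient (red = low accuracy, green = high)."""
    a = min(max(accuracy, 0.0), 1.0)
    red = int(255 * (1 - a))
    green = int(255 * a)
    return f"#{red:02x}{green:02x}00"


# Example: a worker with 9/10 correct gets a mostly green page.
print(accuracy_to_color(accuracy_rate(9, 10)))  # → #19e500
```

A continuous gradient like this gives ambient, glanceable feedback without interrupting the task, which is consistent with the "display" framing in the abstract, though the actual mapping AuDi uses may differ.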