AuDi: an Auto-Feedback Display for Crowdsourcing

dc.contributor.author: Tang, Xinru
dc.contributor.author: Zhao, Dongyang
dc.contributor.author: Zhang, Ying
dc.contributor.author: Ding, Xianghua
dc.date.accessioned: 2019-04-23T20:35:28Z
dc.date.available: 2019-04-23T20:35:28Z
dc.date.issued: 2019
dc.description.abstract: While feedback from experts or peers is found to have positive effects on crowdsourcing work, it is costly, as additional people or time are needed to provide it. This paper explores an automatic feedback display called AuDi for crowdsourcing. AuDi shows the worker’s accuracy rate, automatically calculated with an accuracy algorithm, by changing the background color of the task page. We conducted an experimental study with AuDi in the field and employed both quantitative and qualitative methods for data collection and analysis. Our study shows that, without introducing new cost, such an auto-feedback display is well received by our participants, gives them assurance and more confidence, and also positively contributes to work performance by pushing them to study more and better understand the task requirements.
dc.identifier.doi: 10.18420/ecscw2019_ep05
dc.identifier.pissn: 2510-2591
dc.language.iso: en
dc.publisher: European Society for Socially Embedded Technologies (EUSSET)
dc.relation.ispartof: Proceedings of 17th European Conference on Computer-Supported Cooperative Work
dc.relation.ispartofseries: Reports of the European Society for Socially Embedded Technologies: vol. 3, no. 1
dc.title: AuDi: an Auto-Feedback Display for Crowdsourcing
dc.type: Text/Conference Paper
gi.conference.date: 8 - 12 June 2019
gi.conference.location: Salzburg, Austria
mci.conference.review: full

Files

Original bundle
Name: ecscw2019_ep05.pdf
Size: 524.49 KB
Format: Adobe Portable Document Format