ABSTRACT
Beyond simple form filling, there is a growing need in microtask crowdsourcing for crowd workers to perform freeform interactions directly on content (e.g., proofreading articles or specifying object boundaries in images). Such microtasks are often organized into well-designed workflows to optimize task quality and workload distribution. However, designing and implementing the interfaces and workflows for such microtasks is challenging because it typically requires programming knowledge and tedious manual effort. We present ReTool, a web-based tool that enables requesters to design and publish interactive microtasks and workflows by demonstrating the microtasks on text and image content. We evaluated ReTool against a task-design tool from a popular crowdsourcing platform and showed its advantages over the existing approach.