DOI: 10.1145/3025453.3025969
Research Article · Honorable Mention

ReTool: Interactive Microtask and Workflow Design through Demonstration

Published: 2 May 2017

ABSTRACT

In addition to simple form filling, there is an increasing need for crowd workers to perform freeform interactions directly on content in microtask crowdsourcing (e.g., proofreading articles or specifying object boundaries in an image). Such microtasks are often organized within well-designed workflows to optimize task quality and workload distribution. However, designing and implementing the interface and workflow for such microtasks is challenging because it typically requires programming knowledge and tedious manual effort. We present ReTool, a web-based tool that lets requesters design and publish interactive microtasks and workflows by demonstrating the microtasks on text and image content. We evaluated ReTool against the task-design tool of a popular crowdsourcing platform and showed the advantages of ReTool over the existing approach.
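To illustrate the kind of programming effort the abstract refers to, the sketch below shows what a hand-coded freeform microtask interface might look like without a tool such as ReTool: a minimal browser script that lets a worker drag a bounding box over an image and submits the coordinates. This is a hypothetical illustration under stated assumptions (the canvas element ID, the image URL, and the submission endpoint and payload are all made up), not ReTool's implementation or any crowdsourcing platform's API.

```typescript
// Minimal hand-coded bounding-box microtask (hypothetical sketch, not ReTool).
// Assumes the task page contains <canvas id="task-canvas"></canvas>.

interface Box { x: number; y: number; w: number; h: number; }

const canvas = document.getElementById("task-canvas") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;
const image = new Image();
image.src = "https://example.com/task-image.jpg"; // placeholder task content

let start: { x: number; y: number } | null = null;
let box: Box | null = null;

image.onload = () => {
  canvas.width = image.width;
  canvas.height = image.height;
  redraw();
};

function redraw(): void {
  // Repaint the image and the worker's current bounding box.
  ctx.drawImage(image, 0, 0);
  if (box) {
    ctx.strokeStyle = "red";
    ctx.lineWidth = 2;
    ctx.strokeRect(box.x, box.y, box.w, box.h);
  }
}

canvas.addEventListener("mousedown", (e) => {
  start = { x: e.offsetX, y: e.offsetY };
});

canvas.addEventListener("mousemove", (e) => {
  if (!start) return;
  box = { x: start.x, y: start.y, w: e.offsetX - start.x, h: e.offsetY - start.y };
  redraw();
});

canvas.addEventListener("mouseup", () => {
  start = null;
  if (box) {
    // Submit the worker's answer; the endpoint and payload format are assumptions.
    fetch("/submit-answer", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ taskId: "demo-task", boundingBox: box }),
    });
  }
});
```

Even this toy task requires custom event handling, drawing, and result serialization; chaining several such microtasks into a quality-control or iteration workflow multiplies that effort, which is the gap the demonstration-based approach described in the abstract targets.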


Supplemental Material

• pn4168-file3.mp4 (mp4, 64.8 MB)
• pn4168p.mp4 (mp4, 1.5 MB)


Published in

CHI '17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems
May 2017, 7138 pages
ISBN: 978-1-4503-4655-9
DOI: 10.1145/3025453
      Copyright © 2017 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

Acceptance Rates

CHI '17 Paper Acceptance Rate: 600 of 2,400 submissions, 25%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%
