DOI: 10.1145/2631775.2631819

A taxonomy of microtasks on the web

Published: 01 September 2014

Abstract

Nowadays, a substantial number of people are turning to crowdsourcing in order to solve tasks that require human intervention. Despite a considerable amount of research in the field of crowdsourcing, existing work falls short when it comes to classifying typically crowdsourced tasks. Understanding the dynamics of the tasks that are crowdsourced and the behaviour of workers plays a vital role in efficient task design. In this paper, we propose a two-level categorization scheme for tasks, based on an extensive study of 1000 workers on CrowdFlower. In addition, we present insights into certain aspects of crowd behaviour: the task affinity of workers, the effort exerted by workers to complete tasks of various types, and their satisfaction with the monetary incentives.

    Published In

    HT '14: Proceedings of the 25th ACM conference on Hypertext and social media
    September 2014
    346 pages
    ISBN: 9781450329545
    DOI: 10.1145/2631775
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 01 September 2014

    Author Tags

    1. affinity
    2. crowdsourcing
    3. effort
    4. incentive
    5. microtasks
    6. taxonomy

    Qualifiers

    • Short-paper

    Conference

    HT '14

    Acceptance Rates

    HT '14 Paper Acceptance Rate: 49 of 86 submissions (57%)
    Overall Acceptance Rate: 378 of 1,158 submissions (33%)

    Article Metrics

    • Downloads (last 12 months): 41
    • Downloads (last 6 weeks): 4
    Reflects downloads up to 19 Feb 2025

    Cited By

    • (2024) Gamification Techniques and Contribution Filtering in Crowdsourcing Micro-Task Applications. Journal on Interactive Systems, 15(1), 401-416. DOI: 10.5753/jis.2024.3727. Online publication date: 15-May-2024
    • (2024) Snapper: Accelerating Bounding Box Annotation in Object Detection Tasks with Find-and-Snap Tooling. Proceedings of the 29th International Conference on Intelligent User Interfaces, 471-488. DOI: 10.1145/3640543.3645162. Online publication date: 18-Mar-2024
    • (2024) DECI: The 2nd Tutorial on Designing Effective Conversational Interfaces. Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization, 5-8. DOI: 10.1145/3631700.3658531. Online publication date: 27-Jun-2024
    • (2024) "Are we all in the same boat?" Customizable and Evolving Avatars to Improve Worker Engagement and Foster a Sense of Community in Online Crowd Work. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-26. DOI: 10.1145/3613904.3642429. Online publication date: 11-May-2024
    • (2024) Explaining crowdworker behaviour through computational rationality. Behaviour & Information Technology, 44(3), 552-573. DOI: 10.1080/0144929X.2024.2329616. Online publication date: 24-Apr-2024
    • (2024) A Crowdsourcing Approach for Identifying Potential Stereotypes in the Collected Data. Social Computing and Social Media, 3-18. DOI: 10.1007/978-3-031-61281-7_1. Online publication date: 1-Jun-2024
    • (2023) A Model for Cognitive Personalization of Microtask Design. Sensors, 23(7), 3571. DOI: 10.3390/s23073571. Online publication date: 29-Mar-2023
    • (2023) Designing for Hybrid Intelligence: A Taxonomy and Survey of Crowd-Machine Interaction. Applied Sciences, 13(4), 2198. DOI: 10.3390/app13042198. Online publication date: 8-Feb-2023
    • (2023) What You Show is What You Get! Gestures for Microtask Crowdsourcing. Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, 255-258. DOI: 10.1145/3581754.3584175. Online publication date: 27-Mar-2023
    • (2023) DECI: A Tutorial on Designing Effective Conversational Interfaces. Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, 187-189. DOI: 10.1145/3581754.3584165. Online publication date: 27-Mar-2023