DOI: 10.1145/2661829.2661946
Research Article

Competitive Game Designs for Improving the Cost Effectiveness of Crowdsourcing

Published: 03 November 2014

Abstract

Crowd-based online work is leveraged in a variety of applications, such as semantic annotation of images, translation of texts from foreign languages, and labeling of training data for machine learning models. However, annotating large amounts of data through crowdsourcing can be slow and costly. To improve both the cost and time efficiency of crowdsourcing, we examine reward mechanisms alternative to the "Pay-per-HIT" scheme commonly used on platforms such as Amazon Mechanical Turk. To this end, we explore a wide range of monetary reward schemes inspired by the success of competitions, lotteries, and games of luck. Our large-scale experimental evaluation, with an overall budget of more than 1,000 USD and 2,700 hours of work spent by crowd workers, demonstrates that our alternative reward mechanisms are well accepted by online workers and lead to substantial performance boosts.
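
The reward schemes contrasted in the abstract differ mainly in how a fixed budget is distributed across workers. As a minimal sketch of that idea (a Python illustration with hypothetical figures, not the paper's actual parameters or code), the following snippet shows that a lottery scheme can match the expected per-HIT payout of the Pay-per-HIT scheme while concentrating the budget into a few large prizes:

```python
# A minimal illustration (not code from the paper): comparing the expected
# per-task payout of a fixed "Pay-per-HIT" scheme with a lottery scheme that
# pools the same budget into a few larger prizes. All figures below are
# hypothetical assumptions, not numbers reported by the authors.

BUDGET = 1000.0      # total budget in USD (the paper's overall budget exceeded 1,000 USD)
PAY_PER_HIT = 0.05   # hypothetical fixed payment per completed HIT
NUM_PRIZES = 10      # hypothetical number of lottery prizes
PRIZE = BUDGET / NUM_PRIZES

# Number of HITs the budget covers under the fixed per-task scheme.
total_hits = round(BUDGET / PAY_PER_HIT)  # 20,000 HITs at 0.05 USD each

# If each completed HIT earns one lottery ticket, the expected payout per HIT
# equals the fixed payment exactly; only the variance of the reward differs.
expected_per_hit = (NUM_PRIZES / total_hits) * PRIZE

print(f"Pay-per-HIT: {PAY_PER_HIT:.4f} USD per HIT (deterministic)")
print(f"Lottery:     {expected_per_hit:.4f} USD expected per HIT (high variance)")
```

The intuition behind such schemes is that, for risk-seeking workers, a small chance at a large prize can be more motivating than a deterministic micro-payment of equal expected value.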



        Published In

        CIKM '14: Proceedings of the 23rd ACM International Conference on Information and Knowledge Management
        November 2014
        2152 pages
        ISBN: 9781450325981
        DOI: 10.1145/2661829
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org


        Publisher

        Association for Computing Machinery, New York, NY, United States

        Publication History

        Published: 03 November 2014


        Author Tags

        1. competitions
        2. crowdsourcing
        3. lotteries
        4. reward schemes

        Qualifiers

        • Research-article


        Conference

        CIKM '14

        Acceptance Rates

        CIKM '14 Paper Acceptance Rate: 175 of 838 submissions, 21%
        Overall Acceptance Rate: 1,861 of 8,427 submissions, 22%


        Bibliometrics & Citations

        Article Metrics

        • Downloads (last 12 months): 24
        • Downloads (last 6 weeks): 2

        Reflects downloads up to 19 Feb 2025


        Cited By

        • (2024) "Are we all in the same boat?" Customizable and Evolving Avatars to Improve Worker Engagement and Foster a Sense of Community in Online Crowd Work. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-26. DOI: 10.1145/3613904.3642429. Online publication date: 11-May-2024.
        • (2022) Business Simulation Games in Higher Education: A Systematic Review of Empirical Research. Human Behavior and Emerging Technologies, 2022, 1-28. DOI: 10.1155/2022/1578791. Online publication date: 22-Nov-2022.
        • (2022) Knowledge Learning With Crowdsourcing: A Brief Review and Systematic Perspective. IEEE/CAA Journal of Automatica Sinica, 9(5), 749-762. DOI: 10.1109/JAS.2022.105434. Online publication date: May-2022.
        • (2022) A Team Crowdsourcing Method Combining Competition and Collaboration. 2022 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech), 1-6. DOI: 10.1109/DASC/PiCom/CBDCom/Cy55231.2022.9927987. Online publication date: 12-Sep-2022.
        • (2021) Generalized Lottery Trees: Budget-Balanced Incentive Tree Mechanisms for Crowdsourcing. IEEE Transactions on Mobile Computing, 20(7), 2379-2394. DOI: 10.1109/TMC.2020.2979459. Online publication date: 1-Jul-2021.
        • (2021) Engaging Drivers in Ride Hailing via Competition: A Case Study with Arena. 2021 22nd IEEE International Conference on Mobile Data Management (MDM), 19-28. DOI: 10.1109/MDM52706.2021.00016. Online publication date: Jun-2021.
        • (2020) SciBabel: a system for crowd-sourced validation of automatic translations of scientific texts. Genomics & Informatics, 18(2), e21. DOI: 10.5808/GI.2020.18.2.e21. Online publication date: 30-Jun-2020.
        • (2020) Integrating Gamification and Social Interaction into an AR-Based Gamified Point System. Multimodal Technologies and Interaction, 4(3), 51. DOI: 10.3390/mti4030051. Online publication date: 13-Aug-2020.
        • (2020) Blockchain-Enabled Federated Learning With Mechanism Design. IEEE Access, 8, 219744-219756. DOI: 10.1109/ACCESS.2020.3043037. Online publication date: 2020.
        • (2019) Beyond Monetary Incentives. ACM Transactions on Social Computing, 2(2), 1-31. DOI: 10.1145/3321700. Online publication date: 13-Jun-2019.
