DOI: 10.1145/3126673.3126683
Extended abstract

A Crowdsourcing Practices Framework for Science Funding Call Processes

Published: 23 August 2017

ABSTRACT

Public scientific research funding agencies (funding agencies) are charged with implementing government science policy and identifying research projects worthy of funding. They play an important role in creating value for society by funding research and informing research policy. In recent years, however, the work of funding agencies has been hampered by various challenges in call processes. This research proposes crowdsourcing as a potential solution for funding agencies. Information systems research has engaged with crowdsourcing and the open innovation phenomenon, and crowdsourcing has been used by both private organisations and governments to seek solutions to similar types of challenges. Despite this, no crowdsourcing frameworks have been adapted to address the types of challenges that funding agencies face in call processes. This research seeks to identify the challenges faced by funding agencies for the purpose of adapting a crowdsourcing practices framework to address them.


Published in

OpenSym '17: Proceedings of the 13th International Symposium on Open Collaboration Companion
August 2017
71 pages
ISBN: 9781450354172
DOI: 10.1145/3126673

Copyright © 2017 Owner/Author

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery
New York, NY, United States


      Qualifiers

      • extended-abstract
      • Research
      • Refereed limited

      Acceptance Rates

Overall Acceptance Rate: 108 of 195 submissions, 55%