
Estimating domain specificity for effective crowdsourcing of link prediction and schema mapping

Extended abstract · Published: 22 May 2016 · DOI: 10.1145/2908131.2908209

Abstract

Crowdsourcing has been widely adopted in research and practice over the last decade. In this work, we first investigate the extent to which crowd workers can substitute for expert judgments in link prediction and schema mapping, i.e., the creation of explicit links between resources on the Semantic Web at the instance and schema levels, respectively. This matters because human input is required to evaluate and improve automated approaches to these tasks. We present a novel method to assess the inherent specificity of a link prediction task, and measure the impact of task specificity on the quality of the results. We propose a Wikipedia-based mechanism to estimate specificity and show how concept familiarity influences the production of high-quality links. Our findings indicate that the effectiveness of crowdsourced link prediction improves when task specificity is estimated in advance.
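The abstract does not spell out the Wikipedia-based mechanism, but one plausible proxy is inverse popularity: concepts whose Wikipedia articles attract little traffic are likely more domain-specific, and therefore less familiar to generic crowd workers. The sketch below illustrates that idea using the public Wikimedia pageviews REST API. The endpoint is real, but the scoring function (estimate_specificity) and its log-scaled formula are illustrative assumptions, not the authors' method.

    import math
    import requests

    # Wikimedia pageviews REST API (public; a descriptive User-Agent is required).
    PAGEVIEWS = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/"
                 "per-article/en.wikipedia/all-access/all-agents/"
                 "{title}/monthly/{start}/{end}")

    def estimate_specificity(concept, start="2016010100", end="2016060100"):
        """Hypothetical specificity score in (0, 1]: rarely viewed concepts score high."""
        url = PAGEVIEWS.format(title=concept.replace(" ", "_"), start=start, end=end)
        resp = requests.get(url, headers={"User-Agent": "specificity-sketch/0.1"},
                            timeout=10)
        resp.raise_for_status()
        # Total article views over the requested window.
        views = sum(item["views"] for item in resp.json()["items"])
        # Log-scaled inverse popularity: close to 1.0 for near-zero traffic,
        # approaching 0 for heavily viewed (broadly familiar) concepts.
        return 1.0 / (1.0 + math.log1p(views))

    # A broadly familiar concept should score lower than a niche one.
    for concept in ["Dog", "Ontology alignment"]:
        print(concept, round(estimate_specificity(concept), 3))

In a crowdsourcing pipeline, tasks scoring above some threshold could then be routed to domain experts (nichesourcing) rather than to generic crowd workers.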




      Published In

WebSci '16: Proceedings of the 8th ACM Conference on Web Science
May 2016, 392 pages
ISBN: 9781450342087
DOI: 10.1145/2908131
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


      Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. crowd workers
      2. crowdsourcing
      3. experts
      4. link prediction
      5. nichesourcing
      6. schema mapping

      Qualifiers

• Extended abstract

      Conference

WebSci '16: ACM Web Science Conference
May 22-25, 2016
Hannover, Germany

      Acceptance Rates

WebSci '16 paper acceptance rate: 13 of 70 submissions, 19%
Overall acceptance rate: 245 of 933 submissions, 26%

