DOI: 10.1145/2911451.2914678
Short paper

The LExR Collection for Expertise Retrieval in Academia

Published: 07 July 2016

ABSTRACT

Expertise retrieval has been the subject of intense research over the past decade, particularly since benchmark test collections for expertise retrieval in enterprises became publicly available. By contrast, academic search has seen comparatively less research on expertise retrieval. In this paper, we describe the Lattes Expertise Retrieval (LExR) test collection for research on academic expertise retrieval. LExR has been designed to provide a large-scale benchmark for two complementary expertise retrieval tasks, namely, expert profiling and expert finding. Unlike currently available test collections, which fully support only one of these tasks, LExR provides graded relevance judgments produced by expert judges separately for each task. In addition, LExR is both cross-organization and cross-area, encompassing candidate experts from all areas of knowledge working in research institutions across Brazil. As a result, it constitutes a valuable resource for fostering new research directions on expertise retrieval in an academic setting.
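This page does not specify LExR's distribution format or grade scale. For illustration only, the sketch below assumes TREC-style graded qrels (one "topic_id 0 candidate_id grade" record per line, a common convention for graded judgments) and shows the kind of rank-based evaluation, here NDCG@k, that graded relevance judgments enable for either task. All file names and identifiers are hypothetical.

```python
import math
from collections import defaultdict

def load_qrels(path):
    """Parse hypothetical TREC-style graded qrels:
    one 'topic_id 0 candidate_id grade' record per line."""
    qrels = defaultdict(dict)
    with open(path) as f:
        for line in f:
            topic, _, candidate, grade = line.split()
            qrels[topic][candidate] = int(grade)
    return qrels

def dcg(grades):
    # Standard log2 discount: the item at rank 1 gets weight 1 (log2(1 + 1)).
    return sum(g / math.log2(i + 2) for i, g in enumerate(grades))

def ndcg_at_k(ranking, judgments, k=10):
    """NDCG@k for one topic. 'ranking' is an ordered list of candidate ids
    returned by a system; 'judgments' maps candidate id to its grade."""
    gains = [judgments.get(c, 0) for c in ranking[:k]]
    ideal = dcg(sorted(judgments.values(), reverse=True)[:k])
    return dcg(gains) / ideal if ideal > 0 else 0.0

# Hypothetical usage: score one topic of an expert-finding run.
# qrels = load_qrels("lexr-finding-qrels.txt")  # file name is illustrative
# print(ndcg_at_k(["researcher42", "researcher7"], qrels["topic01"]))
```

In an expert-finding setup the topics would be query topics and the candidates would be researchers; in expert profiling the roles are reversed, which is what makes the two tasks complementary and why LExR carries a separate set of judgments for each.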


Published in

SIGIR '16: Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval
July 2016, 1296 pages
ISBN: 9781450340694
DOI: 10.1145/2911451
Copyright © 2016 ACM


Publisher

Association for Computing Machinery, New York, NY, United States



Acceptance Rates

SIGIR '16 paper acceptance rate: 62 of 341 submissions, 18%
Overall acceptance rate: 792 of 3,983 submissions, 20%
