DOI: 10.1145/1835449.1835603
Poster

Comparing click-through data to purchase decisions for retrieval evaluation

Published: 19 July 2010

Abstract

Traditional retrieval evaluation uses explicit relevance judgments which are expensive to collect. Relevance assessments inferred from implicit feedback such as click-through data can be collected inexpensively, but may be less reliable. We compare assessments derived from click-through data to another source of implicit feedback that we assume to be highly indicative of relevance: purchase decisions. Evaluating retrieval runs based on a log of an audio-visual archive, we find agreement between system rankings and purchase decisions to be surprisingly high.

Published In

SIGIR '10: Proceedings of the 33rd international ACM SIGIR conference on Research and development in information retrieval
July 2010, 944 pages
ISBN: 9781450301534
DOI: 10.1145/1835449
Publisher

Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. evaluation
    2. query log analysis

    Qualifiers

    • Poster

Conference

SIGIR '10

    Acceptance Rates

SIGIR '10 Paper Acceptance Rate: 87 of 520 submissions (17%)
Overall Acceptance Rate: 792 of 3,983 submissions (20%)

    Cited By

• (2016) Online Evaluation for Information Retrieval. Foundations and Trends in Information Retrieval, 10(1):1-117. DOI: 10.1561/1500000051. Online publication date: 1-Jun-2016.
• (2013) Fidelity, Soundness, and Efficiency of Interleaved Comparison Methods. ACM Transactions on Information Systems, 31(4):1-43. DOI: 10.1145/2536736.2536737. Online publication date: 1-Nov-2013.
• (2013) Pseudo test collections for training and tuning microblog rankers. Proceedings of the 36th international ACM SIGIR conference on Research and development in information retrieval, pages 53-62. DOI: 10.1145/2484028.2484063. Online publication date: 28-Jul-2013.
• (2012) Estimating interleaved comparison outcomes from historical click data. Proceedings of the 21st ACM international conference on Information and knowledge management, pages 1779-1783. DOI: 10.1145/2396761.2398516. Online publication date: 29-Oct-2012.
• (2012) On caption bias in interleaving experiments. Proceedings of the 21st ACM international conference on Information and knowledge management, pages 115-124. DOI: 10.1145/2396761.2396780. Online publication date: 29-Oct-2012.
• (2012) Content-Based Analysis Improves Audiovisual Archive Retrieval. IEEE Transactions on Multimedia, 14(4):1166-1178. DOI: 10.1109/TMM.2012.2193561. Online publication date: 1-Aug-2012.
• (2011) A probabilistic method for inferring preferences from clicks. Proceedings of the 20th ACM international conference on Information and knowledge management, pages 249-258. DOI: 10.1145/2063576.2063618. Online publication date: 24-Oct-2011.
• (2010) Validating query simulators. Proceedings of the 2010 international conference on Multilingual and multimodal information access evaluation: cross-language evaluation forum, pages 40-51. DOI: 10.5555/1889174.1889183. Online publication date: 20-Sep-2010.
• (2010) Today's and tomorrow's retrieval practice in the audiovisual archive. Proceedings of the ACM International Conference on Image and Video Retrieval, pages 18-25. DOI: 10.1145/1816041.1816045. Online publication date: 5-Jul-2010.
