Principled Multi-Aspect Evaluation Measures of Rankings

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


  • Full text: Final published version, 1.4 MB, PDF document

Information Retrieval evaluation has traditionally focused on defining principled ways of assessing the relevance of a ranked list of documents with respect to a query. Several methods extend this type of evaluation beyond relevance, making it possible to evaluate different aspects of a document ranking (e.g., relevance, usefulness, or credibility) using a single measure (multi-aspect evaluation). However, these methods are either (i) tailor-made for specific aspects and do not extend to other types or numbers of aspects, or (ii) subject to theoretical anomalies, e.g., assigning the maximum score to a ranking where all documents are labelled with the lowest grade with respect to all aspects (e.g., not relevant, not credible, etc.). We present a theoretically principled multi-aspect evaluation method that can be used for any number, and any type, of aspects. A thorough empirical evaluation using up to 5 aspects and a total of 425 runs officially submitted to 10 TREC tracks shows that our method is more discriminative than, and overcomes the theoretical limitations of, the state of the art.
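The anomaly described above can be illustrated with a small sketch. The code below is not the paper's measure; it is a hypothetical naive multi-aspect score (averaging per-aspect normalized cumulative gain, with the common "0/0 = 1" convention) that exhibits exactly the failure the abstract criticizes: a ranking in which every document has the lowest grade on every aspect receives the maximum score.

```python
# Illustrative sketch only -- NOT the method proposed in the paper.
# A naive multi-aspect measure that averages per-aspect normalized
# cumulative gain.  When every document has the lowest grade on every
# aspect, the ideal gain is 0, and the common "define 0/0 as 1"
# normalization convention awards the maximum score: the anomaly.

def ncg(grades):
    """Normalized cumulative gain with a 1/rank discount (hypothetical helper)."""
    gain = sum(g / (i + 1) for i, g in enumerate(grades))
    ideal = sum(g / (i + 1) for i, g in enumerate(sorted(grades, reverse=True)))
    # The 0/0 -> 1 convention below is what produces the anomaly.
    return gain / ideal if ideal > 0 else 1.0

def naive_multi_aspect(ranking):
    """ranking: list of dicts mapping aspect name -> graded label."""
    aspects = list(ranking[0].keys())
    return sum(ncg([doc[a] for doc in ranking]) for a in aspects) / len(aspects)

# Every document is not relevant and not credible, yet the score is maximal.
all_lowest = [{"relevance": 0, "credibility": 0}] * 3
print(naive_multi_aspect(all_lowest))  # -> 1.0
```

A principled measure, by contrast, should score such a ranking at the bottom of the scale regardless of how the documents are ordered, which is the kind of theoretical guarantee the paper targets.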

Original language: English
Title of host publication: CIKM 2021 - Proceedings of the 30th ACM International Conference on Information and Knowledge Management
Publisher: Association for Computing Machinery, Inc
Publication date: 2021
ISBN (Electronic): 9781450384469
Publication status: Published - 2021
Event: 30th ACM International Conference on Information and Knowledge Management, CIKM 2021 - Virtual, Online, Australia
Duration: 1 Nov 2021 - 5 Nov 2021


Conference: 30th ACM International Conference on Information and Knowledge Management, CIKM 2021
City: Virtual, Online

Bibliographical note

Publisher Copyright:
© 2021 Owner/Author.

    Research areas

  • evaluation, multiple aspects, partial order, ranking


ID: 300918675