Fleiss kappa for doc-2-doc relevance assessment

Example dataset for Signposting tutorial

About this tutorial

One of the outputs of the TREC-doc-2-doc-relevance web app is a measure of inter-annotator agreement on the task of document-to-document relevance assessment, computed with Fleiss' kappa.

This page has been created to emulate the Zenodo landing page for the "Fleiss kappa for doc-2-doc relevance assessment" dataset so that we can add Signposting to it as part of the signposting-tutorial. For the purposes of the tutorial, imagine that the DOI below is registered to redirect to this page rather than to the real Zenodo entry.

Abstract

Here we present a table summarizing the Fleiss' kappa results. Fleiss' kappa was calculated to measure the degree of agreement between four annotators who evaluated the relevance of a set of documents (15 evaluation articles) with respect to their corresponding "reference article". The table contains 7 columns:

1. Topic (8 topics in total).
2. "Reference article", represented by its PubMed ID and organized by topic.
3. Fleiss' kappa result.
4. Interpretation of the Fleiss' kappa result: i) "Poor" for results < 0.20, ii) "Fair" for results within 0.21 - 0.40, and iii) "Moderate" for results within 0.41 - 0.60.
5. PubMed IDs of evaluation articles rated by the four annotators as "Relevant" with respect to the corresponding "reference article".
6. PubMed IDs of evaluation articles rated by the four annotators as "Partially relevant" with respect to the corresponding "reference article".
7. PubMed IDs of evaluation articles rated by the four annotators as "Non-relevant" with respect to the corresponding "reference article".
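The agreement measure described above can be sketched in plain Python. The rating counts in the example are hypothetical, chosen only to illustrate the calculation; they are not the dataset's actual annotations.

```python
from typing import Sequence

def fleiss_kappa(counts: Sequence[Sequence[int]]) -> float:
    """Fleiss' kappa for a subjects-by-categories matrix of rating counts.

    counts[i][j] = number of raters who assigned subject i to category j;
    every row must sum to the same number of raters n.
    """
    N = len(counts)          # number of subjects (evaluation articles)
    n = sum(counts[0])       # raters per subject (annotators)
    k = len(counts[0])       # number of rating categories

    # Mean observed agreement P-bar over the per-subject agreements P_i
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    # Chance agreement P_e from the marginal category proportions
    p_j = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(p * p for p in p_j)

    return (P_bar - P_e) / (1 - P_e)

# Hypothetical counts: rows = 5 evaluation articles, columns =
# (Relevant, Partially relevant, Non-relevant), 4 annotators per article.
ratings = [
    [4, 0, 0],
    [3, 1, 0],
    [0, 4, 0],
    [1, 2, 1],
    [0, 0, 4],
]
print(round(fleiss_kappa(ratings), 3))  # 0.593, i.e. "Moderate" agreement
```

With four annotators and the three categories used in this dataset, a kappa of 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate systematic disagreement.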

Metadata

Title
Fleiss kappa for doc-2-doc relevance assessment
Type
Data set
DOI
10.5281/zenodo.7338056
Keywords
Fleiss kappa
Inter-annotator agreement
Authors
Olga Giraldo
Dhwani Solanki
Dietrich Rebholz-Schuhmann
Leyla Jael Castro
License
Creative Commons Attribution 4.0 International
Published
November 19, 2022

Download

Acknowledgements

This work is part of the STELLA project funded by DFG (project no. 407518790). This work was supported by the BMBF-funded de.NBI Cloud within the German Network for Bioinformatics Infrastructure (de.NBI) (031A532B, 031A533A, 031A533B, 031A534A, 031A535A, 031A537A, 031A537B, 031A537C, 031A537D, 031A538A).

Export

Metadata can be downloaded as: Bioschemas JSON-LD, schema.org JSON-LD, BibTeX, Datacite XML, Dublin Core XML
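Signposting advertises links to such metadata exports as typed HTTP Link headers on the landing page. A minimal sketch of parsing a Link header with the Python standard library; the header value below is illustrative, not this page's actual headers:

```python
import re

def parse_link_header(value: str) -> list[dict]:
    """Split an HTTP Link header value into dicts of url plus parameters."""
    links = []
    # Each link looks like: <URL>; param1="v1"; param2="v2"
    for match in re.finditer(r'<([^>]*)>((?:\s*;\s*\w+="[^"]*")*)', value):
        link = {"url": match.group(1)}
        for key, val in re.findall(r'(\w+)="([^"]*)"', match.group(2)):
            link[key] = val
        links.append(link)
    return links

# Illustrative header value in the style a Signposting-enabled landing
# page might send; the URLs are placeholders, not the real ones.
header = (
    '<https://example.org/metadata.jsonld>; rel="describedby"; '
    'type="application/ld+json", '
    '<https://example.org/dataset.csv>; rel="item"; type="text/csv"'
)
for link in parse_link_header(header):
    print(link["rel"], "->", link["url"])
```

A client following Signposting would fetch the landing page, read these rel="describedby" and rel="item" links from the response headers, and retrieve the machine-readable metadata or data files directly.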