The Kappa Statistic: A Second Look

Bibliographic Details
Published in: Computational Linguistics (Association for Computational Linguistics), 2004-03, Vol. 30 (1), pp. 95-101
Main Authors: Di Eugenio, Barbara; Glass, Michael
Format: Article
Language: English
Description
Summary: In recent years, the kappa coefficient of agreement has become the de facto standard for evaluating intercoder agreement for tagging tasks. In this squib, we highlight issues that affect κ and that the community has largely neglected. First, we discuss the assumptions underlying different computations of the expected agreement component of κ. Second, we discuss how prevalence and bias affect the κ measure.
ISSN: 0891-2017, 1530-9312
DOI: 10.1162/089120104773633402
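
The summary above contrasts different computations of the expected-agreement component of κ. As a rough illustration only, the sketch below computes two standard chance-agreement estimates for two coders: a Cohen-style estimate based on each coder's own label distribution and a Siegel & Castellan-style estimate based on the pooled distribution. The formulas and the toy data are assumptions for illustration, not material reproduced from this squib.

```python
from collections import Counter

def kappa_two_coders(labels_a, labels_b):
    """Observed agreement and two chance-corrected kappas for two coders."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)

    # Observed agreement: fraction of items the two coders label identically.
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n

    dist_a = Counter(labels_a)  # coder A's label counts
    dist_b = Counter(labels_b)  # coder B's label counts

    # Cohen-style expected agreement: product of each coder's own marginals.
    p_exp_cohen = sum((dist_a[c] / n) * (dist_b[c] / n) for c in categories)

    # Siegel & Castellan-style expected agreement: squared pooled marginals.
    p_exp_sc = sum(((dist_a[c] + dist_b[c]) / (2 * n)) ** 2 for c in categories)

    kappa_cohen = (p_obs - p_exp_cohen) / (1 - p_exp_cohen)
    kappa_sc = (p_obs - p_exp_sc) / (1 - p_exp_sc)
    return p_obs, kappa_cohen, kappa_sc

# Toy skewed ("prevalent") data: one label dominates for both coders.
coder_a = ["yes"] * 50 + ["no"] * 10
coder_b = ["yes"] * 45 + ["no"] * 5 + ["yes"] * 5 + ["no"] * 5
print(kappa_two_coders(coder_a, coder_b))
```

On this skewed toy example the observed agreement is high (about 0.83) while both κ variants drop to roughly 0.4, the kind of prevalence effect the squib examines. With identical marginals, as here, the two expected-agreement estimates coincide; they diverge when the coders' label distributions differ, i.e. under annotator bias.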