Reliability Assessment of a Measure: The Kappa Statistic
DOI: https://doi.org/10.33393/gcnd.2016.738

Keywords: Cohen's Kappa, Inter-rater agreement, Intra-rater agreement, Reliability, Weighted Kappa

Abstract
The Kappa coefficient is a measure of inter-rater agreement. It quantifies the agreement between observers beyond that expected by chance and can range from -1 to 1: a value of 0 indicates statistical independence (agreement no better than chance), and a value of 1 indicates perfect agreement between observers. The value of Kappa is influenced by the prevalence of the condition under evaluation: two observers can show high observed agreement yet obtain a low Kappa when the prevalence is very high or very low (the paradox of the Kappa statistic).
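As a minimal illustrative sketch (not taken from the article), the following Python snippet computes Cohen's Kappa from a square contingency table using the standard definition kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance from the marginal totals. The two example tables and their counts are assumptions chosen only to reproduce the prevalence paradox described above: both show 90% observed agreement, but Kappa collapses when one category dominates.

```python
def cohens_kappa(table):
    """Cohen's Kappa for two raters from a square contingency table (list of lists)."""
    n = sum(sum(row) for row in table)
    k = len(table)
    # Observed agreement: proportion of cases on the diagonal
    p_o = sum(table[i][i] for i in range(k)) / n
    # Chance agreement: sum of products of the marginal proportions
    row_tot = [sum(table[i]) for i in range(k)]
    col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_e = sum(row_tot[i] * col_tot[i] for i in range(k)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Balanced prevalence: 90% observed agreement, Kappa is high
balanced = [[45, 5],
            [5, 45]]
print(round(cohens_kappa(balanced), 2))  # 0.8

# Very skewed prevalence: the same 90% observed agreement,
# but Kappa drops to about -0.05 (the paradox of the Kappa statistic)
skewed = [[90, 5],
          [5, 0]]
print(round(cohens_kappa(skewed), 2))  # -0.05
```

The drop in the second case arises because chance agreement (p_e) is already very high when nearly all cases fall in one category, leaving little room for agreement beyond chance.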