Reliability Assessment of a Measure: The Kappa Statistic
DOI: https://doi.org/10.33393/gcnd.2016.738
Keywords: Cohen's Kappa, Inter-rater agreement, Intra-rater agreement, Reliability, Weighted Kappa
Abstract
The Kappa coefficient is a measure of inter-rater agreement. It quantifies the agreement observed between raters beyond that expected by chance and can range from -1 to 1. A value of zero indicates statistical independence (agreement no better than chance), while a value of 1 indicates perfect agreement between observers. The value of Kappa is influenced by the prevalence of the condition being evaluated: two observers can show high observed agreement yet obtain a low Kappa when the prevalence is very high or very low (the paradox of the Kappa statistic).
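As a brief illustration (not taken from the article), the standard Cohen's Kappa is computed as (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e the agreement expected by chance from the marginal totals. The sketch below uses a hypothetical 2x2 contingency table with a highly prevalent positive category to show the paradox mentioned in the abstract: 95% observed agreement, but a Kappa of only about 0.26. The function name and counts are illustrative assumptions.

```python
# Minimal sketch of Cohen's Kappa for two raters classifying the same items.
# The 2x2 counts below are hypothetical, chosen to illustrate the Kappa paradox.

def cohens_kappa(table):
    """Compute Cohen's Kappa from a square contingency table (list of lists)."""
    n = sum(sum(row) for row in table)
    k = len(table)
    # Observed agreement: proportion of items on the diagonal.
    p_o = sum(table[i][i] for i in range(k)) / n
    # Expected agreement by chance, from the marginal totals.
    row_totals = [sum(row) for row in table]
    col_totals = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_e = sum(row_totals[i] * col_totals[i] for i in range(k)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical table: the positive category is very prevalent.
# Rows = rater A (positive, negative); columns = rater B (positive, negative).
table = [[94, 3],
         [2, 1]]
# Observed agreement is (94 + 1) / 100 = 0.95, yet Kappa is about 0.26.
print(round(cohens_kappa(table), 2))
```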
License
Authors contributing to Giornale di Clinica Nefrologica e Dialisi (GCND) agree to publish their articles under the CC-BY-NC 4.0 license, which allows third parties to re-use the work without permission as long as the work is properly referenced and the use is non-commercial.