Cohen's kappa coefficient
Description
Cohen's kappa coefficient is a statistical measure of inter-rater agreement or inter-annotator agreement for qualitative (categorical) items. It is generally thought to be a more robust measure than a simple percent-agreement calculation, since κ takes into account the possibility of the agreement occurring by chance. Cohen's kappa measures the agreement between two raters who each classify N items into C mutually exclusive categories.
Related formulas
κ = (Pra − Pre) / (1 − Pre)
Variables
κ | Cohen's kappa coefficient (dimensionless)
Pra | The relative observed agreement among raters (dimensionless)
Pre | The hypothetical probability of chance agreement (dimensionless)
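As a worked illustration, the definitions above can be sketched in Python: Pra is the fraction of items on which the two raters agree, and Pre is estimated from each rater's marginal category frequencies. The function name and the example labels are hypothetical, chosen only for this sketch.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' category labels (illustrative helper)."""
    assert len(rater_a) == len(rater_b), "both raters must label the same N items"
    n = len(rater_a)
    # Pra: relative observed agreement among the raters
    pra = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Pre: hypothetical probability of chance agreement, from each
    # rater's marginal probability of assigning each category
    count_a = Counter(rater_a)
    count_b = Counter(rater_b)
    pre = sum(count_a[c] * count_b.get(c, 0) for c in count_a) / n ** 2
    return (pra - pre) / (1 - pre)

# Hypothetical data: 50 items, two categories; the raters agree on
# 20 "yes" and 15 "no" items, so Pra = 0.7 and Pre = 0.5.
rater_a = ["yes"] * 25 + ["no"] * 25
rater_b = ["yes"] * 20 + ["no"] * 5 + ["yes"] * 10 + ["no"] * 15
kappa = cohens_kappa(rater_a, rater_b)
print(round(kappa, 6))
```

In this example κ = (0.7 − 0.5) / (1 − 0.5) = 0.4, noticeably lower than the raw 70% agreement because half of that agreement is expected by chance alone.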