Cohen's Kappa Calculator

Enter the agreement table for two raters (2–6 categories; the same categories on rows and columns). The tool computes the observed agreement (Po), expected agreement (Pe), Cohen's kappa, an approximate standard error, a 95% confidence interval, and an interpretation based on the Landis & Koch scale.


Formulas

\( P_o = \frac{\sum_i n_{ii}}{N} \)

\( P_e = \frac{\sum_i (n_{i+} \cdot n_{+i})}{N^2} \)

\( \kappa = \frac{P_o - P_e}{1 - P_e} \)

where \(n_{ii}\) are the diagonal counts (agreements), \(n_{i+}\) are the row totals, \(n_{+i}\) are the column totals, and \(N\) is the total number of rated items.
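
The approximate standard error and 95% confidence interval promised above are commonly obtained from the large-sample approximation below; whether the tool uses exactly this form is an assumption:

\( SE(\kappa) \approx \sqrt{\frac{P_o (1 - P_o)}{N (1 - P_e)^2}} \)

\( \text{95\% CI} = \kappa \pm 1.96 \cdot SE(\kappa) \)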

Landis & Koch (1977) scale (rule of thumb)

  • < 0: poor
  • 0.00–0.20: slight
  • 0.21–0.40: fair
  • 0.41–0.60: moderate
  • 0.61–0.80: substantial
  • 0.81–1.00: almost perfect

These thresholds are rules of thumb, not strict cutoffs; always report kappa together with Po, Pe, and the sample size, as in the sketch below.
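
As a concrete illustration, here is a minimal Python sketch of the full calculation. The function names and the example table are hypothetical, and the standard error uses the large-sample approximation given above:

```python
import numpy as np

def cohens_kappa(table):
    """Compute Po, Pe, kappa, approximate SE, and 95% CI from a
    square agreement table (rows: rater A, columns: rater B)."""
    table = np.asarray(table, dtype=float)
    n = table.sum()                    # N: total number of rated items
    po = np.trace(table) / n           # Po: observed agreement (diagonal / N)
    row = table.sum(axis=1)            # n_{i+}: row totals
    col = table.sum(axis=0)            # n_{+i}: column totals
    pe = np.sum(row * col) / n**2      # Pe: agreement expected by chance
    kappa = (po - pe) / (1 - pe)
    # Large-sample approximate SE (assumed form, see the formula above)
    se = np.sqrt(po * (1 - po) / (n * (1 - pe) ** 2))
    ci = (kappa - 1.96 * se, kappa + 1.96 * se)
    return po, pe, kappa, se, ci

def interpret(kappa):
    """Map kappa to the Landis & Koch (1977) labels."""
    if kappa < 0:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

# Example: a hypothetical 2x2 agreement table for two raters
po, pe, k, se, ci = cohens_kappa([[45, 5], [10, 40]])
print(f"Po={po:.3f} Pe={pe:.3f} kappa={k:.3f} SE={se:.3f} "
      f"95% CI=({ci[0]:.3f}, {ci[1]:.3f}) -> {interpret(k)}")
```

For this example table, Po = 0.85, Pe = 0.50, and kappa = 0.70, which the Landis & Koch scale labels "substantial".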