Cohen's Kappa Calculator

Enter the agreement table for two raters (same categories on rows and columns). The tool computes the observed agreement, expected agreement, Cohen's kappa, an approximate standard error, a 95% confidence interval, and an interpretation based on the Landis & Koch scale.

2–6 categories

Cohen's κ

Observed agreement (Po)

Expected agreement (Pe)

Interpretation

Formula

\( P_o = \frac{\sum_i n_{ii}}{N} \)

\( P_e = \frac{\sum_i (n_{i+} \cdot n_{+i})}{N^2} \)

\( \kappa = \frac{P_o - P_e}{1 - P_e} \)

where \(n_{ii}\) are the diagonal counts (agreements), \(n_{i+}\) are the row totals, \(n_{+i}\) are the column totals, and \(N\) is the total number of rated subjects.
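The tool also reports an approximate standard error and a 95% confidence interval, which the formulas above do not cover. A common large-sample approximation, assumed here (the calculator engine may use a different variance estimator), is:

\( \mathrm{SE}(\kappa) \approx \sqrt{\frac{P_o (1 - P_o)}{N (1 - P_e)^2}} \)

with the 95% confidence interval taken as \( \kappa \pm 1.96 \cdot \mathrm{SE}(\kappa) \).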

Landis & Koch (1977) scale (rule of thumb)

  • < 0: poor
  • 0.00–0.20: slight
  • 0.21–0.40: fair
  • 0.41–0.60: moderate
  • 0.61–0.80: substantial
  • 0.81–1.00: almost perfect

This scale is guidance only; always report kappa together with Po, Pe, and the sample size.
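The following is a minimal Python sketch of the computation described above, assuming the large-sample standard error given earlier; the function names and the 2×2 example table are hypothetical and not taken from the calculator engine.

import math

def cohens_kappa(table):
    # table: square list of rows; same categories on rows and columns.
    k = len(table)
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(table[i][j] for i in range(k)) for j in range(k)]
    po = sum(table[i][i] for i in range(k)) / n                   # observed agreement
    pe = sum(row_totals[i] * col_totals[i] for i in range(k)) / n ** 2  # expected agreement
    kappa = (po - pe) / (1 - pe)
    # Assumption: simple large-sample SE; the engine may use another estimator.
    se = math.sqrt(po * (1 - po) / (n * (1 - pe) ** 2))
    return po, pe, kappa, se, (kappa - 1.96 * se, kappa + 1.96 * se)

def landis_koch(kappa):
    # Landis & Koch (1977) rule-of-thumb interpretation.
    if kappa < 0:
        return "poor"
    for bound, label in [(0.20, "slight"), (0.40, "fair"),
                         (0.60, "moderate"), (0.80, "substantial")]:
        if kappa <= bound:
            return label
    return "almost perfect"

# Hypothetical 2x2 example: two raters classify 100 subjects.
table = [[45, 5],
         [10, 40]]
po, pe, kappa, se, (lo, hi) = cohens_kappa(table)
print(f"Po={po:.3f}  Pe={pe:.3f}  kappa={kappa:.3f}  SE={se:.3f}  "
      f"95% CI=[{lo:.3f}, {hi:.3f}]  -> {landis_koch(kappa)}")

For this hypothetical table the sketch prints Po = 0.850, Pe = 0.500, kappa = 0.700, SE = 0.071, and a 95% CI of [0.560, 0.840], interpreted as "substantial" on the Landis & Koch scale.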


Audit: Complete
Formula (LaTeX) + variables + units
This section shows the formulas used by the calculator engine, plus variable definitions and units.
Formula (extracted LaTeX)
\( P_o = \frac{\sum_i n_{ii}}{N} \)
\( P_e = \frac{\sum_i (n_{i+} \cdot n_{+i})}{N^2} \)
\( \kappa = \frac{P_o - P_e}{1 - P_e} \)
where \(n_{ii}\) are the diagonal counts (agreements), \(n_{i+}\) are the row totals, \(n_{+i}\) are the column totals, and \(N\) is the total number of rated subjects.
Variables and units
  • \(n_{ii}\): diagonal cell counts (agreements); unitless counts
  • \(n_{i+}\), \(n_{+i}\): row and column totals; unitless counts
  • \(N\): total number of rated subjects; unitless count
  • \(P_o\), \(P_e\): agreement proportions in [0, 1]; dimensionless
  • \(\kappa\): agreement coefficient in [−1, 1]; dimensionless
Sources (authoritative):
  • Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37–46.
  • Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174.
Changelog
Version: 0.1.0-draft
Last code update: 2026-01-19
0.1.0-draft · 2026-01-19
  • Initial audit spec draft generated from HTML extraction (review required).
  • Verify formulas match the calculator engine and convert any text-only formulas to LaTeX.
  • Confirm sources are authoritative and relevant to the calculator methodology.
Verified by Ugo Candido on 2026-01-19