Inter-Rater Reliability Calculator
Calculate Cohen's Kappa to measure the agreement between two raters beyond what would be expected by chance.
Cohen's Kappa: 0.400
Observed Agreement: 70.0 %
Chance Agreement: 50.0 %
Total Observations: 100
[Chart: Cohen's Kappa vs. the "Rater 1 Yes, Rater 2 No" count]
How Cohen's Kappa Works
Cohen's Kappa measures inter-rater agreement while correcting for the agreement that would occur by chance alone, which makes it more informative than simple percent agreement.
Formula
Kappa = (P_observed - P_expected) / (1 - P_expected)
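As a quick sanity check of the formula, here is a minimal Python sketch that computes Cohen's Kappa from the four cells of a 2x2 yes/no agreement table. The function and argument names are illustrative, not part of the calculator.

```python
def cohen_kappa(both_yes, both_no, only_r1_yes, only_r2_yes):
    """Cohen's Kappa for two raters giving binary (yes/no) ratings.

    Arguments are the four cells of the 2x2 agreement table.
    """
    n = both_yes + both_no + only_r1_yes + only_r2_yes
    p_observed = (both_yes + both_no) / n

    # Marginal probability that each rater says "yes"
    p_r1_yes = (both_yes + only_r1_yes) / n
    p_r2_yes = (both_yes + only_r2_yes) / n

    # Chance agreement: both say yes by chance, plus both say no by chance
    p_expected = p_r1_yes * p_r2_yes + (1 - p_r1_yes) * (1 - p_r2_yes)

    return (p_observed - p_expected) / (1 - p_expected)
```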
Interpretation
Commonly cited benchmarks (Landis & Koch, 1977):
- Below 0: poor (worse than chance)
- 0.00 - 0.20: slight
- 0.21 - 0.40: fair
- 0.41 - 0.60: moderate
- 0.61 - 0.80: substantial
- 0.81 - 1.00: almost perfect
On this scale, the calculator's result of 0.400 indicates fair agreement.
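To make the benchmark list above executable, a small helper (the name kappa_label is illustrative) can map a kappa value to its Landis & Koch label:

```python
def kappa_label(kappa):
    """Map a kappa value to the commonly cited Landis & Koch (1977) label."""
    if kappa < 0:
        return "poor (worse than chance)"
    for upper, label in [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
                         (0.80, "substantial"), (1.00, "almost perfect")]:
        if kappa <= upper:
            return label
    return "almost perfect"  # kappa cannot exceed 1.0, so this is defensive

print(kappa_label(0.400))  # fair
```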
Example Calculation
Two graders evaluate 100 essays: both say pass (40), both say fail (30), only Rater 1 passes (10), only Rater 2 passes (20). The code sketch after the steps below reproduces this calculation.
1. Total: 40 + 30 + 10 + 20 = 100
2. P_observed = (40 + 30) / 100 = 0.70
3. P(R1 yes) = (40 + 10) / 100 = 0.50, P(R2 yes) = (40 + 20) / 100 = 0.60
4. P_expected = P(both yes by chance) + P(both no by chance) = 0.50 x 0.60 + 0.50 x 0.40 = 0.30 + 0.20 = 0.50
5. Kappa = (0.70 - 0.50) / (1 - 0.50) = 0.20 / 0.50 = 0.400
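The cohen_kappa function sketched under the formula reproduces these steps. As an optional cross-check, scikit-learn's cohen_kappa_score gives the same value when the counts are expanded into per-essay labels (this assumes scikit-learn is installed):

```python
# Worked example: 40 both pass, 30 both fail,
# 10 passed only by Rater 1, 20 passed only by Rater 2.
kappa = cohen_kappa(both_yes=40, both_no=30, only_r1_yes=10, only_r2_yes=20)
print(round(kappa, 3))  # 0.4

# Cross-check with scikit-learn (1 = pass, 0 = fail)
from sklearn.metrics import cohen_kappa_score

rater1 = [1] * 40 + [0] * 30 + [1] * 10 + [0] * 20
rater2 = [1] * 40 + [0] * 30 + [0] * 10 + [1] * 20
print(round(cohen_kappa_score(rater1, rater2), 3))  # 0.4
```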