Inter-Rater Reliability Calculator

Calculate Cohen's Kappa to measure the agreement between two raters beyond what would be expected by chance.

Cohen's Kappa: 0.400
Observed Agreement: 70.0%
Chance Agreement: 50.0%
Total Observations: 100

[Chart: Cohen's Kappa vs. the "Rater 1 Yes, Rater 2 No" count]

How Cohen's Kappa Works

Cohen's Kappa measures inter-rater agreement while correcting for the agreement two raters would reach by chance alone. This makes it more informative than simple percent agreement, which can look high even when raters are effectively guessing.

Formula

Kappa = (P_observed - P_expected) / (1 - P_expected)
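As a minimal sketch of this formula in Python, the function below computes Kappa from the four cells of a 2x2 agreement table (the function name cohens_kappa and the cell labels a, b, c, d are illustrative, not part of the calculator):

```python
def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """Cohen's Kappa for two raters making binary (yes/no) judgments.

    a: both say yes             b: rater 1 yes, rater 2 no
    c: rater 1 no, rater 2 yes  d: both say no
    """
    n = a + b + c + d
    p_observed = (a + d) / n    # raters match on a + d of the n items
    p_r1_yes = (a + b) / n      # marginal rate of "yes" for rater 1
    p_r2_yes = (a + c) / n      # marginal rate of "yes" for rater 2
    # Chance agreement: both say yes independently, plus both say no independently.
    p_expected = p_r1_yes * p_r2_yes + (1 - p_r1_yes) * (1 - p_r2_yes)
    return (p_observed - p_expected) / (1 - p_expected)

print(round(cohens_kappa(40, 10, 20, 30), 3))  # 0.4, matching the worked example below
```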

Interpretation

  • < 0: Less than chance agreement
  • 0.00-0.20: Slight agreement
  • 0.21-0.40: Fair agreement
  • 0.41-0.60: Moderate agreement
  • 0.61-0.80: Substantial agreement
  • 0.81-1.00: Almost perfect agreement
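
These bands translate directly into a small lookup helper; the sketch below uses exactly the thresholds from the list above (the function name interpret_kappa is illustrative):

```python
def interpret_kappa(kappa: float) -> str:
    """Return the agreement label for a kappa value, per the bands above."""
    if kappa < 0:
        return "Less than chance agreement"
    for upper_bound, label in [
        (0.20, "Slight agreement"),
        (0.40, "Fair agreement"),
        (0.60, "Moderate agreement"),
        (0.80, "Substantial agreement"),
    ]:
        if kappa <= upper_bound:
            return label
    return "Almost perfect agreement"

print(interpret_kappa(0.400))  # Fair agreement
```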
Example Calculation

Two graders evaluate 100 essays: both say pass (40), both say fail (30), only Rater 1 passes (10), only Rater 2 passes (20).

1. Total: 40 + 30 + 10 + 20 = 100
2. P_observed = (40 + 30) / 100 = 0.70
3. P(R1 yes) = (40 + 10) / 100 = 0.50, P(R2 yes) = (40 + 20) / 100 = 0.60
4. P_expected = 0.50 x 0.60 + 0.50 x 0.40 = 0.30 + 0.20 = 0.50
5. Kappa = (0.70 - 0.50) / (1 - 0.50) = 0.20 / 0.50 = 0.400
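
As a cross-check, the same number can be reproduced with scikit-learn's cohen_kappa_score by expanding the four cell counts into one rating per essay (this assumes scikit-learn is installed; the list construction is just an illustration):

```python
from sklearn.metrics import cohen_kappa_score

# Expand the 2x2 table into one rating per essay for each grader:
# 40 both pass, 30 both fail, 10 only Rater 1 passes, 20 only Rater 2 passes.
rater1 = ["pass"] * 40 + ["fail"] * 30 + ["pass"] * 10 + ["fail"] * 20
rater2 = ["pass"] * 40 + ["fail"] * 30 + ["fail"] * 10 + ["pass"] * 20

print(cohen_kappa_score(rater1, rater2))  # ~0.4
```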
