
# Cohen's Kappa

**Kappa = 0.400**

- Observed Agreement: 70.0 %
- Chance Agreement: 50.0 %
- Total Observations: 100



## How Cohen's Kappa Works

Cohen's Kappa measures inter-rater agreement while correcting for chance. It is more robust than simple percent agreement.

### Formula

**Kappa = (P_observed - P_expected) / (1 - P_expected)**

### Interpretation

- < 0: Less than chance agreement
- 0.01-0.20: Slight agreement
- 0.21-0.40: Fair agreement
- 0.41-0.60: Moderate agreement
- 0.61-0.80: Substantial agreement
- 0.81-1.00: Almost perfect agreement
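The formula and interpretation bands above can be sketched as two small helpers (the function names are illustrative, not from any particular library):

```python
def cohen_kappa(p_observed: float, p_expected: float) -> float:
    """Kappa = (P_observed - P_expected) / (1 - P_expected)."""
    return (p_observed - p_expected) / (1 - p_expected)

def interpret(kappa: float) -> str:
    """Map a kappa value to the interpretation bands listed above."""
    if kappa < 0:
        return "Less than chance agreement"
    if kappa <= 0.20:
        return "Slight agreement"
    if kappa <= 0.40:
        return "Fair agreement"
    if kappa <= 0.60:
        return "Moderate agreement"
    if kappa <= 0.80:
        return "Substantial agreement"
    return "Almost perfect agreement"

print(round(cohen_kappa(0.70, 0.50), 3))  # 0.4
print(interpret(0.400))                   # Fair agreement
```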

## Worked Example

Two graders evaluate 100 essays: both say pass (40), both say fail (30), only Rater 1 passes (10), only Rater 2 passes (20).

  1. Total: 40 + 30 + 10 + 20 = 100
  2. P_observed = (40 + 30) / 100 = 0.70
  3. P(R1 yes) = 50/100 = 0.50, P(R2 yes) = 60/100 = 0.60
  4. P_expected = 0.50 × 0.60 + 0.50 × 0.40 = 0.30 + 0.20 = 0.50
  5. Kappa = (0.70 - 0.50) / (1 - 0.50) = 0.20 / 0.50 = 0.400
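The five steps above can be reproduced directly from the raw counts; a minimal sketch (variable names are illustrative):

```python
# Counts from the example: both pass, both fail, only Rater 1 passes, only Rater 2 passes
both_yes, both_no, r1_only, r2_only = 40, 30, 10, 20

total = both_yes + both_no + r1_only + r2_only  # 100
p_observed = (both_yes + both_no) / total       # 0.70

p_r1_yes = (both_yes + r1_only) / total         # 0.50
p_r2_yes = (both_yes + r2_only) / total         # 0.60
# Chance agreement: both say yes by chance, plus both say no by chance
p_expected = p_r1_yes * p_r2_yes + (1 - p_r1_yes) * (1 - p_r2_yes)  # 0.50

kappa = (p_observed - p_expected) / (1 - p_expected)
print(round(kappa, 3))  # 0.4
```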

## Frequently Asked Questions

### When should I use Kappa instead of percent agreement?

Always use Kappa when reporting inter-rater reliability in research. Percent agreement inflates reliability by ignoring chance.

### What is a good Kappa value?

Values above 0.60 indicate substantial agreement. For high-stakes decisions, aim for 0.80 or higher.

### Can Kappa be used with more than two raters?

Cohen's Kappa is designed for two raters. For multiple raters, use Fleiss' Kappa instead.
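For three or more raters, Fleiss' Kappa applies the same chance-correction idea to a subjects × categories count table; a minimal pure-Python sketch (the function name and table layout are illustrative):

```python
def fleiss_kappa(table):
    """Fleiss' Kappa for a subjects x categories count table.

    table[i][j] = number of raters who assigned subject i to category j;
    every row must sum to the same number of raters.
    """
    n_subjects = len(table)
    n_raters = sum(table[0])
    n_categories = len(table[0])
    # Proportion of all assignments that fall in each category
    p_j = [sum(row[j] for row in table) / (n_subjects * n_raters)
           for j in range(n_categories)]
    # Observed agreement for each subject (pairs of raters who agree)
    p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in table]
    p_bar = sum(p_i) / n_subjects          # mean observed agreement
    p_e = sum(p ** 2 for p in p_j)         # expected chance agreement
    return (p_bar - p_e) / (1 - p_e)

# Three raters, perfect agreement on each of three subjects -> kappa = 1
print(fleiss_kappa([[3, 0], [0, 3], [3, 0]]))  # 1.0
```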
