# Cohen's Kappa Calculator: Formula
## How Cohen's Kappa Works
Cohen's Kappa measures inter-rater agreement while correcting for chance. It is more robust than simple percent agreement, which is inflated by any agreement the raters would reach just by guessing.
### Formula
**Kappa = (P_observed - P_expected) / (1 - P_expected)**

where P_observed is the proportion of items the two raters agree on, and P_expected is the agreement expected by chance, computed from each rater's marginal rates.
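As a sketch, the same computation in Python (the `cohens_kappa` helper and its NumPy-based signature are illustrative, not part of the calculator):

```python
import numpy as np

def cohens_kappa(confusion: np.ndarray) -> float:
    """Kappa from a square rater-by-rater confusion matrix.

    Rows are Rater 1's categories, columns are Rater 2's;
    cell (i, j) counts items Rater 1 put in i and Rater 2 in j.
    """
    total = confusion.sum()
    p_observed = np.trace(confusion) / total   # diagonal cells = agreements
    p_rater1 = confusion.sum(axis=1) / total   # Rater 1's marginal rates
    p_rater2 = confusion.sum(axis=0) / total   # Rater 2's marginal rates
    p_expected = float(p_rater1 @ p_rater2)    # chance agreement per category, summed
    return (p_observed - p_expected) / (1 - p_expected)
```

With the 2x2 matrix from the worked example below, `cohens_kappa(np.array([[40, 10], [20, 30]]))` returns 0.4.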
### Interpretation
- < 0: Less than chance agreement
- 0.01-0.20: Slight agreement
- 0.21-0.40: Fair agreement
- 0.41-0.60: Moderate agreement
- 0.61-0.80: Substantial agreement
- 0.81-1.00: Almost perfect agreement
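A small helper that maps a Kappa value onto the bands above (the function name is illustrative, and a value of exactly 0 is grouped with the slight-agreement band here):

```python
def interpret_kappa(kappa: float) -> str:
    """Map a Kappa value onto the agreement bands listed above."""
    if kappa < 0:
        return "Less than chance agreement"
    for upper, label in [
        (0.20, "Slight agreement"),
        (0.40, "Fair agreement"),
        (0.60, "Moderate agreement"),
        (0.80, "Substantial agreement"),
    ]:
        if kappa <= upper:
            return label
    return "Almost perfect agreement"
```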
### Worked Example
Two graders each rate 100 essays pass/fail: both say pass (40), both say fail (30), only Rater 1 says pass (10), only Rater 2 says pass (20).
- Total: 40 + 30 + 10 + 20 = 100
- P_observed = (40 + 30) / 100 = 0.70
- P(R1 pass) = 50/100 = 0.50, P(R2 pass) = 60/100 = 0.60, so P(R1 fail) = 0.50 and P(R2 fail) = 0.40
- P_expected = P(both pass by chance) + P(both fail by chance) = 0.50 × 0.60 + 0.50 × 0.40 = 0.30 + 0.20 = 0.50
- Kappa = (0.70 - 0.50) / (1 - 0.50) = 0.20 / 0.50 = 0.40, which sits at the top of the "fair agreement" band
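To double-check the arithmetic, a sketch using scikit-learn's `cohen_kappa_score` (assuming scikit-learn is installed; the two vectors simply replay the counts above as one label per essay, per rater):

```python
from sklearn.metrics import cohen_kappa_score

# Rebuild the 100 essays as per-rater label vectors
r1 = ["pass"] * 40 + ["fail"] * 30 + ["pass"] * 10 + ["fail"] * 20
r2 = ["pass"] * 40 + ["fail"] * 30 + ["fail"] * 10 + ["pass"] * 20

print(cohen_kappa_score(r1, r2))  # ≈ 0.40, matching the manual calculation
```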