# Cohen's Kappa
## How Cohen's Kappa Works

Cohen's Kappa measures inter-rater agreement while correcting for chance. It is more robust than simple percent agreement.

### Formula

**Kappa = (P_observed - P_expected) / (1 - P_expected)**

### Interpretation

- < 0: Less than chance agreement
- 0.01-0.20: Slight agreement
- 0.21-0.40: Fair agreement
- 0.41-0.60: Moderate agreement
- 0.61-0.80: Substantial agreement
- 0.81-1.00: Almost perfect agreement
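The formula above can be sketched in a few lines of Python. The function name `cohens_kappa` and its list-based interface are illustrative choices, not from any particular library:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's Kappa for two raters over the same items (illustrative sketch)."""
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    # P_observed: fraction of items where the two raters agree.
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # P_expected: chance agreement from the raters' marginal label frequencies.
    p_expected = sum(
        (rater1.count(k) / n) * (rater2.count(k) / n) for k in labels
    )
    return (p_observed - p_expected) / (1 - p_expected)
```

For example, two raters who agree on 70 of 100 binary judgments, with marginals of 50% and 60% "yes", yield a kappa of 0.40 rather than the raw 70% agreement.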
## Worked Example
Two graders evaluate 100 essays: both say pass (40), both say fail (30), only Rater 1 passes (10), only Rater 2 passes (20).
1. Total: 40 + 30 + 10 + 20 = 100
2. P_observed = (40 + 30) / 100 = 0.70
3. P(R1 yes) = 50/100 = 0.50, P(R2 yes) = 60/100 = 0.60
4. P_expected = 0.50 × 0.60 + 0.50 × 0.40 = 0.30 + 0.20 = 0.50
5. Kappa = (0.70 - 0.50) / (1 - 0.50) = 0.20 / 0.50 = 0.400
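The steps above can be checked directly in Python; the variable names are illustrative and mirror the cell counts from the example:

```python
# 2x2 table from the worked example: 100 essays, two graders.
both_pass, both_fail, only_r1, only_r2 = 40, 30, 10, 20

n = both_pass + both_fail + only_r1 + only_r2        # 100
p_observed = (both_pass + both_fail) / n             # 0.70
p_r1_yes = (both_pass + only_r1) / n                 # 0.50
p_r2_yes = (both_pass + only_r2) / n                 # 0.60
# Chance agreement on "yes" plus chance agreement on "no".
p_expected = p_r1_yes * p_r2_yes + (1 - p_r1_yes) * (1 - p_r2_yes)  # 0.50
kappa = (p_observed - p_expected) / (1 - p_expected)  # 0.40
```

So 70% raw agreement reduces to a kappa of 0.40 (fair agreement) once the 50% chance-agreement baseline is removed.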
## Frequently Asked Questions
### When should I use Kappa instead of percent agreement?
Always use Kappa when reporting inter-rater reliability in research. Percent agreement inflates reliability by ignoring chance.
### What is a good Kappa value?
Values above 0.60 indicate substantial agreement. For high-stakes decisions, aim for 0.80 or higher.
### Can Kappa be used with more than two raters?
Cohen's Kappa is designed for two raters. For multiple raters, use Fleiss' Kappa instead.