# Entropy Calculator: Formula
## How to Calculate Shannon Entropy
### Formula
**H = -Σ pᵢ · log₂(pᵢ)**
Shannon entropy measures the average information content, in bits, per outcome. It is maximized when all outcomes are equally likely (a uniform distribution over n outcomes gives H = log₂(n)) and drops to zero when the outcome is certain (a single outcome has probability 1). In general, higher entropy means more uncertainty.
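As a concrete reference, here is a minimal Python sketch of the formula. The function name `shannon_entropy` and the plain-list interface are illustrative choices, not part of any particular calculator's API.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a distribution, in bits.

    Zero-probability terms are skipped, following the usual
    convention that 0 * log2(0) = 0.
    """
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin, the maximum for 2 outcomes
print(shannon_entropy([1.0]))       # 0.0 bits: a certain outcome
```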
### Worked Example
Consider three outcomes with probabilities 0.5, 0.3, and 0.2 (a code check of these steps follows the list).
- H = -(0.5*log2(0.5) + 0.3*log2(0.3) + 0.2*log2(0.2))
- = -(0.5*(-1) + 0.3*(-1.737) + 0.2*(-2.322))
- = -(-0.5 - 0.521 - 0.464)
- = 1.485 bits
- Max entropy = log2(3) = 1.585 bits
- Efficiency = 1.485 / 1.585 = 0.937
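The same arithmetic can be verified with a few lines of Python (standard library only); the printed values match the rounded steps above.

```python
import math

probs = [0.5, 0.3, 0.2]

# Entropy: -(0.5*log2(0.5) + 0.3*log2(0.3) + 0.2*log2(0.2))
h = -sum(p * math.log2(p) for p in probs)

# Maximum entropy for 3 outcomes: the uniform distribution.
h_max = math.log2(len(probs))

print(f"H          = {h:.3f} bits")      # H          = 1.485 bits
print(f"H_max      = {h_max:.3f} bits")  # H_max      = 1.585 bits
print(f"efficiency = {h / h_max:.3f}")   # efficiency = 0.937
```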