# Entropy Calculator Formula

## How to Calculate Shannon Entropy

### Formula

**H = -Sum(pi * log2(pi))**, where pi is the probability of outcome i and the sum runs over all outcomes.

Shannon entropy measures the average information content (in bits) per outcome. Maximum entropy occurs when all outcomes are equally likely (uniform distribution). Entropy is zero when the outcome is certain (one probability = 1). Higher entropy means more uncertainty.
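The definition above translates directly into a few lines of Python. This is a minimal sketch (the function name `shannon_entropy` is our own); outcomes with zero probability are skipped, since the limit of p·log2(p) as p approaches 0 is 0.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain outcome (one probability = 1) gives zero entropy.
print(shannon_entropy([1.0]))       # 0.0
# A uniform distribution over 4 outcomes gives the maximum, log2(4) = 2 bits.
print(shannon_entropy([0.25] * 4))  # 2.0
```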

### Worked Example

Consider three outcomes with probabilities 0.5, 0.3, and 0.2.

  1. H = -(0.5*log2(0.5) + 0.3*log2(0.3) + 0.2*log2(0.2))
  2. H = -(0.5*(-1) + 0.3*(-1.737) + 0.2*(-2.322))
  3. H = -(-0.5 - 0.521 - 0.464) = 1.485 bits
  4. Maximum entropy: log2(3) = 1.585 bits
  5. Efficiency: 1.485 / 1.585 = 0.937 (93.7% of maximum)