Entropy Calculator Formula
Understand the math behind the entropy calculator. Each variable explained with a worked example.
Formulas Used
Shannon Entropy (bits)
entropy = -(p1 * log2(p1) + p2 * log2(p2) + p3 * log2(p3))
Entropy (nats)
entropy_nats = -(p1 * log(p1) + p2 * log(p2) + p3 * log(p3))
Maximum Entropy (bits)
max_entropy = log2(3)
Entropy Ratio
efficiency = -(p1 * log2(p1) + p2 * log2(p2) + p3 * log2(p3)) / log2(3)
Variables
| Variable | Description | Default |
|---|---|---|
| p1 | Probability 1 | 0.5 |
| p2 | Probability 2 | 0.3 |
| p3 | Probability 3 | 0.2 |
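The four formulas above can be sketched directly in Python using the default probabilities from the table (variable names mirror the formulas; this is an illustrative sketch, not the calculator's actual source):

```python
import math

# Default probabilities from the table above
p1, p2, p3 = 0.5, 0.3, 0.2

# Shannon entropy in bits (base-2 logarithm)
entropy = -(p1 * math.log2(p1) + p2 * math.log2(p2) + p3 * math.log2(p3))

# Entropy in nats (natural logarithm)
entropy_nats = -(p1 * math.log(p1) + p2 * math.log(p2) + p3 * math.log(p3))

# Maximum entropy for three outcomes (uniform distribution)
max_entropy = math.log2(3)

# Entropy ratio (efficiency)
efficiency = entropy / max_entropy

print(f"{entropy:.3f} bits, {entropy_nats:.3f} nats, efficiency {efficiency:.3f}")
```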
How It Works
How to Calculate Shannon Entropy
Formula
H = -Sum(pi * log2(pi))
Shannon entropy measures the average information content (in bits) per outcome. Maximum entropy occurs when all outcomes are equally likely (uniform distribution). Entropy is zero when the outcome is certain (one probability = 1). Higher entropy means more uncertainty.
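The general formula extends to any number of outcomes. A minimal sketch (the function name and the 0 * log 0 := 0 convention are our own choices, not from the source) that also demonstrates the two boundary cases described above:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)); zero probabilities contribute nothing (0 * log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over three outcomes -> maximum entropy, log2(3)
uniform = shannon_entropy([1/3, 1/3, 1/3])

# One certain outcome -> zero entropy
certain = shannon_entropy([1.0, 0.0, 0.0])
```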
Worked Example
Three outcomes with probabilities 0.5, 0.3, 0.2.
p1 = 0.5
p2 = 0.3
p3 = 0.2
- H = -(0.5*log2(0.5) + 0.3*log2(0.3) + 0.2*log2(0.2))
- = -(0.5*(-1) + 0.3*(-1.737) + 0.2*(-2.322))
- = -(-0.5 - 0.521 - 0.464)
- = 1.485 bits
- Max entropy = log2(3) = 1.585 bits
- Efficiency = 1.485 / 1.585 = 0.937
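The worked example can be checked step by step (a quick verification sketch; variable names are ours):

```python
import math

probs = [0.5, 0.3, 0.2]

# Per-outcome terms p * log2(p), matching the expansion above
terms = [p * math.log2(p) for p in probs]

H = -sum(terms)                  # approx 1.485 bits
max_H = math.log2(len(probs))    # approx 1.585 bits
efficiency = H / max_H           # approx 0.937
```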
Ready to run the numbers?
Open Entropy Calculator