Free Entropy Calculator

Calculate Shannon entropy, which measures the uncertainty or information content of a probability distribution.

Calculator output for the example distribution below (probabilities 0.5, 0.3, 0.2):

Shannon Entropy (bits): 1.485475
Entropy (nats): 1.029653
Maximum Entropy (bits): 1.584963
Entropy Ratio: 0.937231

[Chart: Shannon Entropy (bits) vs Probability 1]

How to Calculate Shannon Entropy

Formula

H = -Σᵢ pᵢ · log₂(pᵢ)

where pᵢ is the probability of outcome i and the sum runs over all possible outcomes.

Shannon entropy measures the average information content (in bits) per outcome. Maximum entropy occurs when all outcomes are equally likely (uniform distribution). Entropy is zero when the outcome is certain (one probability = 1). Higher entropy means more uncertainty.
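As a sanity check, here is a minimal Python sketch that computes the same four quantities the calculator reports: entropy in bits, entropy in nats, maximum entropy, and the entropy ratio. The function names `shannon_entropy` and `entropy_report` are our own, not part of any library or of the calculator itself.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy of a discrete distribution, in bits.

    Zero-probability terms are skipped, since p * log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropy_report(probs):
    """Compute the four quantities shown by the calculator.

    Assumes at least two outcomes, so the maximum entropy is nonzero.
    """
    h_bits = shannon_entropy(probs)
    h_max = math.log2(len(probs))  # entropy of the uniform distribution
    return {
        "entropy_bits": h_bits,
        "entropy_nats": h_bits * math.log(2),  # 1 bit = ln(2) nats
        "max_entropy_bits": h_max,
        "entropy_ratio": h_bits / h_max,       # 1.0 means perfectly uniform
    }
```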

Example Calculation

Three outcomes with probabilities 0.5, 0.3, 0.2.

  1. H = -(0.5·log₂(0.5) + 0.3·log₂(0.3) + 0.2·log₂(0.2))
  2. = -(0.5·(-1) + 0.3·(-1.737) + 0.2·(-2.322))
  3. = -(-0.5 - 0.521 - 0.464)
  4. = 1.485 bits
  5. Max entropy = log₂(3) = 1.585 bits
  6. Efficiency (entropy ratio) = 1.485 / 1.585 = 0.937
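Running the sketch from above on this distribution reproduces the calculator's output (rounded to six decimal places):

```python
>>> from pprint import pprint
>>> pprint({k: round(v, 6) for k, v in entropy_report([0.5, 0.3, 0.2]).items()})
{'entropy_bits': 1.485475,
 'entropy_nats': 1.029653,
 'entropy_ratio': 0.937231,
 'max_entropy_bits': 1.584963}
```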


Learn More

Understanding the Normal Distribution

Learn what the normal distribution is, why it matters in statistics, and how to use the bell curve for probability calculations, z-scores, and real-world data analysis.
