Entropy Calculator Formula

Understand the math behind the entropy calculator: each variable explained, with a worked example.

Formulas Used

Shannon Entropy (bits)

entropy = -(p1 * log2(p1) + p2 * log2(p2) + p3 * log2(p3))

Entropy (nats)

entropy_nats = -(p1 * log(p1) + p2 * log(p2) + p3 * log(p3))

Maximum Entropy (bits)

max_entropy = log2(3)

Entropy Ratio

efficiency = -(p1 * log2(p1) + p2 * log2(p2) + p3 * log2(p3)) / log2(3)
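The four formulas above can be sketched as a small Python function (the name `entropy_stats` is illustrative, not part of the calculator):

```python
import math

def entropy_stats(p1, p2, p3):
    """Return Shannon entropy (bits), entropy (nats), max entropy (bits),
    and the entropy ratio for three outcome probabilities."""
    probs = [p1, p2, p3]
    bits = -sum(p * math.log2(p) for p in probs)   # Shannon entropy
    nats = -sum(p * math.log(p) for p in probs)    # natural-log version
    max_bits = math.log2(3)                        # uniform over 3 outcomes
    efficiency = bits / max_bits                   # entropy ratio
    return bits, nats, max_bits, efficiency

bits, nats, max_bits, eff = entropy_stats(0.5, 0.3, 0.2)
print(round(bits, 3), round(eff, 3))  # 1.485 0.937
```

Note that the nats value is always the bits value times ln 2, since log(p) = log2(p) · ln 2.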

Variables

Variable | Description   | Default
p1       | Probability 1 | 0.5
p2       | Probability 2 | 0.3
p3       | Probability 3 | 0.2

How It Works

How to Calculate Shannon Entropy

Formula

H = -Sum(pi * log2(pi))

Shannon entropy measures the average information content (in bits) per outcome. Maximum entropy occurs when all outcomes are equally likely (uniform distribution). Entropy is zero when the outcome is certain (one probability = 1). Higher entropy means more uncertainty.
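These two boundary cases (uniform distribution gives maximum entropy, a certain outcome gives zero) can be checked with a short sketch; it works for any number of outcomes and uses the standard convention that terms with p = 0 contribute nothing:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)); terms with p = 0 contribute 0 by convention."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # uniform: 2.0 bits (= log2(4), the maximum)
print(shannon_entropy([1.0, 0.0, 0.0]))           # certain outcome: 0.0 bits
```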

Worked Example

Three outcomes with probabilities 0.5, 0.3, 0.2.

p1 = 0.5, p2 = 0.3, p3 = 0.2

H = -(0.5*log2(0.5) + 0.3*log2(0.3) + 0.2*log2(0.2))
  = -(0.5*(-1) + 0.3*(-1.737) + 0.2*(-2.322))
  = -(-0.5 - 0.521 - 0.464)
  = 1.485 bits

Max entropy = log2(3) = 1.585 bits
Efficiency = 1.485 / 1.585 = 0.937
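The worked example can be reproduced term by term in a few lines of Python:

```python
import math

# One term per outcome, matching the worked example above.
terms = [0.5 * math.log2(0.5),   # -0.5
         0.3 * math.log2(0.3),   # about -0.521
         0.2 * math.log2(0.2)]   # about -0.464
H = -sum(terms)                  # Shannon entropy in bits
max_H = math.log2(3)             # maximum entropy for 3 outcomes
print(round(H, 3), round(max_H, 3), round(H / max_H, 3))  # 1.485 1.585 0.937
```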

Ready to run the numbers?

Open Entropy Calculator