Entropy Calculator Formula

Understand the math behind the entropy calculator, with each variable explained and a worked example.

Formulas Used

Shannon Entropy (bits)

entropy = -(p1 * log2(p1) + p2 * log2(p2) + p3 * log2(p3))

Entropy (nats)

entropy_nats = -(p1 * log(p1) + p2 * log(p2) + p3 * log(p3))

Maximum Entropy (bits)

max_entropy = log2(3)

Entropy Ratio (Efficiency)

efficiency = -(p1 * log2(p1) + p2 * log2(p2) + p3 * log2(p3)) / log2(3)
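
All four formulas can be checked in a few lines of Python. This is a minimal sketch using the default probabilities; the variable names are illustrative, not the calculator's internals:

import math

p = [0.5, 0.3, 0.2]  # p1, p2, p3 (defaults)

entropy_bits = -sum(pi * math.log2(pi) for pi in p)  # Shannon entropy (bits)
entropy_nats = -sum(pi * math.log(pi) for pi in p)   # entropy (nats)
max_entropy = math.log2(3)                           # maximum entropy for 3 outcomes
efficiency = entropy_bits / max_entropy              # entropy ratio

print(entropy_bits, entropy_nats, max_entropy, efficiency)
# ≈ 1.485 bits, 1.030 nats, 1.585 bits, 0.937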

Variables

Variable   Description                Default
p1         Probability of outcome 1   0.5
p2         Probability of outcome 2   0.3
p3         Probability of outcome 3   0.2

How It Works

How to Calculate Shannon Entropy

Formula

H = -Sum(pi * log2(pi))

Shannon entropy measures the average information content (in bits) per outcome. Maximum entropy occurs when all outcomes are equally likely (uniform distribution). Entropy is zero when the outcome is certain (one probability = 1). Higher entropy means more uncertainty.
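
Both boundary cases are easy to verify numerically. A short sketch (the shannon_entropy helper is hypothetical, written here for illustration):

import math

def shannon_entropy(probs):
    # convention: 0 * log2(0) = 0, so zero-probability terms are skipped
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1/3, 1/3, 1/3]))  # uniform: log2(3) ≈ 1.585 bits (maximum)
print(shannon_entropy([1.0, 0.0, 0.0]))  # certain outcome: 0.0 bits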

Worked Example

Three outcomes with probabilities 0.5, 0.3, 0.2.

p1 = 0.5, p2 = 0.3, p3 = 0.2

1. H = -(0.5*log2(0.5) + 0.3*log2(0.3) + 0.2*log2(0.2))
2. = -(0.5*(-1) + 0.3*(-1.737) + 0.2*(-2.322))
3. = -(-0.5 - 0.521 - 0.464)
4. = 1.485 bits
5. Max entropy = log2(3) = 1.585 bits
6. Efficiency = 1.485 / 1.585 = 0.937
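
The same steps can be reproduced in Python to confirm the rounded values above (a sketch, not the calculator's code):

import math

p1, p2, p3 = 0.5, 0.3, 0.2
h = -(p1 * math.log2(p1) + p2 * math.log2(p2) + p3 * math.log2(p3))
max_h = math.log2(3)

print(round(h, 3))          # 1.485 bits
print(round(max_h, 3))      # 1.585 bits
print(round(h / max_h, 3))  # 0.937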

Frequently Asked Questions

What does entropy measure intuitively?

Entropy quantifies surprise or unpredictability. A fair coin (H = 1 bit) is maximally uncertain for two outcomes; a loaded coin (H < 1) is more predictable. Intuitively, entropy is the average number of yes/no questions an optimal guessing strategy needs to determine the outcome.
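
A quick comparison of the two coins (the entropy_bits helper is hypothetical, shown only to illustrate the point):

import math

def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # fair coin: 1.0 bit (maximally uncertain)
print(entropy_bits([0.9, 0.1]))  # loaded coin: ≈ 0.469 bits (more predictable)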

Why use log base 2 vs. natural log?

Log base 2 gives entropy in bits (binary digits), which is natural for information theory and computing. Natural log gives entropy in nats. They differ by a constant factor: 1 bit = ln(2) nats ≈ 0.693 nats.
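
Converting between the two units is a single multiplication by ln(2), as this short sketch shows:

import math

h_bits = 1.485                 # entropy in bits (from the worked example)
h_nats = h_bits * math.log(2)  # ≈ 1.030 nats
h_back = h_nats / math.log(2)  # back to ≈ 1.485 bits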

How is entropy used in machine learning?

Decision trees use information gain (entropy reduction) to choose the best split. Cross-entropy loss is a standard loss function for classification. KL divergence (relative entropy) measures how one distribution differs from another.
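
As one concrete illustration, cross-entropy and KL divergence follow directly from the entropy formula above. A minimal sketch (the function names and the uniform model q are assumptions for the example):

import math

def entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

def cross_entropy(p, q):
    # average bits to encode outcomes drawn from p using a code built for q
    return -sum(x * math.log2(y) for x, y in zip(p, q) if x > 0)

def kl_divergence(p, q):
    # D(p || q) = cross_entropy(p, q) - entropy(p); always >= 0
    return cross_entropy(p, q) - entropy(p)

p = [0.5, 0.3, 0.2]   # true distribution
q = [1/3, 1/3, 1/3]   # model: uniform
print(round(kl_divergence(p, q), 3))  # ≈ 0.099 extra bits per outcome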

Ready to run the numbers?

Open Entropy Calculator