The previous talk in the series is here. The next talk in the series is here.

This follows on from the previous talk, where I introduced the axioms for entropy. Here I discuss some further simple properties of entropy and how they follow from the axioms, using the continuity and maximization axioms for the first time. We recap the axioms, then prove that the entropy of a constant random variable is zero. We then show that composing a random variable with a function (that is, passing from X to f(X)) does not increase its entropy, and that the entropy of the uniform distribution on a set increases with the size of the set, in fact strictly. Finally, we note that the entropy of a non-constant random variable is strictly positive.
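The talk develops these properties axiomatically rather than from a formula, but a quick numerical sketch using the standard Shannon formula (an assumption here, not something taken from the talk itself) illustrates each of them:

```python
import math

def entropy(p):
    """Shannon entropy (base 2) of a probability vector; zero entries contribute nothing."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# A constant random variable has entropy zero.
assert entropy([1.0]) == 0

# The entropy of the uniform distribution on n points is log2(n),
# so it strictly increases with n.
assert entropy([1/2] * 2) == 1.0
assert entropy([1/4] * 4) == 2.0
assert entropy([1/8] * 8) == 3.0

# Composing X with a function f merges outcomes, pooling their probabilities,
# and this cannot increase the entropy.
p_X = [0.25, 0.25, 0.5]   # distribution of X
p_fX = [0.5, 0.5]         # f identifies the first two outcomes of X
assert entropy(p_fX) <= entropy(p_X)

# A non-constant random variable has strictly positive entropy.
assert entropy([0.9, 0.1]) > 0
```

The uniform values here are exact because 1/2, 1/4 and 1/8 are exactly representable in floating point; in general one should compare with a tolerance.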

This video was produced by Tim Gowers as part of his Part III course at the University of Cambridge. Printed notes for this course are available here.