In this section we talk about entropy, a concept of fundamental importance in physics, mathematics and information theory.\footnote{This section was not covered in class and may be safely omitted.}
Entropy is a measure of randomness. For example, among the $\mbox{Ber}(p)$ distributions, the entropy is maximized at $p=1/2$ and minimized at $p=0\mbox{ or }1$. It quantifies the intuitive feeling that $\mbox{Ber}(1/2)$ is more random than $\mbox{Ber}(1/4)$.
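To see this concretely, assuming the standard definition of the entropy of a probability mass function, $H({\bf p})=-\sum_{i}p_{i}\log p_{i}$ (presumably the definition used in this section), the entropy of $\mbox{Ber}(p)$ is $$ H(p)=-p\log p-(1-p)\log(1-p), $$ which vanishes at $p=0$ and $p=1$ (with the convention $0\log 0=0$) and, since $H'(p)=\log\frac{1-p}{p}=0$ exactly when $p=1/2$, attains its maximum value $\log 2$ at $p=1/2$.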
Gibbs measures: Let ${\mathcal A}$ be a countable set and let ${\mathcal H}:{\mathcal A}\rightarrow \mathbb{R}$ be a given function. For any $E\in \mathbb{R}$, consider the set ${\mathcal P}_{E}$ of all probability mass functions on ${\mathcal A}$ under which ${\mathcal H}$ has expected value $E$. In other words, $$ {\mathcal P}_{E}:=\Big\{{\bf p}=\left(p_{i}\right)_{i\in {\mathcal A}}{\; : \;} \sum_{i\in {\mathcal A}} p_{i}\,{\mathcal H}(i)=E\Big\}. $$ ${\mathcal P}_{E}$ is non-empty if and only if ${\mathcal H}_{\min}\le E\le {\mathcal H}_{\max}$.
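As a quick sanity check on this definition (the specific choice of ${\mathcal A}$ and ${\mathcal H}$ below is purely illustrative), take ${\mathcal A}=\{0,1,2\}$ and ${\mathcal H}(i)=i$, so that ${\mathcal H}_{\min}=0$ and ${\mathcal H}_{\max}=2$. For $E=1$ the constraint reads $p_{1}+2p_{2}=1$, which is satisfied by many pmfs, e.g. $(0,1,0)$ and $(\tfrac12,0,\tfrac12)$, so ${\mathcal P}_{1}$ is non-empty (indeed infinite). For $E=3>{\mathcal H}_{\max}$, no pmf can satisfy $\sum_{i}p_{i}{\mathcal H}(i)=3$, so ${\mathcal P}_{3}$ is empty, consistent with the criterion above.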