When looking into the information content of messages, one comes across the so-called entropy again and again on the internet. When this is explored further, it is often described as a measure of the information content or uncertainty of a message. I am referring to Shannon's definition of entropy, which the German Wikipedia summarises as follows:
Claude Elwood Shannon defined the entropy $H$ of a discrete memoryless source (discrete random variable) $X$ over a finite alphabet of characters $Z=\left\{z_{1}, z_{2}, \ldots, z_{m}\right\}$ as follows: first, each character $z$ occurring with probability $p_{z}$ is assigned its information content $I(z)=-\log _{2} p_{z}$. The entropy per character is then defined as the expected value of the information content $$ \mathrm{H}_{1}=E[I]=\sum_{z \in Z} p_{z} I(z)=-\sum_{z \in Z} p_{z} \log _{2} p_{z} $$
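To make the definition concrete, here is a small Python sketch (the function name and the example distributions are my own, not from the quoted article) that evaluates $\mathrm{H}_{1}$ for a few character distributions:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_z p_z * log2(p_z) of a discrete distribution."""
    # Terms with p_z = 0 contribute nothing (lim p*log p -> 0), so skip them.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: both characters equally likely -> 1 bit per character
print(entropy([0.5, 0.5]))    # 1.0

# Heavily biased coin: the next character is almost predictable -> ~0.08 bits
print(entropy([0.99, 0.01]))  # ~0.081

# A single certain character: nothing new is learned -> 0 bits
print(entropy([1.0]))         # 0.0
```

These examples are what prompt my question: the entropy is largest exactly when the outcome is hardest to predict.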
For me, the notion of uncertainty, which is also sometimes used in this context, is problematic: I find it hard to see how uncertainty, which usually stems from a lack of information, and entropy, as a measure of information content, go together. How can one picture this in an intuitive way?