I'm struggling with a particular way of interpreting Shannon entropy. Cover & Thomas say that entropy is the expected value of the information content. So for a random variable $X$ with distribution $P(X)$, its entropy is given by $H(X) = \sum_{x\in \mathcal{X}}p(x)\log_2\frac{1}{p(x)}$.
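To make the definition concrete, here is a minimal Python sketch of how I understand $H(X)$; the distribution `p` is just a made-up example:

```python
import math

# Shannon entropy H(X) = sum_x p(x) * log2(1 / p(x)), in bits.
def shannon_entropy(p):
    return sum(px * math.log2(1.0 / px) for px in p.values() if px > 0)

# Made-up example distribution over three symbols.
p = {"a": 0.5, "b": 0.25, "c": 0.25}
print(shannon_entropy(p))  # 1.5 bits
```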
My problem is with calculating the entropy of a series of outcomes of $X$. If these outcomes are $x_1, x_2\in \mathcal{X}$, is it appropriate to compute the entropy of this set as $H(x_1, x_2) = \frac{1}{2}\log_2\frac{1}{p(x_1)} + \frac{1}{2}\log_2\frac{1}{p(x_2)}$? If this is not Shannon entropy, is it another known quantity? And if not, how should I calculate the entropy of such a set?
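For concreteness, this is the quantity I mean, again as a hedged sketch with the same made-up distribution `p` as above:

```python
import math

# Average information content (surprisal) of an observed sequence x_1, ..., x_n:
# (1/n) * sum_i log2(1 / p(x_i)).  This is the quantity in my formula above,
# not necessarily the Shannon entropy H(X) of the distribution itself.
def average_surprisal(outcomes, p):
    return sum(math.log2(1.0 / p[x]) for x in outcomes) / len(outcomes)

# Made-up distribution and a two-outcome sample (x_1 = "a", x_2 = "b").
p = {"a": 0.5, "b": 0.25, "c": 0.25}
print(average_surprisal(["a", "b"], p))  # (1 + 2) / 2 = 1.5 bits
```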
Edit: I also think it is not Shannon entropy, but I'm struggling with the fact that, within a set of occurrences, the probabilities will change, and it is still not clear to me how that affects the entropy.