
Conditional entropy $H(X\mid Y)$ tells us the average uncertainty that remains about a channel input $X$ after observing the channel output $Y$, and mutual information $I(X;Y)$ measures how much information about the channel input $X$ is obtained by observing the channel output $Y$.

Doesn't this mean that $H(X\mid Y)$ and $I(X;Y)$ complement each other, i.e. tell us the same thing? If so, then since mutual information is symmetric, $I(X;Y)=I(Y;X)$, shouldn't conditional entropy be symmetric as well, i.e. $H(X\mid Y)=H(Y\mid X)$?

– Dina Khaled
    Consider something more basic. Shouldn't the conditional probability $P(A\mid B)$ have the same value as the conditional probability $P(B\mid A)$? After all, they depend on $P(A\cap B)$ and $P(B\cap A)$ respectively, and we know that $P(A\cap B) = P(B\cap A)\,\ldots$ – Dilip Sarwate Jan 17 '14 at 15:02
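    To make that analogy concrete with some purely illustrative numbers: take $P(A)=\tfrac12$, $P(B)=\tfrac14$ and $P(A\cap B)=\tfrac18$. Then $P(A\mid B)=\frac{1/8}{1/4}=\tfrac12$ while $P(B\mid A)=\frac{1/8}{1/2}=\tfrac14$, so the two conditional probabilities differ even though $P(A\cap B)=P(B\cap A)$.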

2 Answers


No, because while the mutual information tells you how much uncertainty observing one random variable ($Y$) can remove from another random variable ($X$), the conditional entropy of $X$ given $Y$ tells you how much uncertainty remains in $X$ after using the information (namely $I(X;Y)$) that $Y$ provided:

$$H(X\mid Y)=H(X)-I(X;Y).$$

That's why the conditional entropy depends on the entropy before the observation while the mutual information does not: the mutual information is just the difference between the two entropies, before and after the observation. Since $H(X)$ and $H(Y)$ are in general different, $H(X\mid Y)=H(X)-I(X;Y)$ and $H(Y\mid X)=H(Y)-I(X;Y)$ are in general different too, even though $I(X;Y)=I(Y;X)$.
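
Here is a minimal numerical sketch of that point (the toy joint distribution below is purely illustrative): it computes the entropies from a small joint PMF and shows that the two conditional entropies differ while the mutual information is the same in either direction.

```python
import numpy as np

# Toy joint PMF p(x, y) for a binary input X (rows) and output Y (columns),
# chosen to be asymmetric so that H(X|Y) and H(Y|X) come out different.
p_xy = np.array([[0.50, 0.25],
                 [0.00, 0.25]])

def entropy(p):
    """Entropy in bits of a probability array (zero entries are skipped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)        # marginal of X
p_y = p_xy.sum(axis=0)        # marginal of Y

H_X, H_Y = entropy(p_x), entropy(p_y)
H_XY = entropy(p_xy.ravel())  # joint entropy H(X,Y)

H_X_given_Y = H_XY - H_Y      # H(X|Y) = H(X,Y) - H(Y)
H_Y_given_X = H_XY - H_X      # H(Y|X) = H(X,Y) - H(X)

I_XY = H_X - H_X_given_Y      # I(X;Y) = H(X) - H(X|Y)
I_YX = H_Y - H_Y_given_X      # I(Y;X) = H(Y) - H(Y|X)

print(f"H(X|Y) = {H_X_given_Y:.4f}, H(Y|X) = {H_Y_given_X:.4f}")  # 0.5000 vs 0.6887
print(f"I(X;Y) = {I_XY:.4f},  I(Y;X) = {I_YX:.4f}")               # both 0.3113
```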

– Gabizon

There's a cool Venn diagram from here. It shows clearly that $I(X;Y)$ is independent of the order of $X$ and $Y$.

[Venn diagram of $H(X)$, $H(Y)$ and their overlap $I(X;Y)$]
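
In equation form, the standard identities behind such a diagram are

$$I(X;Y) \;=\; H(X) - H(X\mid Y) \;=\; H(Y) - H(Y\mid X) \;=\; H(X) + H(Y) - H(X,Y),$$

so the central overlap is the same whichever variable you start from, while the two outer regions, $H(X\mid Y)$ and $H(Y\mid X)$, are different pieces of the picture and in general have different sizes.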

– Hugh Perkins