The distribution of a uniform r.v. X is given as follows:

$$f(x) = \begin{cases} \dfrac{1}{\Delta}, & 0 \le x \le \Delta \\[4pt] 0, & \text{otherwise} \end{cases}$$

The differential entropy is therefore:

$$h(X) = -\int_0^{\Delta} \frac{1}{\Delta}\log\frac{1}{\Delta}\,dx = \log\Delta.$$

This means that as $\Delta \to \infty$, the entropy also tends to infinity, and as $\Delta \to 0$, the entropy tends to $-\infty$.
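As a quick numerical sanity check, here is a minimal Python sketch (using the natural-log convention, so entropy is in nats) that integrates $-f\log f$ over $[0,\Delta]$ and compares the result with $\log\Delta$:

```python
import numpy as np
from scipy import integrate

# Differential entropy of Uniform(0, Delta):
#   h(X) = -∫_0^Delta f(x) log f(x) dx,  with f(x) = 1/Delta,
# which should come out to log(Delta) nats.
for delta in [0.5, 1.0, 2.0, 10.0]:
    f = 1.0 / delta  # constant density on [0, delta]
    h, _ = integrate.quad(lambda x: -f * np.log(f), 0.0, delta)
    print(f"Delta = {delta:5.1f}:  h = {h:+.4f} nats   log(Delta) = {np.log(delta):+.4f}")
```

Already at $\Delta = 0.5$ this prints roughly $-0.6931$ nats, i.e. $\log(1/2)$, which is exactly the negative-entropy situation question 2 below asks about.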

A couple of questions about the interpretation of this entropy:

  1. Does infinite entropy mean that we need infinite information to determine/represent the random variable?
  2. What does it mean to have negative entropy? What does an entropy of negative infinity mean? I thought that an entropy of zero meant that we need no information to determine/represent a distribution. Therefore, negative entropy does not make sense to me.
Tim K
  • Which distribution(s) do you suppose "need no information" for their representation?? – whuber Jan 04 '23 at 00:48
  • @whuber I was thinking of a discrete rv that takes on only one value in some alphabet or a continuous rv that has as its pdf the Dirac delta function. Just any rv that can take only one value in its alphabet. – Tim K Jan 04 '23 at 00:53
  • The answer to https://math.stackexchange.com/questions/1156404/entropy-of-a-uniform-distribution should answer your questions. – jbowman Jan 04 '23 at 01:04
  • Even when the distribution is that of a constant, you still need to specify which value it takes on. But trying to compare this differential entropy with the discrete entropy is problematic: the interpretations do not directly translate from one to the other. – whuber Jan 04 '23 at 15:15
  • For negative differential entropy see https://stats.stackexchange.com/questions/73881/when-is-the-differential-entropy-negative – kjetil b halvorsen Jan 10 '23 at 18:46

0 Answers