The distribution of a uniform r.v. $X$ on an interval of width $\Delta$ is given as follows:
$$f(x) = \begin{cases} \frac{1}{\Delta} & 0 \le x \le \Delta \\ 0 & \text{otherwise} \end{cases}$$
The (differential) entropy is therefore:
$$h(X) = -\int_0^{\Delta} \frac{1}{\Delta} \log \frac{1}{\Delta} \, dx = \log \Delta$$
This means that as $\Delta \to \infty$, the entropy also goes to infinity, and as $\Delta \to 0$, the entropy approaches negative infinity.
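
For concreteness, here is a quick numerical check (a minimal Python sketch, assuming NumPy/SciPy are available; entropies are in nats) that $h(X) = \log \Delta$, which is negative whenever $\Delta < 1$:

```python
import numpy as np
from scipy.stats import uniform
from scipy.integrate import quad

for delta in [0.01, 0.5, 1.0, 10.0]:
    # Closed form from the derivation above: h(X) = log(delta), in nats.
    closed_form = np.log(delta)

    # Numerical integration of -f(x) log f(x) over the support (0, delta),
    # where f(x) = 1/delta is the uniform density.
    h_numeric, _ = quad(lambda x: -(1.0 / delta) * np.log(1.0 / delta), 0, delta)

    # SciPy's built-in differential entropy of the frozen distribution.
    h_scipy = float(uniform(loc=0, scale=delta).entropy())

    print(f"delta={delta:6.2f}  log(delta)={closed_form:+.4f}  "
          f"numeric={h_numeric:+.4f}  scipy={h_scipy:+.4f}")
```

For $\Delta < 1$ all three values agree and are negative, which is exactly the behavior the questions below are about.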
A couple of questions about the interpretation of this entropy:
- Does infinite entropy mean that we need infinite information to determine/represent the random variable?
- What does it mean to have negative entropy? What does an entropy of negative infinity mean? I thought that an entropy of zero meant that we need no information to determine/represent a distribution. Therefore, negative entropy does not make sense to me.

