
Suppose we have samples of $Y = (1/\sqrt{N}) \sum_{i=1}^{N} X_i$, where each $X_i \sim \mathrm{Uniform}(-3, 3)$. Say we have 10,000 such samples of $Y$.

Why does the entropy of $Y$ increase when we increase $N$? Is it because there are more sources of uncertainty in the value of $Y$? What I mean is: when $N=3$, there are 3 uncertain $X_i$ values making up $Y$; when $N=10$, there are 10 uncertain (random) values of $X$ making up $Y$. Is that why the entropy of $Y$ increases with $N$?
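
For concreteness, here is a minimal simulation sketch of the effect (assuming NumPy; `entropy_hist` is a hypothetical helper implementing a rough histogram plug-in estimator, not a library function). The estimated differential entropy of $Y$ rises from $\log 6 \approx 1.79$ nats at $N=1$ toward a plateau just under $1.97$ nats:

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy_hist(samples, bins=200):
    """Rough plug-in estimate of differential entropy (nats) from a histogram."""
    counts, edges = np.histogram(samples, bins=bins)
    p = counts / counts.sum()      # bin probabilities
    width = edges[1] - edges[0]    # common bin width
    p = p[p > 0]                   # drop empty bins before taking logs
    return -np.sum(p * np.log(p)) + np.log(width)

n_samples = 100_000
for N in (1, 2, 3, 10, 100):
    # Y = (1/sqrt(N)) * sum of N iid Uniform(-3, 3) draws, sampled n_samples times
    X = rng.uniform(-3.0, 3.0, size=(n_samples, N))
    Y = X.sum(axis=1) / np.sqrt(N)
    print(f"N={N:4d}  estimated entropy ~ {entropy_hist(Y):.3f} nats")
```

The plateau exists because the variance of $Y$ is $3$ for every $N$ (see the comments below), so the entropy is capped by that of a normal distribution with variance $3$.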

– Curious
    Could you explain what you mean by "3 uncertain X values making up Y"? That doesn't seem remotely consistent with what you have described. – whuber Oct 26 '22 at 18:50
  • I assume you want the $X_i$ to be independent? Please say so, and address the other clarifications asked for. Pending those, contemplate that by the Central Limit Theorem your $Y$ will tend to a normal distribution with increasing $N$ (while its mean and variance do not depend on $N$), and that the normal distribution maximizes entropy among distributions with a given variance. See also https://stats.stackexchange.com/questions/210669/a-dynamical-systems-view-of-the-central-limit-theorem/210682#210682 – kjetil b halvorsen Oct 26 '22 at 19:09
  • @whuber Since $Y$ is calculated from $N$ values of $X$, each a random sample from a uniform distribution, my assumption is that we do not know what each sample will be; we only know its range. Hence there is some uncertainty there. Please correct me if my assumption is wrong, and it would be really helpful if you could point me in the right direction. – Curious Oct 26 '22 at 20:22
  • The range of $Y$ is from $-3\sqrt N$ to $+3\sqrt N,$ but that's of little use. You appear to be asking about the differential entropy of a continuous distribution. It will be extremely well approximated by the differential entropy of the Normal distribution with the same mean and variance. A formula for that is at https://stats.stackexchange.com/questions/415435. Because the variance of $Y$ is $3$ and its mean is $0,$ the answer will be extremely close to $\frac{1}{2}\log(2\pi) + \frac{1}{2} + \frac{1}{2}\log 3$ (checked numerically after these comments). – whuber Oct 26 '22 at 20:28
  • @kjetilbhalvorsen Thank you for your response. I am not able to understand how the normal distribution maximizes entropy, or why it saturates the entropy at some maximum value. Could you please help me understand this? – Curious Oct 26 '22 at 21:59
  • Maxent property of normal distribution: https://stats.stackexchange.com/questions/194051/prove-that-the-maximum-entropy-distribution-with-a-fixed-covariance-matrix-is-a, https://en.wikipedia.org/wiki/Normal_distribution#Maximum_entropy – kjetil b halvorsen Mar 05 '24 at 18:21
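
To illustrate the maximum-entropy property raised in the comments, here is a short numeric sketch (again assuming NumPy) comparing closed-form differential entropies of three mean-zero distributions that all share the variance $3$ of $Y$. The normal comes out largest, and its value matches the formula quoted above:

```python
import numpy as np

var = 3.0  # Var(Y) = 3 for every N

# Closed-form differential entropies (nats) of mean-0, variance-3 laws:
h_uniform = np.log(np.sqrt(12 * var))            # Uniform(-a, a): ln(2a), with a^2/3 = var
h_laplace = 1 + np.log(2 * np.sqrt(var / 2))     # Laplace(0, b): 1 + ln(2b), with 2b^2 = var
h_normal = 0.5 * np.log(2 * np.pi * np.e * var)  # Normal(0, var): (1/2) ln(2*pi*e*var)

print(h_uniform, h_laplace, h_normal)  # ~1.792 < ~1.896 < ~1.968 nats
```

So the CLT pushes $Y$ toward the one distribution that, for the fixed variance $3$, already has the largest possible entropy; that convergence, rather than a raw count of uncertain terms, is why the entropy grows with $N$.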

0 Answers