
I am studying a dynamical system that takes a list as its initial condition, and I want to analyze how Shannon's entropy evolves in this system. I know the maximum entropy (50) and the minimum (0). Purely random initial conditions have almost maximum entropy, which makes changes hard to analyze unless the entropy decreases. I therefore set up the list so that its initial entropy is 25 (the midpoint between the maximum and the minimum), leaving equal room for the entropy to move in either direction. Is this statistically sound?

Thanks in advance.

  • But what is the question precisely? Can you give more details? –  Jan 28 '11 at 08:02
  • What equation do you use to calculate the entropy? Like a "frequency" entropy? What constraints is the maximised entropy based on? I agree with @mbq, this question needs more details. – probabilityislogic Jan 28 '11 at 16:13
  • I use this: http://upload.wikimedia.org/math/a/2/f/a2f05485301595188046d986c8cdd705.png. It is probability-based, as it compares the frequency of a state to the probability of a state. The maximum entropy is simply the highest possible entropy for the list, and occurs when the frequency for the two states is equal. The minimum occurs when it is all one state. The specific question is whether it would be "correct" to set the frequencies in the initial condition to produce an entropy that is an average between the minimum and maximum, to allow for equal change in entropy in either direction. –  Jan 28 '11 at 21:10
  • Hang on, to have a maximum entropy of $50$, this means you have $n=e^{50}$ (a gazillion) categories? Is that right? And how many times will you observe the system? – probabilityislogic Jan 29 '11 at 17:44
  • No, a list of length 100 with 2 states is enough. $50=-\frac{50\cdot 0.5\log{0.5}}{\log{2}}-\frac{50\cdot 0.5\log{0.5}}{\log{2}}$ –  Jan 29 '11 at 18:49
  • I would have thought that if a list can only have 2 states, then its maximum entropy (in nats) is $\log(2)=0.693$ (or 1 bit). Or is it that each element in the list can have 2 states, for a total of $2^{100}$ possible lists? That has a maximum entropy of $100\log(2)$, which is not equal to $50$ (in bits or nats). Where does the $50$ come from in your equation, and why did you divide by $\log(2)$? Neither the $50$ nor the division by $\log(2)$ appears in the formula in your link. – probabilityislogic Jan 30 '11 at 02:47
  • Dividing by $\log{2}$ simply sets the base to 2, because $\log_b{x}=\frac{\log_e{x}}{\log_e{b}}$. As for the 50, I should have worded it better: each element of the list can take one of two states, and the list has length 100. So, assuming a frequency of 50 for each state (and hence a probability of 0.5), the formula simplifies to the one I gave above (a numeric check of this is sketched just after these comments). Am I doing something wrong? Thanks in advance. –  Jan 30 '11 at 05:49
  • No, with 50 coin tosses you do get to $2^{50}\approx 10^{15}$ possible configurations, so that's right. But what is the question? You said you set an initial value for the sequence you want to observe, but you didn't say why or how. – sesqu Mar 31 '11 at 19:11
  • If you have an evolution rule you want to study, you might have to look at the stationarity analytically, since you can't simulate with all one quadrillion starting conditions. That said, you can certainly try a few to possibly prove yourself wrong. Oh, and halfway to an entropy of 50 bits is an entropy of 49 bits, depending on what it is you're looking at. – sesqu Mar 31 '11 at 19:20
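
To make the numbers in the comments above concrete, here is a minimal Python sketch of the entropy measure as it is described there (each state's count multiplied by $-p\log_2{p}$, then summed over the two states), with the standard per-symbol Shannon entropy alongside for comparison. The function names are illustrative, and `op_entropy` is only one reading of the comments, not a standard definition.

```python
import numpy as np

def op_entropy(seq):
    """Entropy measure as described in the comment thread: for each of the
    two states, count * (-p * log2(p)), summed. A reading of the OP's
    description, not a textbook definition."""
    seq = np.asarray(seq)
    n = len(seq)
    total = 0.0
    for state in np.unique(seq):
        count = np.sum(seq == state)
        p = count / n
        total += -count * p * np.log2(p)
    return total

def shannon_entropy_bits(seq):
    """Standard per-symbol Shannon entropy in bits, for comparison."""
    _, counts = np.unique(seq, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

balanced = [0] * 50 + [1] * 50        # 50/50 split of the two states
all_same = [0] * 100                  # a single state
skewed   = [0] * 90 + [1] * 10

print(op_entropy(balanced))            # 50.0 -> the stated maximum
print(op_entropy(all_same))            # 0.0  -> the stated minimum
print(op_entropy(skewed))              # ~15.6
print(shannon_entropy_bits(balanced))  # 1.0 bit per symbol
```

Sweeping the split (e.g. over counts 0..100) with this sketch shows which initial frequencies land at the midpoint value of 25 under this measure.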

1 Answer


The "expected entropy" at a particular time step given a particular starting point is a well-defined quantity and you could certainly study it. There is no particular reason why you should favor the "median" entropy without knowing anything else about the system. You should do experiments with as diverse of a set of starting configurations as possible to get a better understanding of your system.

  • Also recall that any measurement of entropy is an estimate. Therefore, these estimates have an underlying distribution if the processes that give rise to them are random! (A small simulation illustrating this is sketched below.) – Néstor Jul 24 '12 at 00:05
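
A small simulation along those lines, as a sketch: draw many length-100 binary lists from a source with known entropy (i.i.d. fair bits, so the true per-symbol entropy is 1 bit) and look at the spread and bias of the plug-in estimates. The source, sample size, and names here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def shannon_entropy_bits(seq):
    """Plug-in (empirical) per-symbol Shannon entropy in bits."""
    _, counts = np.unique(seq, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Known source: 100 i.i.d. fair bits, so the true per-symbol entropy is 1 bit.
estimates = [shannon_entropy_bits(rng.integers(0, 2, size=100))
             for _ in range(10_000)]

print(np.mean(estimates))  # slightly below 1: the plug-in estimator is biased low
print(np.std(estimates))   # spread of the sampling distribution of the estimate
```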