I am trying to measure the uniformity of multimodal distributions and am looking into using entropy.
I would like an entropy-like measure that is higher for the first distribution below (case A) than for the second (case B).
However, Shannon entropy, $H(p) = -\sum_i p_i \log p_i$, depends only on the multiset of probabilities and is therefore invariant under permutations of the bins, so the two distributions below have exactly the same Shannon entropy.
Is there any way to capture clumping in an entropy-like measure?
My thinking is that if each distribution evolves under the heat equation, the second one's Shannon entropy should take longer to converge to its maximum (that of the uniform distribution), so in that smoothed sense it has lower entropy (see the sketch after the table below).
Concretely, let there be 12 bins, 6 of which carry equal mass 1/6. Shannon entropy doesn't distinguish between case A, where all odd-indexed bins are filled, and case B, where bins [0, 3) and [9, 12) are filled:
| index | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A | 0 | 1/6 | 0 | 1/6 | 0 | 1/6 | 0 | 1/6 | 0 | 1/6 | 0 | 1/6 |
| B | 1/6 | 1/6 | 1/6 | 0 | 0 | 0 | 0 | 0 | 0 | 1/6 | 1/6 | 1/6 |
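Here is a minimal sketch of the heat-equation idea, assuming an explicit finite-difference diffusion with reflecting (zero-flux) boundaries; `diffuse` and `shannon_entropy` are helper names introduced for illustration, not an established API, and the step size and boundary treatment are one choice among several.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats; zero bins contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def diffuse(p, steps, alpha=0.25):
    """Explicit finite-difference heat equation with zero-flux boundaries."""
    p = p.astype(float)
    for _ in range(steps):
        padded = np.pad(p, 1, mode="edge")        # reflect mass at the ends
        lap = padded[:-2] - 2.0 * p + padded[2:]  # discrete Laplacian
        p = p + alpha * lap                       # total mass is conserved
    return p

A = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1]) / 6.0
B = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 1, 1, 1]) / 6.0

for t in (0, 1, 5, 20):
    print(f"t={t:2d}  H(A)={shannon_entropy(diffuse(A, t)):.3f}"
          f"  H(B)={shannon_entropy(diffuse(B, t)):.3f}")
```

At $t = 0$ both entropies equal $\log 6 \approx 1.79$. For $t > 0$ the alternating pattern A smooths toward uniform (entropy near $\log 12 \approx 2.48$) almost immediately, while the clumped pattern B lags behind, so its smoothed entropy stays lower for a while, which is exactly the ordering I want.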
> Right--but so what? The indexes of the bins are irrelevant. If they are relevant to you, then you do not need entropy; you need another concept. But in that case, what are you trying to characterize? (@whuber)
@whuber I thought this might already have been studied, and that you could point me to the right concept?