According to Wikipedia, the Jensen–Shannon divergence (JSD) is bounded in $[0,1]$ (with the base-2 logarithm; with the natural logarithm the bound is $\ln 2$). I am having trouble understanding why this is.
Let's say $p_1 \sim N(\mu_1,\sigma^2)$ and $p_2 \sim N(\mu_2,\sigma^2)$ (same standard deviation). Then

$$\mathrm{JSD}(p_1 \| p_2) = \frac{1}{4}\left(\frac{\left(\mu_1-\frac{\mu_1+\mu_2}{2}\right)^2}{\sigma^2}+\frac{\left(\mu_2-\frac{\mu_1+\mu_2}{2}\right)^2}{\sigma^2}\right).$$

I would expect that in the limit as $\sigma \rightarrow 0$, $\mathrm{JSD} \rightarrow \infty$.
However, while the above is a problem for Kullback–Leibler divergence (KLD), it is not a problem for JSD according to this post. So, what am I doing wrong? How is JSD bounded even in the above case?
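For concreteness, here is a small numerical sketch of my closed form (`jsd_closed_form` is just my own helper name); it clearly blows up as $\sigma \rightarrow 0$:

```python
def jsd_closed_form(mu1, mu2, sigma):
    """My closed form: the average of two Gaussian KL terms taken
    against N(mu_avg, sigma^2); the derivation is in the edit below."""
    mu_avg = 0.5 * (mu1 + mu2)
    kld1 = (mu1 - mu_avg) ** 2 / (2 * sigma ** 2)
    kld2 = (mu2 - mu_avg) ** 2 / (2 * sigma ** 2)
    return 0.5 * kld1 + 0.5 * kld2

for sigma in [1.0, 0.1, 0.01]:
    print(sigma, jsd_closed_form(0.0, 1.0, sigma))
# 1.0   0.125
# 0.1   12.5
# 0.01  1250.0   <- grows without bound as sigma -> 0
```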
Edit:
I arrived at this JSD formula as follows. Let $\mu_{avg} = \frac{1}{2}(\mu_1+\mu_2)$. I assumed that the mixture distribution is $N(\mu_{avg},\sigma^2)$, i.e. that it keeps the same standard deviation $\sigma$ (maybe that is wrong).
Using the Gaussian KLD formula here, with means $\mu_1$ (resp. $\mu_2$) and $\mu_{avg}$, and shared standard deviation $\sigma$, we get $\mathrm{KLD}_1 = \frac{(\mu_1-\mu_{avg})^2}{2\sigma^2}$ and $\mathrm{KLD}_2 = \frac{(\mu_2-\mu_{avg})^2}{2\sigma^2}$.
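Explicitly, since $\mu_1 - \mu_{avg} = \frac{\mu_1-\mu_2}{2}$ and $\mu_2 - \mu_{avg} = \frac{\mu_2-\mu_1}{2}$, the two terms coincide:

$$\mathrm{KLD}_1 = \frac{(\mu_1-\mu_2)^2}{8\sigma^2} = \mathrm{KLD}_2.$$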
Then I used $\mathrm{JSD} = \frac{1}{2}\mathrm{KLD}_1 + \frac{1}{2}\mathrm{KLD}_2$, which gives the formula above. I take it I went wrong somewhere in here?
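By contrast, when I approximate the definition-based JSD numerically against the true 50/50 mixture $m = \frac{1}{2}(p_1+p_2)$ (a rough grid-based sketch; `jsd_numeric` is my own helper, using the natural log, so the bound would be $\ln 2$), the value does stay bounded:

```python
import numpy as np
from scipy.stats import norm

def jsd_numeric(mu1, mu2, sigma, n=200_001):
    """Definition-based JSD against the true mixture m = (p1 + p2)/2,
    approximated by a Riemann sum on a dense grid (natural log)."""
    x = np.linspace(min(mu1, mu2) - 10 * sigma, max(mu1, mu2) + 10 * sigma, n)
    dx = x[1] - x[0]
    p1 = norm.pdf(x, mu1, sigma)
    p2 = norm.pdf(x, mu2, sigma)
    m = 0.5 * (p1 + p2)

    def kl(p):
        # convention 0 * log 0 = 0: skip grid points where p underflows
        mask = p > 0
        return np.sum(p[mask] * np.log(p[mask] / m[mask])) * dx

    return 0.5 * kl(p1) + 0.5 * kl(p2)

for sigma in [1.0, 0.1, 0.01]:
    print(sigma, jsd_numeric(0.0, 1.0, sigma))
# stays below ln 2 ~ 0.693 for every sigma, approaching it as sigma -> 0
```

So the discrepancy must come from somewhere in my derivation above.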