I've been trying to get the KL divergence for two lognormal distributions. I know what it is for the univariate case,

$$ D(f_i\|f_j)= \frac1{2\sigma_j^2}\left[(\mu_i-\mu_j)^2+\sigma_i^2-\sigma_j^2\right] + \ln \frac{\sigma_j}{\sigma_i}, $$

but I've not been able to find anything on the multivariate case.
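
For a quick sanity check of the univariate formula, here is a sketch that compares it against direct numerical integration (assuming SciPy; the parameter values are arbitrary examples, and note that `scipy.stats.lognorm` is parameterized as `s=σ`, `scale=exp(μ)`):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Arbitrary example parameters for f_i and f_j
mu_i, sigma_i = 0.3, 0.8
mu_j, sigma_j = -0.2, 1.1

# scipy's lognorm uses s=sigma and scale=exp(mu)
f_i = stats.lognorm(s=sigma_i, scale=np.exp(mu_i))
f_j = stats.lognorm(s=sigma_j, scale=np.exp(mu_j))

# Closed form from the question
closed = ((mu_i - mu_j)**2 + sigma_i**2 - sigma_j**2) / (2 * sigma_j**2) \
         + np.log(sigma_j / sigma_i)

# Direct numerical integration of f_i * log(f_i / f_j) over (0, inf)
numeric, _ = quad(lambda x: f_i.pdf(x) * (f_i.logpdf(x) - f_j.logpdf(x)),
                  0, np.inf)

print(closed, numeric)  # the two should agree to several decimal places
```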

as646

1 Answer


Assume $p(x)= \log\mathcal{N}(x\mid\mu_p,\Sigma_p)$ and $q(x)= \log\mathcal{N}(x\mid\mu_q,\Sigma_q)$, where $x\in \mathbb{R}^D$ and $D$ is the dimension of $x$, so $x = [x_1,x_2, \ldots ,x_D]^T.$

Based on the definition of the KL divergence we have: $$D_{KL}(p(x)\,\|\,q(x))=\int\limits_{\mathbb{R}^D} p(x)\log\dfrac{p(x)}{q(x)}\,dx=\int\limits_{\mathbb{R}^D}p(x)\log(p(x))\,dx - \int\limits_{\mathbb{R}^D}p(x)\log(q(x))\,dx=E_p\{\log(p(x))\} - E_p\{\log(q(x))\}.$$
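
As a sketch, both expectations can be estimated by Monte Carlo: sample from $p$ by exponentiating draws from $\mathcal{N}(\mu_p,\Sigma_p)$ and average the log-density difference (the helper below implements the lognormal log-density written out next; all parameter values are made-up examples):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 3  # example dimension
mu_p, mu_q = np.zeros(D), np.full(D, 0.5)
Sigma_p, Sigma_q = np.eye(D), 2.0 * np.eye(D)

def lognormal_logpdf(x, mu, Sigma):
    """Log-density of a multivariate lognormal: log(x) ~ N(mu, Sigma)."""
    y = np.log(x) - mu                                   # (n, D)
    _, logdet = np.linalg.slogdet(Sigma)
    quad_form = np.einsum('ni,ij,nj->n', y, np.linalg.inv(Sigma), y)
    return (-0.5 * len(mu) * np.log(2 * np.pi) - 0.5 * logdet
            - np.log(x).sum(axis=1) - 0.5 * quad_form)

# Sample from p: exponentiate multivariate normal draws
x = np.exp(rng.multivariate_normal(mu_p, Sigma_p, size=200_000))

kl_mc = np.mean(lognormal_logpdf(x, mu_p, Sigma_p)
                - lognormal_logpdf(x, mu_q, Sigma_q))
print(kl_mc)  # should be close to the closed form derived below (~0.477 here)
```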

Before taking the first expectation, let's write out $\log(p(x))$, where $\log(x)$ denotes the elementwise logarithm $[\log(x_1),\ldots,\log(x_D)]^T$: $$\log(p(x))=-\frac{D}{2}\log(2\pi)-\frac{1}{2}\log(|\Sigma_p|)-\sum\limits_{i=1}^D\log(x_i) - \frac{1}{2}(\log(x)-\mu_p)^T\Sigma_p^{-1}(\log(x)-\mu_p).$$

The important thing to remember is that if for example $z\sim\log\mathcal{N}(\mu,\Sigma)$ then $\log(z)\sim\mathcal{N}(\mu,\Sigma)$.

Now let's take the expectation of $\log(p(x))$ under $p$. Since $\log(x)\sim\mathcal{N}(\mu_p,\Sigma_p)$, we have $E_p\{\log(x_i)\}=\mu_p(i)$, and the expected quadratic form is $E_p\{(\log(x)-\mu_p)^T\Sigma_p^{-1}(\log(x)-\mu_p)\}=\operatorname{tr}(\Sigma_p^{-1}\Sigma_p)=\operatorname{tr}(I_D)$: $$E_p\{\log(p(x))\}=-\frac{D}{2}\log(2\pi)-\frac{1}{2}\log(|\Sigma_p|)-\sum\limits_{i=1}^D\mu_p(i) - \frac{1}{2}\operatorname{tr}(I_D).$$

Similarly for $E_p\{\log(q(x))\}$: $$E_p\{\log(q(x))\}=-\frac{D}{2}\log(2\pi)-\frac{1}{2}\log(|\Sigma_q|)-\sum\limits_{i=1}^D\mu_p(i)-\frac{1}{2}\operatorname{tr}(\Sigma_p\Sigma_q^{-1})-\frac{1}{2}(\mu_p-\mu_q)^T\Sigma_q^{-1}(\mu_p-\mu_q).$$
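
As a sketch, both expectation formulas can be checked against Monte Carlo averages (the setup mirrors the earlier snippet; parameters and helper names are my own examples):

```python
import numpy as np

rng = np.random.default_rng(2)
D = 3
mu_p, mu_q = np.zeros(D), np.full(D, 0.5)
Sigma_p, Sigma_q = np.eye(D), 2.0 * np.eye(D)

y = rng.multivariate_normal(mu_p, Sigma_p, size=200_000)  # y = log(x) under p

def expected_log_density(mu, Sigma):
    """Closed-form E_p[log q(x)] for q = logN(mu, Sigma), per the formulas above."""
    Sigma_inv = np.linalg.inv(Sigma)
    diff = mu_p - mu
    return (-0.5 * D * np.log(2 * np.pi)
            - 0.5 * np.linalg.slogdet(Sigma)[1]
            - mu_p.sum()                                  # sum_i E_p[log x_i]
            - 0.5 * np.trace(Sigma_inv @ Sigma_p)
            - 0.5 * diff @ Sigma_inv @ diff)

def mc_log_density(mu, Sigma):
    """Monte Carlo E_p[log q(x)], working directly with y = log(x)."""
    d = y - mu
    quad_form = np.einsum('ni,ij,nj->n', d, np.linalg.inv(Sigma), d)
    return np.mean(-0.5 * D * np.log(2 * np.pi)
                   - 0.5 * np.linalg.slogdet(Sigma)[1]
                   - y.sum(axis=1) - 0.5 * quad_form)

# With (mu_p, Sigma_p) the trace term reduces to tr(I_D) and the
# quadratic term vanishes, recovering the E_p{log(p(x))} formula.
print(expected_log_density(mu_p, Sigma_p), mc_log_density(mu_p, Sigma_p))
print(expected_log_density(mu_q, Sigma_q), mc_log_density(mu_q, Sigma_q))
```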

So for $D_{KL}(p\|q)$:

$$D_{KL}(p\|q)=\frac{1}{2}\log\dfrac{|\Sigma_q|}{|\Sigma_p|}-\frac{D}{2} + \frac{1}{2}\operatorname{tr}(\Sigma_p\Sigma_q^{-1})+\frac{1}{2}(\mu_p-\mu_q)^T\Sigma_q^{-1}(\mu_p-\mu_q),$$

since $\operatorname{tr}(I_D)=D$.
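
A minimal sketch of the closed form as a function (the function name and example parameters are my own choices, not from the post):

```python
import numpy as np

def kl_lognormal(mu_p, Sigma_p, mu_q, Sigma_q):
    """Closed-form KL(p || q) between multivariate lognormals."""
    D = len(mu_p)
    Sigma_q_inv = np.linalg.inv(Sigma_q)
    _, logdet_p = np.linalg.slogdet(Sigma_p)
    _, logdet_q = np.linalg.slogdet(Sigma_q)
    diff = mu_p - mu_q
    return 0.5 * (logdet_q - logdet_p - D
                  + np.trace(Sigma_p @ Sigma_q_inv)
                  + diff @ Sigma_q_inv @ diff)

# Example usage (matches the Monte Carlo estimates above, ~0.477)
D = 3
print(kl_lognormal(np.zeros(D), np.eye(D), np.full(D, 0.5), 2.0 * np.eye(D)))
```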

You can see that the KL divergence between two lognormal distributions is equal to the KL divergence between two normal distributions with the same parameters $\mu$ and $\Sigma$.
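
This is no coincidence: under $y=\log(x)$ the Jacobian term $\sum_i\log(x_i)$ shows up in both log-densities and cancels in the ratio, so the divergence is unchanged (KL is invariant under any invertible change of variables applied to both distributions). A sketch of the check, with example parameters:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
D = 3
mu_p, mu_q = np.zeros(D), np.full(D, 0.5)
Sigma_p, Sigma_q = np.eye(D), 2.0 * np.eye(D)

y = rng.multivariate_normal(mu_p, Sigma_p, size=200_000)  # y ~ N(mu_p, Sigma_p)
x = np.exp(y)                                             # x ~ logN(mu_p, Sigma_p)

# KL estimate in y-space (two Gaussians)
kl_normal = np.mean(multivariate_normal.logpdf(y, mu_p, Sigma_p)
                    - multivariate_normal.logpdf(y, mu_q, Sigma_q))

# KL estimate in x-space (two lognormals): the Jacobian term
# sum_i log(x_i) appears in both log-densities and cancels
jac = np.log(x).sum(axis=1)
kl_lognormal = np.mean((multivariate_normal.logpdf(np.log(x), mu_p, Sigma_p) - jac)
                       - (multivariate_normal.logpdf(np.log(x), mu_q, Sigma_q) - jac))

print(kl_normal, kl_lognormal)  # agree up to floating-point rounding
```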

nomadr
  • There is an error here. If z is distributed lognormal then it is exp(z) that is normal, not log(z). – Michael R. Chernick Dec 28 '17 at 02:12
  • @Michael Chernick I haven't checked the correctness of the answer, but you are the one who's mixed up. The log of a lognormal is normal. The exp of a normal is lognormal. – Mark L. Stone Dec 28 '17 at 02:15
  • Yes, the log of a lognormal is normal. I skipped some of the derivation in the answer because, after taking the log, the derivation is similar to the KL divergence between two Gaussians. – nomadr Dec 30 '17 at 01:17
  • Is there some crazy coincidence here, or is KL divergence transformation invariant in some interesting way? – John Madden Aug 07 '22 at 23:09