
Let $p$ and $q$ be two distributions on a variable $X$. Let $\widetilde{p}$ and $\widetilde{q}$ be the corresponding distributions on $f(X)$, where $f$ is a strictly monotonic function (e.g. $f(x)=e^x$).

Then is it the case that $D_{KL}(p \Vert q)=D_{KL}(\widetilde{p} \Vert \widetilde{q})$? In other words, is the KL divergence invariant to strictly monotonic transformations of the variable $X$?

I can see this easily in the discrete case, and this answer verifies it for the log-normal distribution case, but I was wondering if it held in general for a continuous (and even multivariate) $X$.
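A sketch of the continuous one-dimensional case (my own working, not from the original post), assuming $f$ is strictly increasing and differentiable with differentiable inverse: the Jacobian factor appears in both transformed densities, so it cancels inside the logarithm.

```latex
% With y = f(x), the change-of-variables formula gives
%   \widetilde{p}(y) = p(f^{-1}(y)) \left| \frac{d f^{-1}(y)}{dy} \right|,
% and the same Jacobian factor for \widetilde{q}. Then
\begin{align}
D_{KL}(\widetilde{p} \Vert \widetilde{q})
  &= \int \widetilde{p}(y) \log \frac{\widetilde{p}(y)}{\widetilde{q}(y)} \, dy \\
  &= \int p(f^{-1}(y)) \left| \frac{d f^{-1}(y)}{dy} \right|
     \log \frac{p(f^{-1}(y))}{q(f^{-1}(y))} \, dy \\
  &= \int p(x) \log \frac{p(x)}{q(x)} \, dx
   = D_{KL}(p \Vert q),
\end{align}
% substituting x = f^{-1}(y) in the last line. The multivariate case looks
% the same with |\det Df^{-1}(y)| in place of the scalar derivative, for
% f a diffeomorphism.
```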

chausies
    Well, just do the transformation, use the Jacobian rule, and see what happens. Coming at this from the point of view of Shannon entropy: this is why the naive generalisation of discrete Shannon entropy doesn't work, and why in the continuous case you need to consider two distributions – innisfree May 03 '20 at 09:35
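Following the comment's suggestion, here is a numerical sanity check (my own sketch, not from the post): KL divergence between two normals on $X$ should match the KL divergence between the corresponding log-normals on $Y=e^X$. The parameter values and grid bounds are arbitrary choices for the check.

```python
import numpy as np

def norm_pdf(x, mu, s):
    # density of N(mu, s^2)
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def lognorm_pdf(y, mu, s):
    # density of Y = exp(X) for X ~ N(mu, s^2): the Jacobian rule gives a 1/y factor
    return norm_pdf(np.log(y), mu, s) / y

def trapz(f, x):
    # plain trapezoidal rule (avoids depending on a particular numpy version)
    return np.sum((f[1:] + f[:-1]) * np.diff(x)) / 2.0

mu1, s1 = 0.0, 1.0   # parameters of p (arbitrary)
mu2, s2 = 1.0, 1.5   # parameters of q (arbitrary)

# KL on X, integrated over a range that covers both tails comfortably
xs = np.linspace(-12.0, 14.0, 400001)
p, q = norm_pdf(xs, mu1, s1), norm_pdf(xs, mu2, s2)
kl_x = trapz(p * np.log(p / q), xs)

# KL on Y = exp(X), on the geometric grid carrying the same probability mass
ys = np.exp(xs)
pt, qt = lognorm_pdf(ys, mu1, s1), lognorm_pdf(ys, mu2, s2)
kl_y = trapz(pt * np.log(pt / qt), ys)

# closed form for KL between two normals, as an independent reference
kl_closed = np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

print(kl_x, kl_y, kl_closed)  # all three should agree closely
```

The two numerical values agree with each other and with the closed form, consistent with the Jacobian factors cancelling inside the log ratio.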

0 Answers