
I want to calculate the KL divergence between a normal and an exponential random variable, i.e. $$D(P||Q) = ?\\ \;\; P=N(\mu,\sigma), \;\; Q=\mathrm{Exp}(\lambda)$$ My problem is that the two distributions have different domains: the domain of $P$ is $x\in \mathbb{R}$, while the domain of $Q$ is $x \in [0,\infty)$. Which domain should I integrate over? If I integrate over the domain of $P$, then $\log(Q(x)/P(x))$ is undefined for $x < 0$, where $Q(x) = 0$.

Say we want to calculate the KL divergence for $\mu = 1, \sigma = 2, \lambda = 1$ — what will the result be? I can calculate $D(Q||P)$, but that is not the same quantity.

1 Answer


In this case the KL divergence $D(P||Q)$ is indeed infinite. For the KL divergence to be finite, we need support$(P)\subseteq$ support$(Q)$, where "support" means the set on which the density is non-zero. Here $P$ places positive probability on $(-\infty, 0)$, where $Q$'s density is zero, so the integrand $P(x)\log\big(P(x)/Q(x)\big)$ blows up there. See discussion here (page 3).
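To make this concrete, here is a numerical sketch (using SciPy) for the question's parameters $\mu = 1, \sigma = 2, \lambda = 1$: $D(Q||P)$ comes out finite, while $P$ puts roughly 31% of its mass on $x < 0$, where $Q$'s density is zero — which is exactly what makes $D(P||Q)$ infinite.

```python
import numpy as np
from scipy import stats, integrate

# Question's parameters: P = N(mu, sigma), Q = Exp(lam)
mu, sigma, lam = 1.0, 2.0, 1.0
p = stats.norm(loc=mu, scale=sigma)
q = stats.expon(scale=1.0 / lam)

# D(Q||P): integrate q(x) * log(q(x)/p(x)) over support(Q) = [0, inf).
# The tail beyond 50 is numerically negligible, so we cut it off there.
kl_qp, _ = integrate.quad(
    lambda x: q.pdf(x) * np.log(q.pdf(x) / p.pdf(x)), 0, 50
)
print(kl_qp)  # ≈ 0.7371

# D(P||Q) diverges: P has positive mass where q(x) = 0,
# so p(x) * log(p(x)/q(x)) is +infinity on that region.
print(p.cdf(0))  # P(X < 0) ≈ 0.3085
```

The value 0.7371 matches the closed form $D(Q||P) = -1 + \log(\sigma\sqrt{2\pi}) + \mathbb{E}_Q[(X-\mu)^2]/(2\sigma^2) = \log(2\sqrt{2\pi}) - 0.875$ for these parameters.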

When this is a problem, researchers sometimes use the earth mover's (Wasserstein) distance or the Sinkhorn divergence instead, since these remain finite even when the supports differ.
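As a rough sketch of that alternative: SciPy's `scipy.stats.wasserstein_distance` computes the 1-Wasserstein (earth mover's) distance between two empirical samples, and it stays finite even though the supports differ. The sample size and seed below are arbitrary choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Draw samples from P = N(1, 2) and Q = Exp(1)
x = stats.norm(loc=1, scale=2).rvs(size=200_000, random_state=rng)
y = stats.expon(scale=1.0).rvs(size=200_000, random_state=rng)

# 1-Wasserstein (earth mover's) distance between the empirical samples;
# finite even though support(P) != support(Q)
w = stats.wasserstein_distance(x, y)
print(w)
```

Unlike the KL divergence, this compares the distributions through their CDFs, so a region where one density is zero contributes a finite amount rather than blowing up.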

user12075
    Note that this implies that if we calculated $D(Q||P)$ we would not have a problem, although how meaningful the result would be is open to question! – jbowman Sep 29 '18 at 18:45
  • Even if support(P) = support(Q), the KL divergence can still be infinite — see here – Rodvi May 22 '20 at 14:53