
Entropy is defined as

$$H = - \int_\chi p(x) \log p(x)\, dx$$

The Cauchy Distribution is defined as

$$f(x) = \frac{\gamma}{\pi}\, \frac{1}{\gamma^2 + x^2}$$

Could you please show the steps to calculate the entropy of a Cauchy distribution, which is

$$\log(4 \pi \gamma)$$

Reference: Cauchy distribution
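The claimed value can also be checked numerically before deriving it (a sketch of my own, not part of the reference; `cauchy_entropy_numeric` is a hypothetical helper that applies a midpoint rule after the substitution $x = \gamma\tan t$, which maps the real line to $(-\pi/2, \pi/2)$):

```python
import math

def cauchy_entropy_numeric(gamma, n=200_000):
    """Approximate H = -∫ p(x) log p(x) dx for the Cauchy density
    p(x) = gamma / (pi (gamma^2 + x^2)) via the substitution x = gamma*tan(t)."""
    a, b = -math.pi / 2, math.pi / 2
    h = (b - a) / n
    total = 0.0
    for k in range(n):
        t = a + (k + 0.5) * h            # midpoints avoid the endpoint singularities
        x = gamma * math.tan(t)
        p = gamma / (math.pi * (gamma**2 + x**2))
        total += -p * math.log(p) * gamma / math.cos(t)**2 * h   # dx = gamma*sec^2(t) dt
    return total

# Agrees with log(4*pi*gamma) to within the quadrature error:
for g in (0.5, 3.0):
    assert abs(cauchy_entropy_numeric(g) - math.log(4 * math.pi * g)) < 1e-3
```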

garej
RF_LSE

2 Answers


As shown at How does entropy depend on location and scale?, the integral is easily reduced (via an appropriate change of variable) to the case $\gamma=1$, for which

$$H = \int_{-\infty}^{\infty} \frac{\log(1+x^2)}{1+x^2}\,dx.$$

Letting $x=\tan(\theta)$ implies $dx = \sec^2(\theta)d\theta$ whence, since $1+\tan^2(\theta) = 1/\cos^2(\theta)$,

$$H = -2\int_{-\pi/2}^{\pi/2} \log(\cos(\theta))d\theta = -4\int_{0}^{\pi/2} \log(\cos(\theta))d\theta .$$

There is an elementary way to compute this integral. Write $I= \int_{0}^{\pi/2} \log(\cos(\theta))d\theta$. Because $\cos$ on this interval $[0, \pi/2]$ is just the reflection of $\sin$, it is also the case that $I= \int_{0}^{\pi/2} \log(\sin(\theta))d\theta.$ Add the integrands:

$$\log\cos(\theta) + \log\sin(\theta) = \log(\cos(\theta)\sin(\theta)) = \log(\sin(2\theta)/2) = \log\sin(2\theta) - \log(2).$$

Therefore

$$2I = \int_0^{\pi/2} \left(\log\sin(2\theta) - \log(2)\right)d\theta =-\frac{\pi}{2} \log(2) + \int_0^{\pi/2} \log\sin(2\theta) d\theta.$$

Changing variables to $t=2\theta$ in the integral shows that

$$\int_0^{\pi/2} \log\sin(2\theta) d\theta = \frac{1}{2}\int_0^{\pi} \log\sin(t) dt = \frac{1}{2}\left(\int_0^{\pi/2} + \int_{\pi/2}^\pi\right)\log\sin(t)dt \\= \frac{1}{2}(I+I) = I$$

because $\sin$ on the interval $[\pi/2,\pi]$ merely retraces the values it attained on the interval $[0,\pi/2]$. Consequently $2I = -\frac{\pi}{2} \log(2) + I,$ giving the solution $I = -\frac{\pi}{2} \log(2)$. We conclude that

$$H = -4I = 2\pi\log(2).$$
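As a quick numerical cross-check of this result (my own sketch, not part of the derivation; the `midpoint` helper is hypothetical, and the log singularity at $\theta = \pi/2$ is integrable, so a plain midpoint rule suffices):

```python
import math

def midpoint(f, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# I = ∫_0^{π/2} log(cos θ) dθ should equal -(π/2) log 2, and H = -4I.
I = midpoint(lambda t: math.log(math.cos(t)), 0.0, math.pi / 2)
H = -4 * I

assert abs(I - (-math.pi / 2 * math.log(2))) < 1e-3
assert abs(H - 2 * math.pi * math.log(2)) < 1e-3
```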


An alternative approach factors $1+x^2 = (1 + ix)(1-ix)$ to re-express the integrand as

$$\frac{\log(1+x^2)}{1+x^2} = \frac{1}{2}\left(\frac{-i}{x-i} + \frac{i}{x+i}\right)\log(1+ix) + \frac{1}{2}\left(\frac{-i}{x-i} + \frac{i}{x+i}\right)\log(1-ix)$$

The integral of the first term on the right can be expressed as the limiting value as $R\to\infty$ of a contour integral from $-R$ to $+R$ followed by tracing the lower semi-circle of radius $R$ back to $-R.$ For $R\gt 1$ the interior of the region bounded by this path contains a single pole, at $x=-i$ (the branch point of $\log(1+ix)$ at $x=i$ lies outside it), where the residue is

$$\operatorname{Res}_{x=-i}\left(\left(\frac{-i}{x-i} + \frac{i}{x+i}\right)\log(1+ix)\right) = i\left.\log(1 + ix)\right|_{x=-i} = i\log(2),$$

whence (because this is a negatively oriented path) the Residue Theorem says

$$\oint \left(\frac{1}{1+ix} + \frac{1}{1-ix}\right)\log(1+ix) \mathrm{d}x = -2\pi i (i\log(2)) = 2\pi\log(2).$$

Because the integrand on the circle is $O(\log(R)/R^2)$, the semicircle's contribution is $O(\log(R)/R)$, which vanishes as $R\to\infty,$ so in the limit we obtain

$$\int_{-\infty}^\infty \frac{1}{2}\left(\frac{1}{1+ix} + \frac{1}{1-ix}\right)\log(1+ix) \mathrm{d}x = \pi\log(2).$$

The second term of the integrand is equal to the first (use the substitution $x\to -x$), whence $H=2(\pi\log(2)) = 2\pi\log(2),$ just as before.
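The contour result can be checked directly on the real axis (a sketch I'm adding, not part of the answer; since $\frac{1}{2}\left(\frac{1}{1+ix}+\frac{1}{1-ix}\right) = \frac{1}{1+x^2}$, the first term's integrand is $\log(1+ix)/(1+x^2)$, and the substitution $x=\tan\theta$ compactifies the real line, so this checks the limit rather than the contour itself):

```python
import cmath, math

def midpoint_complex(f, a, b, n=200_000):
    """Midpoint-rule approximation for a complex-valued integrand."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# After x = tan θ, dx/(1+x^2) = dθ, so ∫ log(1+ix)/(1+x^2) dx becomes
# ∫ log(1 + i tan θ) dθ over (-π/2, π/2); its value should be π log 2.
val = midpoint_complex(lambda t: cmath.log(1 + 1j * math.tan(t)),
                       -math.pi / 2, math.pi / 2)

assert abs(val.real - math.pi * math.log(2)) < 1e-3
assert abs(val.imag) < 1e-3   # imaginary parts cancel by the x -> -x symmetry
```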

whuber
  • your calculations are just fine, but can you, please, explain, why do you ignore $\pi$ (scale parameter) in the definition of the Cauchy PDF? All books give the result of differential entropy as $\log(4 \pi)$. – garej Jun 30 '19 at 04:10
  • please, see the link as an example (p. 67). – garej Jun 30 '19 at 04:11
  • @garej Please see the first line of the answer. – whuber Jun 30 '19 at 16:49
  • @garej Because this issue is more general and applies to any continuous distribution, not just the Cauchy, I have posted an explanation and edited my answer to link to the explanation. – whuber Jun 30 '19 at 18:57
  • Great derivation, but I'm still missing two $\pi$s. For $\gamma=1$ doesn't the density function become $1/[\pi(1+x^2)]$? In the very first equation I'd expect a $+\log(\pi)$ in the numerator and a $\mathrm{d}x/\pi$ differential. – pglpm Oct 02 '23 at 16:48
  • @pglpm Please see the first line of this post: the pi's were accommodated in that step. – whuber Oct 02 '23 at 17:57
  • I read the first line and also the link, and agree with the conclusions there. But I don't see how it leads to the result of this answer. What I mean: if we start with the density function $\frac{\gamma}{\pi}\frac{1}{\gamma^2+x^2}$ for $x$, then I can't see any transformation $x\mapsto y$ such that the resulting density for $y$ has no $\gamma$s and no $\pi$s, as the integrand at the very beginning. A rescaling $y=x/\gamma$ can eliminate all $\gamma$s, but that still leaves a multiplicative $\pi$... – pglpm Oct 02 '23 at 22:03
  • ...I don't think the $\pi$ can be eliminated altogether, as in your first integrand, by rescalings and shifts. But maybe I'm wrong and am missing something? – pglpm Oct 02 '23 at 22:04
  • @pglpm They don't get eliminated: the first normalization tells you how the pi's enter into the answer. – whuber Oct 02 '23 at 22:04
  • Sorry I really don't see it. Are you "eliminating" (or taking as understood) an additive term given by $\frac{\log(\pi)}{\pi}\int\frac{1}{1+x^2}\,\mathrm{d}x$? – pglpm Oct 02 '23 at 22:14
  • I think I don't understand what the red $\sigma$ in your linked answer would correspond to here. Sorry for the many comments. – pglpm Oct 02 '23 at 22:17
  • I think I'm unclear, so let me try to explain differently. I understand how you can carry away a $\pi$ within an additive logarithm, as explained in the linked answer. My claim is that a $\pi$ still remains in the integral, and it ends up cancelling out with the one you find in your answer here, which then gives $2\log(2)\equiv \log(4)$. Together with the $\log(\pi)$ that was carried away it gives the right answer. Otherwise I don't see how an additive term with non-power $\pi$s in the logarithm can remove the multiplicative $\pi$ you find above... – pglpm Oct 02 '23 at 22:34

This is not a full-scale answer but just a modest extension of @whuber's answer.

If we take $\gamma = 1$, the pdf of the Cauchy distribution boils down to the following:

$$ p(x) = \frac {1} {\pi (1 + x^2)},$$

where $\pi$ is just a scaling factor that normalizes the density.

So, writing out the differential entropy of this valid (properly normalized) pdf:

$$H = \int_{-\infty}^{\infty} \frac{\log(\pi (1+x^2))}{\pi(1+x^2)}\,dx $$

We can expand it a bit for analysis:

$$ H = \frac{\log(\pi)}{\pi} \int_{-\infty}^{\infty} \frac{1}{1+x^2}\,dx + \frac{1}{\pi} \int_{-\infty}^{\infty} \frac{\log(1+x^2)}{1+x^2}\,dx = \frac{\log(\pi)}{\pi} H_1 + \frac{1}{\pi} H_2 $$

Step 1. It can be shown that $H_1 = \pi$ because the antiderivative of the integrand is $\arctan(x)$.

Step 2. The part $H_2$ is elaborated in detail in the accepted answer; there we can move the factor $2$ inside the $\log$:

$$H_2 = 2\pi\log(2) = \pi \log(4).$$

Step 3. Now we may combine everything to get:

$$ H = \frac{\log(\pi)}{\pi} H_1 + \frac{1} {\pi} H_2 = \frac{\log(\pi)}{\pi} \pi + \frac{1} {\pi} \pi \log(4) = \log(\pi) + \log(4). $$

Conclusion: Now we get the expected result.

$$ H(\gamma = 1) = \log(4\pi).$$
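The three steps can be verified numerically (a sketch of my own, not part of the answer; `midpoint` is a hypothetical helper, and $x=\tan\theta$ compactifies both integrals, with $\log(1+x^2) = -2\log\cos\theta$):

```python
import math

def midpoint(f, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# With x = tan θ, dx/(1+x^2) = dθ, so both H1 and H2 become integrals over (-π/2, π/2).
H1 = midpoint(lambda t: 1.0, -math.pi / 2, math.pi / 2)                 # ∫ dx/(1+x²)
H2 = midpoint(lambda t: -2 * math.log(math.cos(t)),
              -math.pi / 2, math.pi / 2)                                # ∫ log(1+x²)/(1+x²) dx
H = math.log(math.pi) / math.pi * H1 + H2 / math.pi

assert abs(H1 - math.pi) < 1e-9                  # Step 1: H1 = π
assert abs(H2 - math.pi * math.log(4)) < 1e-3    # Step 2: H2 = π log 4
assert abs(H - math.log(4 * math.pi)) < 1e-3     # Step 3: H = log(4π)
```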

garej
  • This is the full correct reasoning. $H_2$ is not a differential entropy because $1/(1+x^2)$ is not a probability density – it isn't normalized. – pglpm Oct 04 '23 at 05:10