6

To be clear, I'm not looking for the Jeffreys prior on parameters of Gaussian, log-normal, or exponential distributions.

I am, instead, looking for a probability distribution, which has one or several parameters, and for which the derivation of the Jeffreys prior returns a distribution that is Gaussian, log-normal, or exponential.

Arthur
  • The question about whether the prior is Gaussian or exponential can also be written as follows: Let $f$ abbreviate $f(x,\theta)$ where $\theta$ is the parameter and $x$ is the variable of interest. Let $$DDL[\cdot]=\frac{\partial ^2 \log[\cdot]}{\partial \theta^2}$$ Then the Jeffreys prior is Gaussian iff $$DDL\left[\int -f \, DDL[f] \, dx\right]$$ is a non-zero constant, and the Jeffreys prior is exponential iff that expression is zero. (In this formulation we don’t need to worry about the square root because $\log\sqrt{x}=\frac12\log x$.) – Matt F. Mar 09 '24 at 11:27
  • Does the question refer only to the "straight" Jeffreys prior $p(\theta)$ being normal/exp/logN (although it is often an improper prior) or does it include the posterior $p(\theta|x)$? – Spätzle Mar 10 '24 at 12:03

2 Answers

4

The Jeffreys prior transforms like a density when we reparameterise the parameter. So if we have some distribution and parameterisation for which the Jeffreys prior is a proper distribution, then we can transform that Jeffreys prior into another distribution by a suitable reparameterisation.

This gives a method to generate many different cases where the Jeffreys prior is a normal, lognormal or exponential distribution.


Example: suppose we have a Bernoulli distribution with parameter $p$, the probability of one of the outcomes. Then the Jeffreys prior is an arcsine distribution, with distribution function

$$F(p) = \frac{2}{\pi} \arcsin(\sqrt{p})$$

Setting this equal to another distribution function gives the transformation. For example, let $\Phi(\theta)$ be the distribution function of a standard normal distribution. Then $\Phi(\theta) = \frac{2}{\pi} \arcsin(\sqrt{p})$, and as transformation we can use

$$\begin{array}{rcl} \theta & = & \Phi^{-1}\left( \frac{2}{\pi} \arcsin(\sqrt{p})\right)\\ p &=& \sin^2 \left( \frac{\pi}{2}\Phi(\theta) \right) \end{array} $$

and the resulting probability mass function

$$f(x|\theta) = \begin{cases} 1-\sin^2 \left( \frac{\pi}{2}\Phi(\theta) \right) & \quad \text{if } x= 0 \\ \sin^2 \left( \frac{\pi}{2}\Phi(\theta) \right) & \quad \text{if } x= 1 \\ 0 & \quad \text{else} \end{cases}$$

has a standard normal distribution as the Jeffreys prior for the parameter $\theta$.
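As a quick numerical sanity check (a sketch using only the Python standard library; the function names are mine), we can compute the Fisher information of this Bernoulli model in $\theta$ and verify that $\sqrt{I(\theta)} = \pi\,\varphi(\theta)$, i.e. proportional to the standard normal density:

```python
from math import pi, sin, sqrt
from statistics import NormalDist

nd = NormalDist()  # standard normal, provides cdf and pdf

def p_of_theta(theta):
    # success probability under the reparameterisation p = sin^2((pi/2) * Phi(theta))
    return sin(pi / 2 * nd.cdf(theta)) ** 2

def fisher_info(theta, h=1e-6):
    # For a Bernoulli likelihood, I(theta) = p'(theta)^2 / (p (1 - p));
    # p'(theta) is approximated by a central difference
    p = p_of_theta(theta)
    dp = (p_of_theta(theta + h) - p_of_theta(theta - h)) / (2 * h)
    return dp ** 2 / (p * (1 - p))

# sqrt(I(theta)) / phi(theta) should be the constant pi, so the Jeffreys
# prior, proportional to sqrt(I(theta)), is the standard normal density
for theta in (-1.5, 0.0, 0.7, 2.0):
    print(theta, sqrt(fisher_info(theta)) / nd.pdf(theta))  # ratio ≈ pi
```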

  • This looks correct, but the parameter transformation is unsatisfyingly awkward. – Matt F. Mar 13 '24 at 18:19
  • @MattF. I believe that any case will be awkward. To get the Jeffreys prior become a proper prior the distribution needs to have a limited range of possibilities. If also in addition we need to have the information matrix follow a normal distribution then you get a distribution whose change of information / cross entropy relates on some weird dependency of the distribution on the parameter. – Sextus Empiricus Mar 14 '24 at 06:22
  • Depending on your taste in formulas, it may get a little less awkward if you write it as $$f(x|\theta)=\frac12\pm\frac12\cos(\pi\Phi(\theta))$$ with $+$ at $x=0$ and $-$ at $x=1$. – Matt F. Mar 14 '24 at 11:03
3

Based on this answer, we can claim that for a Poisson distribution with parameter $\lambda$, the Jeffreys prior of $\theta=\log\left(\frac{1}{\lambda}\right)$ is exponential. Let's see exactly how.

For $x\sim\text{Poisson}(\lambda)$ with $\lambda=\exp(-\theta)$ we have \begin{align} f(x_i|\lambda)&=\frac{\lambda^{x_i}\exp(-\lambda)}{x_i!}\\ f(x_i|\theta)&=\frac{\exp(-\theta x_i)\exp(-\exp(-\theta))}{x_i!}\\&=\frac{\exp(-(\theta x_i+\exp(-\theta)))}{x_i!} \end{align}

The likelihood is then

$$L(\theta)=\frac{1}{\prod_{i=1}^n{(x_i!)}}\exp\left(-\left(\theta\sum_ix_i+n\cdot\exp(-\theta)\right)\right)$$

log-likelihood:

$$l(\theta)=\log\left(\frac{1}{\prod_{i=1}^n{(x_i!)}}\right)-\theta\sum_ix_i-n\cdot\exp(-\theta)$$

first derivative:

$$\frac{\partial}{\partial\theta}l=-\sum_ix_i+n\cdot\exp(-\theta)$$

second derivative:

$$\frac{\partial^2}{\partial\theta^2}l=-n \exp(-\theta)$$

Fisher information:

$$I(\theta)=-E\left[ \frac{\partial^2}{\partial\theta^2}l \right]=n\exp(-\theta)$$

So we can write

$$\sqrt{\left|I(\theta)\right|}=\sqrt{n}\exp\left(-\frac{\theta}{2}\right)$$

and finally the Jeffreys prior $\pi^J\propto\exp\left(-\frac{\theta}{2}\right)$ has the form of an exponential distribution with rate parameter $0.5$.

Matt F.
Spätzle