Could someone please give an example of when the likelihood function does not sum, or integrate, to $1$? I have seen this question and its first answer, but it really confused me: why are we integrating the likelihood function rather than summing it with a $\sum$ sign, since it is a Bernoulli distribution (discrete)? If someone could clarify my confusion and perhaps provide more examples, that would help.
-
Probability is always scaled to 1; likelihood doesn't need to be scaled. So you can create an infinite number of examples by taking a probability and dropping the scale. – Aksakal Jan 31 '20 at 16:23
-
Yes, I shouldn't have included the "pdf integrating to 1" part, as it always does. But an example of the likelihood not scaling to 1 would be nice – Jan 31 '20 at 16:26
-
Almost any likelihood you choose won't "scale to 1", as the variables are the parameters, not the random variable of the probability function (which is scaled to integrate to 1). There is literally no reason why a likelihood would integrate to 1 except by chance. – jbowman Jan 31 '20 at 16:54
-
Well then, could you give an example? Also, I don't understand, in the question referenced, why we integrate it rather than sum it. – Jan 31 '20 at 16:55
-
I am uncertain if I should mention something like Jeffreys prior or Haldane's prior just as an example of "obviously funny" likelihoods... – usεr11852 Jan 31 '20 at 17:48
-
1"Integrate to one" implies using a specific measure on the parameter space. Likelihood functions are not naturally endowed with such a measure. The question is thus groundless. – Xi'an Jan 31 '20 at 17:53
-
To add to the fundamental remark voiced by @Xi'an, note that the likelihood is only defined up to a multiplicative constant in the first place and, depending on the density one arbitrarily chooses for integrating it, it might be (and often is) impossible to normalize it to integrate to unity as the axioms of probability demand. – whuber Jan 31 '20 at 18:02
-
See here – Glen_b Feb 01 '20 at 06:41
1 Answer
The likelihood in the linked question is $$L(\theta)=P(X=x\mid\theta)=\theta^x(1-\theta)^{1-x}.$$ Here $X$ is a Bernoulli RV, and as a function of $x$ this expression is a probability mass function, so summing it over $x$ gives $1$. However, a likelihood is used to estimate the parameter when the data, i.e. the $x$'s, are given. So the variable of the likelihood function is the parameter $\theta$, and since $\theta$ is continuous, the linked answer integrates with respect to $\theta$ and shows that the result doesn't have to be $1$.
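To spell both operations out for this likelihood (a short worked check, using nothing beyond the Bernoulli model above): summing over the data $x$ for a fixed $\theta$ gives $$\sum_{x\in\{0,1\}}\theta^x(1-\theta)^{1-x}=(1-\theta)+\theta=1,$$ because it is a pmf in $x$; but integrating over the parameter $\theta$ for a fixed $x$ gives $$\int_0^1\theta^x(1-\theta)^{1-x}\,d\theta=\int_0^1\theta\,d\theta=\tfrac12\ (x=1),\qquad\int_0^1(1-\theta)\,d\theta=\tfrac12\ (x=0).$$ More generally, for $n$ Bernoulli observations with $s$ successes, $\int_0^1\theta^s(1-\theta)^{n-s}\,d\theta=\frac{s!\,(n-s)!}{(n+1)!}$ (a Beta integral), which is almost never $1$.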
gunes
-
"and since it's continuous" ? The likelihood is continuous? But where is it said that it's continuous? Maybe i'm missing something from the defintion – Jan 31 '20 at 17:45
-
The $\theta$ is a continuous variable which can take any real value between $0$ and $1$ if not stated otherwise. – gunes Jan 31 '20 at 17:46
-
A more fundamental issue: why integrate wrt the Lebesgue measure on $(0,1)$ and not 12.46 times the Lebesgue measure on $(0,1)$? Or $\exp(-5\pi)$ times the Lebesgue measure on $(0,1)$? – Xi'an Jan 31 '20 at 17:56